IC Verification Overview

Source: Internet
Author: User

Verification is the process of ensuring that a design is consistent with its predetermined design expectations, which are usually defined by the design specification. In chip design, verification can be divided by stage into register transfer level (RTL) functional verification, gate-level simulation, formal verification, and timing verification. What we typically refer to as verification is RTL functional verification.

Verification work is carried out according to the design specification; a detailed design specification is the basis of the RTL code and also the basis of the verification work. A complete, detailed specification is an important reference: when the verification process finds that the response of the DUT (design under test) is inconsistent with the expectation of the verification platform (testbench), the specification determines whether the fault lies in the DUT or in the testbench. Once the verification engineer has obtained the design specification, verification can begin. The work can be divided into three stages: a planning stage, an implementation stage, and an analysis and summary stage.

The main task of the planning stage is to fully understand the design specification, extract the verification requirements from it, and write the verification plan. In the verification plan we need to list the verification requirements; set the verification strategy; determine the verification language, methodology, and methods; define the testbench hierarchy and architecture; and plan the test cases and the result-checking mechanism. In short, everything needed in the subsequent verification work should be reflected in the verification plan. For a design under test, the purpose of verification is to ensure that the design accomplishes its intended tasks and accurately expresses the design specification. In addition to understanding the details of the specification, the boundary of the intended functionality needs to be clarified. For example, when verifying an MP3 player, we do not test whether it can make a phone call, but we do test its playback function; functions outside the boundary are not our concern.
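To make the idea of traceable verification requirements concrete, the following Python sketch (the requirement IDs, test names, and fields here are hypothetical, not from any real plan) captures a verification plan as data, mapping each requirement extracted from the specification to its planned test cases and checking mechanism:

```python
# A minimal, hypothetical sketch of a verification plan as data:
# each requirement extracted from the design specification is mapped
# to the test cases and the checking mechanism planned for it.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str          # identifier traced back to the specification
    description: str     # what the design must do
    tests: list = field(default_factory=list)  # planned test cases
    check: str = ""      # how the result will be checked

plan = [
    Requirement("REQ-001", "Playback starts within 100 ms of 'play'",
                tests=["test_play_basic", "test_play_after_pause"],
                check="scoreboard compares DUT output to reference model"),
    Requirement("REQ-002", "Volume register resets to its default value",
                tests=["test_reset_values"],
                check="directed check of register reads after reset"),
]

# Every requirement must have at least one planned test and a check.
uncovered = [r.req_id for r in plan if not r.tests or not r.check]
print("uncovered requirements:", uncovered)
```

Keeping the plan in a machine-readable form like this makes it easy for later scripts to cross-check regression results and coverage against the original requirements.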

The implementation stage is the process of verifying the DUT according to the verification plan, including building the verification platform, creating test cases, developing simulation and statistical-analysis scripts, and running and debugging the test cases. The verification platform is mainly used for generating stimulus, applying the stimulus to the DUT, capturing the response, and checking it for correctness. At the same time, some thought must be given to the platform's code hierarchy, reusability, functional coverage, simulation performance, and automation. Verification typically uses a dedicated hardware verification language (HVL) together with a verification methodology to build a scalable, predictable, reusable verification environment. The current mainstream methodologies are eRM, based on the Specman e language, and VMM, OVM, and UVM, based on SystemVerilog; UVM in particular covers a variety of advanced verification techniques. These methodologies improve on earlier verification approaches and take full advantage of features such as verification-process automation, functional coverage, and assertions to establish a comprehensive, general verification environment. The verification flow also requires scripts to run simulations, check results, collect and analyze data, and assist with debugging; scripting languages such as Makefile, Shell, Perl, Python, and Tcl are typically used. For front-end simulation, the tools generally used are Mentor QuestaSim, Synopsys VCS, and Cadence Incisive; each tool has its own characteristics, and any of them can complete the verification simulation work well.
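The stimulus–response–check loop described above can be sketched outside any particular HVL. The following Python fragment is a hypothetical stand-in for a real testbench (with a trivial 8-bit adder playing the role of the DUT): it generates constrained-random stimulus, drives the DUT, captures the response, and has a scoreboard compare it against a golden reference model:

```python
import random

def dut_adder(a, b):
    # Stand-in for the design under test; a real DUT would be RTL
    # driven through a simulator interface.
    return (a + b) & 0xFF          # 8-bit adder with wrap-around

def reference_model(a, b):
    # Golden model written directly from the design specification.
    return (a + b) % 256

def run_test(num_transactions=100, seed=1):
    random.seed(seed)              # reproducible random stimulus
    mismatches = []
    for _ in range(num_transactions):
        a = random.randrange(256)  # generate stimulus
        b = random.randrange(256)
        response = dut_adder(a, b)          # apply stimulus, capture response
        expected = reference_model(a, b)    # predict expected response
        if response != expected:            # scoreboard check
            mismatches.append((a, b, response, expected))
    return mismatches

print("mismatches:", run_test())
```

In a real SystemVerilog/UVM environment the same roles are played by the sequencer and driver (stimulus), monitor (response capture), and scoreboard with a reference model (checking), but the control flow is essentially the one shown here.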

The analysis and summary stage includes regression testing, coverage analysis, and writing the verification report. Coverage data plays two important roles. First, it clearly identifies the parts of the design that have not yet been fully exercised, exposing holes in the verification process; verification adequacy can then be improved by adding specific directed test cases, or by changing the constraints of constrained-random test cases. Second, coverage data is an indicator of whether verification is sufficient to proceed to tape-out. Coverage falls into two main categories: code coverage and functional coverage. Code coverage takes several forms (line coverage, toggle coverage, condition coverage, state-machine coverage, expression coverage, etc.); collecting it is an automated process, and the simulation tool can gather the data and output reports automatically. For a given simulation run, code coverage reflects how much of the RTL design description was executed. Code coverage is a necessary condition, but not a sufficient one. Functional coverage provides an external measure of how many of the specification's function points have been correctly exercised. When the coverage data reaches its targets, the RTL verification work can be considered complete. Finally, the verification results, coverage data, and coverage analysis should be written into the verification report and archived; the report should also record the problems encountered throughout the verification process and their solutions.
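Since coverage sign-off is usually automated by scripts, a small example of the analysis step might compare per-metric coverage results exported by the simulator against the targets in the verification plan and report which metrics still fall short. The figures and target values below are purely illustrative:

```python
# Hypothetical coverage figures as a simulator might export them (percent).
collected = {
    "line": 98.5,
    "toggle": 91.0,
    "condition": 95.2,
    "fsm": 100.0,
    "functional": 88.7,
}

# Sign-off targets from the verification plan (values are illustrative).
targets = {
    "line": 100.0,
    "toggle": 90.0,
    "condition": 95.0,
    "fsm": 100.0,
    "functional": 100.0,
}

# Metrics below target point at design areas, or missing tests,
# that still need attention before tape-out can be considered.
shortfalls = {m: (collected[m], targets[m])
              for m in targets if collected[m] < targets[m]}

for metric, (got, want) in sorted(shortfalls.items()):
    print(f"{metric}: {got:.1f}% < target {want:.1f}%")
```

In practice this kind of script would parse the coverage database or report files produced by the simulator rather than hard-coded dictionaries, and its output would feed directly into the verification report.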

After RTL verification ends, other tasks such as gate-level netlist verification and support for chip testing are still required. In summary, the difficulty of RTL verification lies in how to generate all possible inputs to the DUT and determine whether the DUT's outputs are correct. This requires the verification engineer to continually deepen their understanding of the design specification and translate it into effective test cases, identifying and correcting design defects as early as possible, so that the verified modules and the entire chip achieve 100% of the expected functionality.

Note: This article is an original work of "E Lesson Network", which holds the copyright. Sharing is welcome! To reprint, please reply "reprint", and credit the author "E Lesson Network" when reproducing it.

Kris: an expert in the field of IC verification who has worked in verification for many years and has a wealth of verification theory and project practice. Senior lecturer in chip design verification, enterprise trainer (SVA), and chip verification consultant for a number of IC R&D companies.
