Summary of experience in IC verification after a first project

Source: Internet
Author: User
Tags: synopsys

Complete and detailed design specifications are an important starting point for verification.

Verification is performed according to the design specification (spec). The detailed spec is the basis both for writing the RTL code and for verification. When the DUT response is found to be inconsistent with the testbench's expectation during verification, the spec is what you use to decide whether the DUT or the testbench is in error.

Parameterized global definition

Global definitions make writing and simulation very convenient. When writing a testbench, if some statements are repeated in several places, consider collecting them into a task or macro. For example:

1. Register-related bits and their values can be defined as global macros in reg_define.v.

2. Frequently used hierarchical paths can be defined as global macros in define_board.v.

3. Display of important system variables can be defined in display.v.

4. Register compare tasks and error-report tasks can be defined in reg_cmp.v.

5. Clock-period parameters are generally defined locally using parameter.
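As a sketch of points 1-3 above (all file, macro, and signal names here are illustrative assumptions, not from the original):

```verilog
// reg_define.v -- register bits and values as global macros
`define CTRL_REG_ADDR  8'h10
`define CTRL_EN        1'b1

// define_board.v -- frequently used hierarchical paths as global macros
`define DUT_PATH  top.u_dut
`define CPU_PATH  top.u_dut.u_cpu

// display.v -- display of an important system variable
`define SHOW_STATE $display("[%0t] state = %h", $time, `DUT_PATH.state)
```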

Use `ifdef for global control of waveform dumping and file access

1. VCD waveforms are the source files for power-consumption analysis, but they can be very large.

    $dumpfile("wave.vcd");   // open the database
    $dumpvars(1, top.u1);    // scope = top.u1, depth = 1
    // First parameter: depth; 0 records all levels below the scope.
    // Second parameter: scope; if omitted, the current scope is used.
    $dumpvars;               // depth = all, scope = all
    $dumpvars(0);            // depth = all, scope = current
    $dumpvars(1, top.u1);    // depth = 1,   scope = top.u1
    $dumpoff;                // pause recording; changes are not written to the file
    $dumpon;                 // resume recording
    $dumpflush;              // flush buffered changes to the database file

2. SHM is Cadence's waveform format and can be opened with SimVision.

    $shm_open("waves.shm");  // open the waveform database
    $shm_probe(top, "AS");   // set probes on scope "top"
    // A  -- signals of the specified scope
    // S  -- ports of the specified scope and below, excluding library cells
    // C  -- ports of the specified scope and below, including library cells
    // AS -- signals of the specified scope and below, excluding library cells
    // AC -- signals of the specified scope and below, including library cells
    // There is also an M, for the memories of the current scope; it can be
    // combined with the above, e.g. "AM", "AMS", "AMC". M alone adds no ports.
    $shm_close;              // close the database

3. FSDB is the Novas waveform format and can be opened with nWave.

    $fsdbDumpfile("wave.fsdb");  // open the database
    $fsdbDumpvars(0, top.u1);    // scope = top.u1, depth = 0 (all levels below)

4. VPD is Synopsys's waveform format and can be opened with DVE.

 
    $vcdplusfile("wave.vpd");  // open the database
    $vcdpluson(1, top.u1);     // scope = top.u1, depth = 1

5. Variable access uses file I/O.

(1). Open the file

    integer file_id;
    file_id = $fopen("file_path/file_name");

(2). Write files

    // $fmonitor writes whenever one of its arguments changes
    $fmonitor(file_id, "%format_char", parameter);

    // $fwrite writes once each time it is executed, under a trigger condition
    $fwrite(file_id, "%format_char", parameter);

    // $fdisplay writes once each time it is executed, appending a newline
    $fdisplay(file_id, "%format_char", parameter);

(3). Read files

 
    integer file_id;
    file_id = $fopen("file_path/file_name", "r");
    // then read with $fscanf(file_id, ...) or $fgets

(4) close the file

 
    $fclose(file_id);

(5). Set the initial memory values by the file

    $readmemh("file_name", memory_name);  // initial data in hexadecimal
    $readmemb("file_name", memory_name);  // initial data in binary

(6). You can also use macros to select which variables are saved and when.

    `ifdef SAVE_LROUT
      start_save = 1'b1;
      #(10e6) stop_save = 1'b1;
    `endif
    xxx = $fopen("xxx", "w");
    if (start_save && !stop_save)
      $fwrite(xxx, "%f\n", x);
    $fclose(xxx);

Test cases

1. The case itself should be as modular as possible.

2. A case should be automatic and self-checking, reporting errors automatically to save test time.
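A minimal sketch of such a self-checking case (the error counter, signal widths, and values are assumptions):

```verilog
integer err_cnt = 0;

// compare a response against its expected value and report automatically
task check;
  input [31:0] expected, actual;
  begin
    if (actual !== expected) begin
      err_cnt = err_cnt + 1;
      $display("ERROR at %0t: expected %h, got %h", $time, expected, actual);
    end
  end
endtask

initial begin
  // ... drive stimulus and call check() after each DUT response ...
  #1000;
  if (err_cnt == 0) $display("TEST PASSED");
  else              $display("TEST FAILED: %0d errors", err_cnt);
  $finish;
end
```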

3. Coverage: coverage includes functional coverage, code coverage, and the coverage of manually added cover points. It provides statistics about the simulation, including which structures and transitions were exercised and how, so you can see which parts of the design were never simulated and thus where the verification is weak. Code coverage is the easiest to bring to 100%, but if `case ... default` is used in the Verilog code, 100% is difficult to reach. Functional coverage covers the functions of the design as well as the state coverage of state machines; beyond that are the cover points added by the verification engineer. These items are generally used together to complete the report after verification is finished.
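Functional cover points are normally written in SystemVerilog; a minimal covergroup sketch (the `opcode` signal and its bins are illustrative assumptions):

```systemverilog
module cov_sketch (input logic clk, input logic [3:0] opcode);
  covergroup op_cg @(posedge clk);
    coverpoint opcode {
      bins add    = {4'h0};
      bins sub    = {4'h1};
      bins others = default;  // default bins are not counted toward coverage
    }
  endgroup
  op_cg cg = new();
endmodule
```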

4. The main simulation thread is usually modeled with an initial block containing a series of blocking statements.

5. Personally, I think how a case is written is not what matters most; what matters is that the test points in your case are comprehensive and no test item is missed.

6. In a case, provide random stimulus as much as possible to enlarge the test space and maximize the functional space covered by verification. The randomness here is generally constrained random, rather than random in the general sense.
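Constrained random stimulus is typically written with SystemVerilog classes; a sketch (the class, field, and constraint names are assumptions):

```systemverilog
class pkt;
  rand bit [7:0]  len;
  rand bit [31:0] addr;
  constraint c_len  { len inside {[4:64]}; }  // only legal lengths
  constraint c_addr { addr[1:0] == 2'b00; }   // word-aligned addresses
endclass

initial begin
  pkt p = new();
  repeat (10) begin
    if (!p.randomize()) $display("randomize failed");
    // drive p.addr / p.len into the DUT here
  end
end
```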

7. Writing a case can be divided into three steps: from specification to features, from features to testcases, and from testcases to testbenches.

(1). The first step is to identify the features to be verified. Different features suit different verification levels: some suit component (unit/reusable/ASIC) level, and some must be verified at system level. A component-level feature is completely contained in the component under verification, so its verification is independent of the other modules of the system and can be performed on its own. System-level features involve interaction between multiple units of the system. System-level features should be kept few: any feature that can be verified at component level should not be defined as a system-level feature.

(2). Before forming testcases, first classify the features:

Must-have (required): needed for the design to work properly or to meet market needs. This is the main content of first-time success and should be verified thoroughly under all conditions.

Should-have: mainly extends the performance of the design or differentiates it from competitors. Only the basic functions need to be verified; with extra time and resources, further verification can be performed.

Nice-to-have: an option of the design implementation. If time permits it can be verified once; generally it is not verified.

Prioritizing features this way avoids missing must-have features when the verification plan has to be adjusted.

8. The level of detail of case verification also needs to be divided. For example, some cases can be checked directly by self-check or waveform inspection, while cases involving DSP performance indicators (such as signal-to-noise ratio, spectrum, and degree of separation) require the verifier to export the output data to MATLAB for more detailed analysis.
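Exporting DUT output samples for MATLAB analysis might look like this (the `out_valid`/`out_data` signal names are assumptions):

```verilog
integer fid;
initial fid = $fopen("dut_out.txt", "w");

always @(posedge clk)
  if (out_valid)
    $fwrite(fid, "%0d\n", $signed(out_data));  // one sample per line

// In MATLAB: x = load('dut_out.txt'); then compute SNR/spectrum on x.
```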

System stimulus

1. Use MATLAB to generate normalized data and read it into the Verilog simulation with $readmemb/$readmemh. You can also read the simulation output back into MATLAB to analyze its characteristics.

2. The clock and reset of the testbench should be generated at the top level. Initialize them with non-blocking assignments and update them with blocking assignments.

    `timescale 1ns/1ns
    `define PERIOD 5  // half period: 100 MHz clock
    initial begin
      clk <= 0;
      forever #(`PERIOD) clk = ~clk;
    end
    initial begin
      rst_n <= 0;
      @(negedge clk) rst_n = 1;
    end

3. Timescale (`timescale): chosen as a balance between simulation accuracy and run time.

4. Bus functional model (BFM): provides a means of driving the interfaces defined in the simulation model. That is, the designer can verify a set of timings or a protocol without simulating a low-level model of the entire device.
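A minimal BFM task for a simple write transaction (the bus signal names and single-cycle timing are assumptions):

```verilog
task bus_write;
  input [7:0]  addr;
  input [31:0] data;
  begin
    @(posedge clk);
    bus_addr  <= addr;
    bus_wdata <= data;
    bus_wr    <= 1'b1;   // assert write strobe for one cycle
    @(posedge clk);
    bus_wr    <= 1'b0;
  end
endtask
```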

Gate-level simulation

1. The testbench should be designed for convenient porting to gate-level simulation. The main change when moving the testbench to a gate-level module is removing the pre-synthesis RTL files and adding a netlist annotated with library and timing information.

Compared with the earlier simulation, the changes in the simulation environment caused by the change of simulation object are mainly reflected in two points:

(1). Pin connections: the hierarchical paths of the pins are the same in the RTL code and in the netlist; however, for the same logic, the pin naming in the synthesized netlist can differ somewhat from the pin naming in the RTL code.

(2). Calling the SDF file: Cadence's simulator, NC-Verilog, can perform both RTL simulation and gate-level simulation. It provides a task for parsing the SDF file, which is called at the beginning of simulation.

    `ifdef GATE_SIM
      $sdf_annotate("sdf_file"
                    {, module_instance}
                    {, "config_file"}
                    {, "log_file"}
                    {, "mtm_spec"}
                    {, "scale_factors"}
                    {, "scale_type"});
    `endif

2. The designer must consider hold-time violations at the fast corner and setup-time violations at the slow corner. Correspondingly, the SDF files the verifier uses are the slow, fast, and typical SDF files.
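Corner selection can be combined with macros like this (the SDF file names and instance path are illustrative; MINIMUM/MAXIMUM are the standard mtm_spec values):

```verilog
`ifdef GATE_SIM
  initial begin
    `ifdef FAST_CORNER
      $sdf_annotate("top_fast.sdf", top.u_dut, , , "MINIMUM"); // hold checks
    `else
      $sdf_annotate("top_slow.sdf", top.u_dut, , , "MAXIMUM"); // setup checks
    `endif
  end
`endif
```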

Verification language and Verification Method

1. Verilog goes without saying: it is the foundation. The requirement is to be able to locate the source of an error when debugging.

2. As the core of verification, the verification environments used by various companies are almost identical.

3. C is generally used to write stimulus. An SoC generally has at least one CPU core, and C programs are run on it during simulation.

4. As for scripting languages: since Linux/Unix work is basically command-line driven, scripts greatly improve work efficiency and must be mastered. Scripts are generally used once the verification environment is set up, for example for run commands and batch jobs. Learn at least one of Perl, shell, and Tcl.

5. Common verification methodologies include VMM and OVM, with UVM coming in the future. The main force in the market is still VMM, but because OVM is open source, it is developing very fast. VMM is led by Synopsys; OVM is developed by Cadence and Mentor.

6. Assertions are a good thing: powerful, easy to use, able to discover design errors and locate problems accurately. However, verification engineers often cannot use them to full effect, because a verification engineer can usually verify a module without knowing too many internal design details, while assertions require a clear understanding of the internal signals in order to attach assertions to them. They are therefore especially recommended for IC design engineers to learn.
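A SystemVerilog assertion sketch of the idea (the `req`/`ack` handshake and the 1-3 cycle window are assumptions):

```systemverilog
property p_req_ack;
  @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:3] ack;  // every request must be acknowledged within 3 cycles
endproperty

assert property (p_req_ack)
  else $error("ack did not follow req within 3 cycles");
```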

Nowadays, hierarchical design is generally used, and the corresponding verification work should adopt the hierarchical verification method.

Verification layers are generally divided as follows:

- Unit-level verification (functional unit verification)

- Reusable-component verification

- ASIC and FPGA verification

- System-level verification

- Board-level verification

The features and verification methods of each verification layer are described as follows:

1. Unit-level verification. The division into design units is a logical one; as the design deepens, the functions and interfaces of a design unit can change greatly. Design-unit verification is therefore generally done by the designers themselves, with the goal of ensuring that the unit's RTL code has no syntax errors and implements the basic functions; code coverage and regression testing need not be considered. For a large design, giving each unit a dedicated verification environment would cost a great deal of time in stimulus generation and response checking, and writing a testbench for every unit is a huge workload, so a formal verification process is impractical and unit verification generally takes an ad-hoc form. The integration of the design units is then verified at ASIC or FPGA level. A complex ASIC may contain particularly complex design units; verifying such a unit requires strong visibility and controllability, and its related functions should be verified as thoroughly as possible.

2. Reusable-component verification. A reusable unit is an independent design component unrelated to any specific application; it has standard external interfaces, and its testbench is reusable. Regression verification should be performed on modified reusable units to ensure backward compatibility of the design; if the design's function has been modified, formal equivalence verification will not work. When designing reusable units, the verification process should be documented to win users' trust in them.

3. ASIC and FPGA verification. ASICs and FPGAs are physical divisions; their interfaces and functions are defined during preliminary design and will not change much, so black-box verification can be performed. For a complex ASIC chip, ASIC verification can serve as system verification.

4. System-level verification. A system is a logical division consisting of independently verified components. System-level verification mainly verifies the interaction between design units, since the functions of the units themselves have already been verified at unit or ASIC level. To reduce simulation iteration time, for a given testcase try to exclude the design units it does not need, so that the simulated system is as small as possible.

5. Board-level verification. Board-level verification uses the board-level model generated by the board-level design tools; board-level simulation checks that the physical implementation of the design is consistent, which differs from the system logic model. The component models of the board-level model can come from third parties or from a hardware modeler. During verification, the physical parameters of the board-level model should be simulated to ensure functional correctness. Board-level connectivity is verified by comparing the description of the component pin connections with the netlist generated by the board-level design tool. During verification you must first define the granularity of verification (system level, unit level, etc.), then determine the level of the testcases, choosing black-box or white-box based on your grasp of the design implementation (i.e. the abstraction level of the testcase), and finally determine the stimulus and how the output results will be checked.
