Chip computing power test scheme
1. Software implementation
According to the data-comparison requirements for the input stimulus and output response of the "core of electricity", comprehensive Verilog code is written. The code is designed strictly in accordance with the timing requirements of the "core of electricity".
Following the approach of building the test platform on programmable devices, the functional test platform is constructed as follows: programmable logic devices generate the input stimulus and process the output response; ROM stores the DSP core program, the control register parameters, the pulse compression coefficients, and the filter coefficients; SRAM serves as the off-chip cache.
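The stimulus/response comparison loop described above can be sketched in Python. This is only an illustrative model, not the real platform: the DUT below is a stand-in 2-tap FIR filter, and the "ROM" contents (stimulus, coefficients, golden responses) are made-up values.

```python
def dut_fir(samples, coeffs):
    """Stand-in DUT: a 2-tap FIR filter fed from the stimulus ROM."""
    out, prev = [], 0
    for s in samples:
        out.append(coeffs[0] * s + coeffs[1] * prev)
        prev = s
    return out

def run_functional_test(stimulus_rom, expected_rom, coeffs):
    """Apply stimulus, capture responses, return mismatching cycle indices."""
    actual = dut_fir(stimulus_rom, coeffs)
    return [i for i, (a, e) in enumerate(zip(actual, expected_rom)) if a != e]

stimulus = [1, 2, 3, 4]
coeffs = [2, 1]              # "filter coefficients" held in ROM
expected = [2, 5, 8, 11]     # golden responses held in ROM
print(run_functional_test(stimulus, expected, coeffs))  # [] means pass
```

An empty mismatch list means every output cycle matched the stored golden response, which is exactly the comparison the platform performs in hardware.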
2. Hardware implementation
According to the block diagram of the functional test platform, the schematic and the PCB are designed, yielding a complete system platform for functional testing of the "core of electricity".
Extended information:
Classification of programmable logic devices:
1. The circuits in fixed logic devices are permanent: once manufactured, they perform one function or one fixed group of functions and cannot be changed.
2. Programmable logic devices (PLDs) are standard off-the-shelf products that offer customers a wide range of logic capacities, characteristics, speeds, and voltage options, and they can be reconfigured at any time to perform many different functions.
In addition, test software faces new problems brought on by the continuing advance of deep-submicron technology and rising frequencies. The ATPG patterns used to test static stuck-at faults are no longer sufficient, yet it is difficult to find the new fault types simply by adding functional patterns to traditional tools. A better approach is to classify the existing functional pattern groups to determine which faults they cannot detect, and then create ATPG patterns that capture those missing fault types.
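The classification step described above is essentially set bookkeeping: subtract the faults covered by the existing functional pattern groups from the full modeled fault list, and target the remainder with new ATPG patterns. A minimal Python sketch, where the fault names and coverage data are illustrative placeholders:

```python
# Full modeled fault list (illustrative stuck-at faults)
all_faults = {"U1/A:SA0", "U1/A:SA1", "U2/Y:SA0", "U2/Y:SA1", "U3/B:SA1"}

# Faults detected by each existing functional pattern group (assumed data)
functional_coverage = {
    "pattern_group_1": {"U1/A:SA0", "U2/Y:SA0"},
    "pattern_group_2": {"U1/A:SA1", "U2/Y:SA0"},
}

detected = set().union(*functional_coverage.values())
undetected = all_faults - detected   # targets for the new ATPG patterns
print(sorted(undetected))            # ['U2/Y:SA1', 'U3/B:SA1']
```

The `undetected` set is exactly the "missing fault types" the text says the new ATPG patterns should capture.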
As design capacity grows and the test time available per transistor shrinks, at-speed testing must be adopted to find speed-related problems and verify circuit timing. At-speed testing must combine several fault models, including the transition fault model, path delay, and IDDQ.
Some companies in the industry believe that a combination of stuck-at, functional, and transition/path-delay fault testing may be the most effective test strategy. For deep-submicron chips and high-frequency operating modes, the transition and path-delay tests are the more important.
To solve the ATE accuracy problem and reduce cost, a new approach must be found: one that simplifies the tester interface (transition and path-delay tests require an accurate clock at the tester interface) while still guaranteeing sufficient signal accuracy during the test.
Because SoC memory blocks may contain manufacturing defects, memory BIST must include a diagnostic function. Once a problem is found, the defective address can be mapped to the redundant memory at a spare address, and the detected faulty address is retired, so the whole expensive chip need not be discarded.
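The repair scheme described above can be sketched as an address remap table: BIST diagnosis marks a failing address, and subsequent accesses to it are steered to a spare (redundant) row. The class, sizes, and addresses below are illustrative, not any vendor's actual repair logic.

```python
class RepairableMemory:
    """Toy model of a memory with BIST-driven redundancy repair."""

    def __init__(self, size, spare_rows):
        self.main = [0] * size
        self.spare = [0] * spare_rows
        self.remap = {}          # defective address -> spare-row index

    def mark_defective(self, addr):
        """Called after BIST diagnosis: map addr to the next free spare row."""
        if addr not in self.remap:
            self.remap[addr] = len(self.remap)

    def write(self, addr, value):
        if addr in self.remap:
            self.spare[self.remap[addr]] = value
        else:
            self.main[addr] = value

    def read(self, addr):
        if addr in self.remap:
            return self.spare[self.remap[addr]]
        return self.main[addr]

mem = RepairableMemory(size=16, spare_rows=2)
mem.mark_defective(5)     # BIST found address 5 faulty
mem.write(5, 0xAB)        # transparently lands in a spare row
print(hex(mem.read(5)))   # 0xab
```

Once the remap table is programmed, reads and writes to the retired address behave normally, which is why the chip can still be shipped.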
Testing small embedded memory blocks requires no additional gating or control logic. For example, vector transformation test technology can convert functional patterns into a series of scan patterns.
Unlike the BIST method, bypassing the memory block's functional inputs requires no additional logic circuitry. Since no extra test logic is needed, SoC development engineers can reuse previously developed test patterns.
Advanced ATPG tools can not only test macros in parallel but also detect conflicts, specifying which macros can be tested in parallel and why the others cannot. Moreover, these macros can be tested effectively even when the macro clock is the same as the scan clock (as with synchronous memories).
Why there are still more good chips than bad chips after the first round (rule: compare chips in pairs; if both report the other as good, keep one chip and discard the other; otherwise discard both):
Suppose there are M good chips and N bad chips (M > N).
A pairwise comparison has only three possible cases: good-good, good-bad, and bad-bad.
Suppose there are A good-bad pairs (one good chip compared with one bad chip); every remaining pair is either good-good or bad-bad.
A good-bad comparison never reports "both good", because the good chip correctly reports the other as bad; so all 2A chips in these pairs are discarded.
In each good-good comparison, one chip is kept and the other discarded, so the number of good chips remaining is M' = (M - A)/2.
In each bad-bad comparison, at least half of the chips are discarded (by the rule, a comparison of two chips discards either one or both of them).
So the number of bad chips remaining is N' <= (N - A)/2.
Since M' = (M - A)/2 > (N - A)/2 >= N', we have M' > N': after the round, good chips still outnumber bad chips.
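The counting argument above can be checked exhaustively for small populations. A minimal Python sketch, assuming the worst case for us (every bad-bad pair keeps one chip) and a few illustrative values of M and N:

```python
def survivors(m, n, a):
    """Chips left after one round with a good-bad pairs:
    good-good pairs each keep one chip; bad-bad pairs keep at most one."""
    good_left = (m - a) // 2
    bad_left = (n - a) // 2
    return good_left, bad_left

# Check every valid pairing for some example populations with m > n
for m, n in [(6, 4), (10, 2), (8, 6)]:
    for a in range(0, n + 1):
        # the leftover good and bad chips must form whole pairs
        if (m - a) % 2 == 0 and (n - a) % 2 == 0:
            g, b = survivors(m, n, a)
            assert g > b, (m, n, a)
print("good chips outnumber bad chips after the round in every case")
```

The strict inequality holds because M - A exceeds N - A by at least 2 whenever both are even and M > N, matching the M' > N' conclusion above.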
Is anything still unclear, OP?