Semiconductor IP News and Trends Blog
Chip Testing Continuum Gets New Voice
Former EDA industry expert makes the case for pre-silicon testing using post-silicon tools. What part will IP play? Will design and test languages be a problem?
Most engineers think of traditional test equipment as bulky, standalone oscilloscopes, digital voltmeters, logic analyzers and the like. But the trend over the last several decades has been toward software-based modular test systems. Today, modular boxes are used to perform every test function imaginable, from signal generation, spectrum analysis and digital data sourcing to bus monitoring and control. These modules are connected together in a bus-oriented backplane and controlled by software on the front end.
Why is this important to the semiconductor chip development process? The software-driven nature of modular instrumentation allows you to build very complex test sequences, which are needed to match the complexity of today’s designs. Chip test, validation and verification activities require the generation of many kinds of signal patterns, from which samples will be obtained and analyzed. The software aspect of these complex test platforms means that this task can be automated via test scripts.
Automated test scripts allow for reuse in both the testing and production phases of the chip development process. Since everything is driven through software, engineers can capture the test results, roll that test data up and have good visibility into what is passing and failing, both in the lab and in production test.
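The script-driven pattern described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the step names, limits and stimulus functions are all hypothetical stand-ins for real instrument calls. Each step drives a stimulus, captures a measurement against pass/fail limits, and the results roll up into a single report.

```python
from dataclasses import dataclass

@dataclass
class TestResult:
    """One measurement captured by an automated test step."""
    name: str
    measured: float
    low: float
    high: float

    @property
    def passed(self) -> bool:
        # A step passes when the measurement falls inside its limits.
        return self.low <= self.measured <= self.high

def run_sequence(steps):
    """Run each (name, stimulus_fn, low, high) step and collect results."""
    results = []
    for name, stimulus_fn, low, high in steps:
        measured = stimulus_fn()  # in practice: drive the instrument, take a sample
        results.append(TestResult(name, measured, low, high))
    return results

def summarize(results):
    """Roll the data up so engineers can see what is passing and failing."""
    failures = [r.name for r in results if not r.passed]
    return {"total": len(results), "failed": failures}

# Hypothetical steps; the lambdas stand in for real instrument reads.
steps = [
    ("vdd_level",  lambda: 1.19, 1.14, 1.26),   # volts
    ("clock_freq", lambda: 98.7, 99.0, 101.0),  # MHz
]
report = summarize(run_sequence(steps))
print(report)  # {'total': 2, 'failed': ['clock_freq']}
```

Because the sequence is just data plus software, the same step list can be rerun unchanged in the lab or on the production floor, which is the reuse argument made above.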
Most everyone will agree that leading-edge system-on-chip (SoC) designs continue to grow in complexity. It is not surprising, then, that validating the behavior of those designs has become more complex as well.
Consider the challenge in building a test environment for a modern mobile SoC that contains six radio subsystems, e.g., LTE, WiFi, Bluetooth, Near Field Communication (NFC) and GPS (see Figure 1). In addition, there will be a variety of general-purpose inputs and outputs, digital camera interfaces, UARTs, intelligent power supplies and more. With the reality of shrinking time-to-market windows, the behavior of all of these interacting subsystems must be validated before everything is implemented in silicon.
Traditionally, test equipment from vendors like National Instruments (NI) has been used only for post-silicon testing, i.e., after the first-article chip has come back from the foundry. At that point, the chip is placed into a physical test fixture to validate the behavior of the digital, analog, RF, bus and other portions of the chip.
But developers are starting to see the applicability of the post-silicon test and verification platform much earlier in the development cycle. The argument is that if test cases could be created earlier, it would then be easier to test chips in the traditional manner, namely, once the first spin of silicon occurs. Further, such a test platform would be very useful for high-volume production testing. To understand the specifics of this proposed idea, it’s important to understand the existing process.
In a typical development process, chip architects start with high-level design algorithms, using a flow diagram or a language-based C model. That model is refined until the design has been decomposed to the component level. This process corresponds to the path along the left-hand side of the “V” development model (see Figure 2). (Reference: “Blacker Boxes Lie Ahead”)
After the components – partitioned into hardware and software – have been created, they get integrated into subsystems. Each of these subsystems is tested and the integration-test process continues until the system is completely built. Once fully tested, validated and verified, the system is ready for production.
“Naturally, there are many different tools used throughout this flow,” notes George Zafiropoulos, formerly with Synopsys, who now serves as VP of Solutions Marketing at National Instruments (NI)-AWR. “On the design side (left-hand side of the ‘V’ diagram), these tools range from high-level models and logic-analog simulation tools through in-circuit emulators and FPGA-based prototyping platforms. On the test and integration side (right-hand side of the ‘V’ diagram), the tools are focused on subsystem and full-system test and verification, including pin-level chip tests. NI plays a large role in these systems.”
Therein lies the crux of the problem, namely, that the existing process requires different test, validation and verification tools at each stage of chip development. Why not bring some of the tasks required in post-silicon testing forward into pre-silicon validation?
“This is really the big concept,” explains Zafiropoulos. “Why not have a continuum of testing? Don’t wait to test until the first tape-out, when you have a whole different set of tools from the earlier validation activities. Instead, try modeling the design in an emulation system like Cadence’s Palladium. This will get the design up and running. Then you could create a test harness around it using an environment like LabVIEW. When the first-article silicon comes back from the foundry, you’ll be able to plug the actual chip into the exact same fixture and test the real chip just like you tested the emulated design.”
This approach will allow developers to share verification intellectual property (VIP) across the development process both in the design and test phases, i.e., throughout both sides of the “V” process. This approach also provides a logical path for the transition between these two major activities.
Here’s one way this approach might work. For the early design of a chip, there will be a software model of the test environment, i.e., a test bench. Further, there will be a software model of the Design-Under-Test (DUT), perhaps a logic simulator connected to a behavioral model of the test bench. The next step would be to move that DUT model into a hardware platform like an emulator, which ensures better performance while keeping the test environment that was developed earlier in software. The next logical step would be to replace the software test bench with hardware stimulus that drives electrical signals into the DUT. That way, when the chip is fabricated, it can be plugged into the same hardware test wrapper as before, and the same test cases can be run on the actual, physical silicon. Taken together, this would be a very natural migration, observed Zafiropoulos.
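The key enabler of this migration is that the test cases talk to the DUT through a stable interface while the back end behind it changes, from simulator to emulator to physical fixture. A hedged sketch of that idea follows; the class and method names are illustrative, not any vendor's API, and the emulator back end is stubbed out.

```python
from abc import ABC, abstractmethod

class DUT(ABC):
    """Abstract DUT interface: test cases depend only on this."""
    @abstractmethod
    def write(self, addr: int, value: int) -> None: ...
    @abstractmethod
    def read(self, addr: int) -> int: ...

class SimulatedDUT(DUT):
    """Early phase: a pure-software model of the design's register map."""
    def __init__(self):
        self.regs = {}
    def write(self, addr, value):
        self.regs[addr] = value
    def read(self, addr):
        return self.regs.get(addr, 0)

class EmulatedDUT(DUT):
    """Later phase: same interface, but calls would drive an emulator
    (or, after tape-out, the physical test fixture). Stubbed here."""
    def write(self, addr, value):
        raise NotImplementedError("would drive the emulator/fixture")
    def read(self, addr):
        raise NotImplementedError("would sample the emulator/fixture")

def register_loopback_test(dut: DUT) -> bool:
    """One test case, reusable unchanged across every DUT back end."""
    dut.write(0x10, 0xA5)
    return dut.read(0x10) == 0xA5

print(register_loopback_test(SimulatedDUT()))  # True
```

Swapping `SimulatedDUT` for an emulator-backed or silicon-backed implementation leaves `register_loopback_test` untouched, which is exactly the reuse across both sides of the “V” that the continuum argument depends on.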
What is preventing this migration from happening? It’s not the technology. Rather, one of the biggest challenges is that large organizations are separated into silos for the test, validation and verification of chips. This means that decisions about the chip’s design, test and validation methodologies come from very different, siloed departments. Overcoming these silos won’t be easy, but it will lead to a much more efficient testing continuum from architecture design to production.
Confirmation of this siloed development mindset was highlighted by one question echoed by many in the audience at the conclusion of Zafiropoulos’s presentation. Most of the attendees were chip designers well versed in pre-silicon tools but unfamiliar with most post-silicon testers. For example, verification languages were an issue for those who had never used Mindstorm or other NI graphical tool suites. Zafiropoulos noted that most chip designers create models in VHDL that are then implemented in chips or FPGAs. The NI testing tools can use high-level tools like MATLAB and languages like C to drive the instrumentation test systems.
Further, more direct ties to VHDL are being developed at NI.
Such tool questions were why this NI proposal was presented at DAC. While combining traditional testing tools with chip development is not a new idea (remember the Agilent and Cadence RF suites?), it is starting to take place in new and different ways with NI. Look for more progress in the near future.