
Semiconductor IP News and Trends Blog
Chip Patterns Look Like Salvador Dali Painting
Nanoscale variability requires more simulations for statistical analysis. Tanner EDA discusses this impact on traditional analog and RF IP designs.
This is a follow-on to the earlier story ("Trends In Analog And RF IC Simulation") in which Nicolas Williams, Tanner EDA's Director of Product Management, adds to the ongoing discussion about trends in analog and RF chip design.
Blyler: What are the trends in analog and RF simulation?
Williams: The biggest trend is the increased need to bring layout-dependent information into the front-end design early on. Layout-dependent effects influence performance, so it is no longer possible to separate the "design" phase from the "layout" phase as we did traditionally. With nanoscale technologies, a multitude of physical device-pattern spacing dimensions must now be entered into the pre-layout simulation models to accurately predict post-layout circuit performance. This goes beyond adding stray capacitance to a few nodes; it now includes accurate gate-to-gate distances, gate-to-trench distances (SA, SB, etc.), spacing in both X and Y between device active areas, the distance from the gate contact to the channel edge (XGW), the number of gate contacts (NGCON), the distance to a well edge (for the well proximity effect, WPE), and so on. Entering these pre-layout parameters accurately into the simulation minimizes the re-design and re-layout caused by performance deficiencies found during post-layout parameter extraction and design-verification simulations.
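As a rough illustration of what "entering layout information pre-layout" can look like, the sketch below builds BSIM4-style device instance lines that carry estimated geometry parameters (SA, SB, XGW, NGCON) alongside W and L. The exact parameter names and netlist syntax depend on the foundry model and simulator; the helper function and the values here are hypothetical.

```python
# Sketch: annotate pre-layout MOSFET instances with estimated layout-dependent
# parameters so the first simulation already reflects the intended layout.
# Parameter names (SA, SB, XGW, NGCON) follow BSIM4-style instance parameters;
# the syntax and values below are illustrative, not taken from a real PDK.

def mosfet_card(name, d, g, s, b, model, w, l, layout_est):
    """Return a SPICE-style instance line with layout estimates attached."""
    params = f"W={w} L={l} " + " ".join(f"{k}={v}" for k, v in layout_est.items())
    return f"M{name} {d} {g} {s} {b} {model} {params}"

# Hypothetical pre-layout estimates for one device (lengths in meters).
layout_estimate = {
    "SA": 0.27e-6,   # estimated gate-to-trench spacing, source side
    "SB": 0.27e-6,   # estimated gate-to-trench spacing, drain side
    "XGW": 0.12e-6,  # estimated gate-contact to channel-edge distance
    "NGCON": 2,      # planned number of gate contacts
}

print(mosfet_card("N1", "out", "in", "0", "0", "nch_lvt", 2e-6, 60e-9, layout_estimate))
```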
Another issue is larger variability at the nanoscale. This is not so much due to manufacturing tolerance as to layout-dependent effects. These include the effects listed above plus several that are not even modeled, such as nearby and overlying metal stress modifying Vt and gm, and poor lithography. The lithography challenges are so severe in deep-nanoscale processes that device patterns on final silicon look like they were drawn by Salvador Dali. Poor pattern shapes, increasing misalignment, and the dependence of each shape on nearby patterns result in more gate-length and gate-width variation. More variability requires more complex simulations to maintain confidence in your design, and that requires faster simulators to run more corners or more Monte Carlo runs.
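To make the "more Monte Carlo runs" point concrete, here is a minimal sketch (not Tanner's tool flow) that perturbs a toy square-law device model with gate-length, width, and threshold-voltage variation, and shows how the confidence interval on an estimated yield narrows as the run count grows. The variation sigmas and the spec limit are invented for illustration.

```python
# Sketch: Monte Carlo over device variation with a toy square-law drain-current
# model, showing how many runs it takes to tighten a yield estimate.
# All sigmas and the spec limit are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def drain_current(w, l, vth, vgs=0.9, k=200e-6):
    """Toy square-law model standing in for a real SPICE simulation."""
    return 0.5 * k * (w / l) * (vgs - vth) ** 2

W0, L0, VTH0 = 2e-6, 60e-9, 0.35               # nominal device
SPEC_MIN = 0.8 * drain_current(W0, L0, VTH0)   # hypothetical spec: Id >= 80% of nominal

for n in (100, 1_000, 10_000):
    w = rng.normal(W0, 0.02 * W0, n)     # width variation (illustrative sigma)
    l = rng.normal(L0, 0.05 * L0, n)     # length variation from litho effects
    vth = rng.normal(VTH0, 0.02, n)      # Vt variation (e.g., WPE, stress)
    ids = drain_current(w, l, vth)
    yield_est = np.mean(ids >= SPEC_MIN)
    # Standard error of a yield estimate shrinks as 1/sqrt(n).
    se = np.sqrt(yield_est * (1 - yield_est) / n)
    print(f"n={n:6d}  yield~{yield_est:.3f}  +/-{1.96 * se:.3f} (95% CI)")
```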

Blyler: Statistical analysis, design-of-experiments, and corner models – digital designers already hear many of these terms from the yield experts in the foundries. Should they now expect to hear them from the analog and RF simulator communities?
Williams: Statistical analysis and corner models have always been part of analog and RF design, but in the past it didn't take much to try all combinations; there was no need to take a sample of the population when you could check the entire population. In nanoscale technologies, the number of effects that can influence circuit performance has grown exponentially, to the point where you have to take a statistical approach when checking corners. The older alternative, running the worst-case combinations of all design corners from all effects, would produce an overly pessimistic result. And when the number of Monte Carlo simulations required to statistically represent your circuit grows too large, that is where design-of-experiments comes into play, using methods such as Latin Hypercube sampling.
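As a small illustration of the sampling idea (not a description of Tanner's implementation), the sketch below uses SciPy's Latin Hypercube generator to spread a modest number of runs evenly across three hypothetical process parameters, which typically covers the space better than the same number of purely random draws.

```python
# Sketch: Latin Hypercube sampling of three process parameters so that a small
# number of runs still covers the design space evenly.
# Parameter names and ranges are hypothetical placeholders.
import numpy as np
from scipy.stats import qmc

params = ["tox_nm", "vth0_V", "leff_nm"]      # illustrative process knobs
lower = np.array([1.0, 0.30, 55.0])           # made-up lower bounds
upper = np.array([1.2, 0.40, 65.0])           # made-up upper bounds

sampler = qmc.LatinHypercube(d=len(params), seed=1)
unit_samples = sampler.random(n=16)           # 16 runs instead of thousands
runs = qmc.scale(unit_samples, lower, upper)  # map [0,1)^d onto the real ranges

for i, run in enumerate(runs):
    settings = dict(zip(params, np.round(run, 4)))
    print(f"run {i:02d}: {settings}")          # each row becomes one simulation
```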
Simulation accuracy is limited by model accuracy. Statistical variation of devices and parameters is now more richly specified than in the traditional SPICE approach to Monte Carlo (where you had only "lot" and "device" parameters). Now you have spatially correlated variations, and you have the much richer .variation blocks in SPICE. Foundries are now expected to provide usable models at this level, which raises all kinds of foundry-proprietary concerns.
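The "spatially correlated variations" idea can be illustrated in a few lines: devices that sit close together on the die get threshold-voltage shifts that track each other, while distant devices drift more independently. The correlation length and sigma below are invented for illustration; real values come from the foundry's statistical model.

```python
# Sketch: spatially correlated threshold-voltage variation.
# Devices close together receive similar Vt shifts; distant devices vary more
# independently. Correlation length and sigma are illustrative, not foundry data.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical device placements on the die, (x, y) in micrometers.
positions = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [50.0, 50.0]])
sigma_vt = 5e-3      # 5 mV local Vt sigma (made-up)
corr_length = 5.0    # correlation length in micrometers (made-up)

# Exponential spatial covariance: nearby devices are strongly correlated.
dist = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
cov = (sigma_vt ** 2) * np.exp(-dist / corr_length)

# One Monte Carlo draw of correlated Vt shifts for all four devices.
vt_shift = rng.multivariate_normal(mean=np.zeros(len(positions)), cov=cov)
for (x, y), dv in zip(positions, vt_shift):
    print(f"device at ({x:5.1f}, {y:5.1f}) um: dVt = {1e3 * dv:+.2f} mV")
```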
Blyler: How will this increase in statistical distribution analysis affect traditional analog electronic circuit simulators like Spice?
Williams: Statistical analysis requires a huge number of simulations. These can either take a long time to execute or be parallelized across CPU farms or cloud services, and they call for smarter ways to choose which "corners" to run in order to get reasonable confidence that you will be successful in silicon. Traditionally, aggregating such results was a manual process, or at best a custom design-flow development effort undertaken by the end user. Look for an upcoming sea change in how simulators are designed, sold, and deployed by the EDA vendor community to better address these needs.
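A minimal sketch of the parallelization point: farm independent corner runs out to worker processes and aggregate the results when they finish. The run_one_corner function below is a hypothetical stand-in for invoking a real simulator; only the orchestration pattern is the point.

```python
# Sketch: run independent simulation corners in parallel and aggregate results.
# run_one_corner is a hypothetical stand-in for launching a real SPICE job
# (locally, on a compute farm, or in the cloud).
from concurrent.futures import ProcessPoolExecutor

def run_one_corner(corner):
    """Pretend to simulate one corner and return a measured gain in dB."""
    proc, temp_c, supply_v = corner
    # Toy stand-in for a simulator call: gain degrades with temperature and
    # low supply, and the slow process corner loses a little more.
    gain_db = 40.0 - 0.01 * abs(temp_c - 27) + 2.0 * (supply_v - 1.8)
    if proc == "ss":
        gain_db -= 0.5
    return proc, temp_c, supply_v, gain_db

corners = [(proc, temp, vdd)
           for proc in ("tt", "ss", "ff")
           for temp in (-40, 27, 125)
           for vdd in (1.62, 1.8, 1.98)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:        # one worker per CPU by default
        results = list(pool.map(run_one_corner, corners))
    worst = min(results, key=lambda r: r[-1])
    print(f"{len(results)} corners simulated; worst gain "
          f"{worst[-1]:.2f} dB at {worst[:3]}")
```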
All these simulations are great if your design meets all of its specifications. But what happens if it doesn't? I feel the next step will be to use these simulations to figure out which variables your design is most sensitive to. Then you can try to mitigate the variability by improving the circuit or the physical design (layout).
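One simple way to extract that sensitivity information from Monte Carlo data (a generic technique, not necessarily what Tanner's tools do) is to rank-correlate each varied parameter against the output metric; the sketch below uses Spearman correlation on synthetic samples.

```python
# Sketch: rank which varied parameters an output metric is most sensitive to,
# using Spearman rank correlation on Monte Carlo samples. Data are synthetic.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 2_000

# Synthetic Monte Carlo inputs (normalized variations of three parameters).
samples = {
    "leff": rng.normal(0, 1, n),
    "vth0": rng.normal(0, 1, n),
    "tox": rng.normal(0, 1, n),
}
# Synthetic output metric: strongly driven by vth0, weakly by leff, plus noise.
metric = -0.9 * samples["vth0"] - 0.2 * samples["leff"] + rng.normal(0, 0.3, n)

ranking = sorted(
    ((name, spearmanr(vals, metric)[0]) for name, vals in samples.items()),
    key=lambda kv: abs(kv[1]),
    reverse=True,
)
for name, rho in ranking:
    print(f"{name:5s}  |rho| = {abs(rho):.2f}")   # biggest lever first
```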
Blyler: Thank you.