
The need to design for uncertainty

Posted: 01 Jul 2005

Keywords: design for variability, ISPD, EDP, uncertainty, optical proximity correction

If there was a message for chip designers from two recent industry conferences, it was this: Get used to uncertainty. The era of "design for variability" is here.

At both the International Symposium on Physical Design (ISPD) and the Electronic Design Processes (EDP) conferences, speakers noted the growing impact of process, temperature and voltage variations on IC designs. And they weren't just talking about DFM, which is the label that usually gets stuck on things when we start talking about variability.

The problem with the DFM moniker is that it implies designers are supposed to do something for the manufacturing people, such as putting optical proximity correction (OPC) into chip layouts. Design for variability is certainly aimed at producing manufacturable designs, but it could also apply to the chip designer who just wants to find out how gate-width variations might affect his leakage current (which, as it turns out, is a problem right now at 130nm and 90nm).
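One way to see why leakage is so touchy is a toy model, sketched below in Python with invented constants rather than calibrated process data: subthreshold leakage is exponential in threshold voltage, and threshold voltage rolls off as a gate prints short, so a perfectly symmetric variation in the printed dimension produces a lopsided leakage penalty.

    import math
    import random

    # Toy leakage model: subthreshold leakage is exponential in threshold
    # voltage, and threshold voltage rolls off as the printed gate length
    # shrinks (a short-channel effect). Every constant here is an
    # illustrative assumption, not calibrated 90nm process data.
    VT_THERMAL = 0.026    # thermal voltage kT/q at room temperature, volts
    N_SLOPE = 1.5         # subthreshold slope factor (assumed)
    VTH_NOMINAL = 0.30    # nominal threshold voltage, volts (assumed)
    DVTH_PER_NM = 0.004   # Vth roll-off per nm of printed-length loss (assumed)
    L_NOMINAL = 90.0      # nominal printed gate length, nm
    SIGMA_L = 3.0         # 1-sigma gate-length variation, nm (assumed)

    def leakage(l_gate):
        """Off-current (arbitrary units) for a given printed gate length."""
        vth = VTH_NOMINAL - DVTH_PER_NM * (L_NOMINAL - l_gate)
        return math.exp(-vth / (N_SLOPE * VT_THERMAL))

    random.seed(1)
    draws = [leakage(random.gauss(L_NOMINAL, SIGMA_L)) for _ in range(100_000)]
    nominal = leakage(L_NOMINAL)
    mean_ratio = sum(draws) / len(draws) / nominal
    tail_ratio = leakage(L_NOMINAL - 3 * SIGMA_L) / nominal

    print(f"mean leakage vs. nominal:       {mean_ratio:.2f}x")  # ~1.05x
    print(f"3-sigma-short gate vs. nominal: {tail_ratio:.2f}x")  # ~2.5x

With these assumed numbers, the average die leaks only slightly more than nominal, but a gate that prints three sigma short leaks roughly two and a half times nominal. The asymmetry is the point: exponential sensitivity means variation never averages out in the designer's favor.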

The other topic that usually comes up in connection with variability is statistical timing analysis, but that's just one of a number of approaches to the problem. And it's not yet clear when the necessary statistical models will be available, or who will need statistical analysis at what process nodes.
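To make the statistical idea concrete, here is a minimal sketch, not any vendor's algorithm: treat each gate delay along a path as an independent Gaussian, so means and variances add along the path, and the statistical 3-sigma bound comes out well below the sum of per-gate worst cases. All delay numbers are invented for illustration, and real tools must also handle correlation and the max() of converging paths, which this sketch ignores.

    import math

    # Toy statistical timing: each gate delay on a path is an independent
    # Gaussian (mean, sigma) in picoseconds. Numbers are illustrative.
    path = [(50.0, 5.0), (40.0, 4.0), (60.0, 6.0), (45.0, 5.0)]

    mean = sum(m for m, _ in path)                  # means add along a path
    sigma = math.sqrt(sum(s * s for _, s in path))  # variances add if independent

    corner_worst = sum(m + 3 * s for m, s in path)  # every gate at its 3-sigma max
    stat_worst = mean + 3 * sigma                   # 3-sigma bound on the path sum

    print(f"corner-style worst case: {corner_worst:.1f} ps")  # 255.0 ps
    print(f"statistical 3-sigma:     {stat_worst:.1f} ps")    # ~225.3 ps

The 30ps gap between the two answers is pessimism the corner method bakes in by assuming every gate is simultaneously at its worst, an event far rarer than 3-sigma.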

Sources of uncertainty, or variability, in chip designs are many. As noted at EDP, process variations that affect design include critical dimensions, channel width, interconnect and threshold voltages. Supply voltage and clock skew variations are also growing more significant. And what about temperature? Not many people think about that, but one paper at EDP showed how a thermal gradient of 10°C can change timing delays by 30 percent.
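That temperature number is worth a back-of-envelope check. Taking the quoted sensitivity at face value (30 percent over 10°C, or about 3 percent per degree) and an assumed 200ps clock branch, a gradient between two otherwise identical branches turns directly into skew:

    # Back-of-envelope skew from a thermal gradient: two identical clock
    # branches, one 10 degrees C hotter than the other. The 3%-per-degree
    # sensitivity is read off the EDP figure quoted above; the 200ps
    # branch delay is an assumed, illustrative number.
    BRANCH_DELAY_PS = 200.0   # nominal branch delay (assumed)
    SENSITIVITY = 0.03        # fractional delay change per degree C
    DELTA_T = 10.0            # gradient between branches, degrees C

    skew_ps = BRANCH_DELAY_PS * SENSITIVITY * DELTA_T
    print(f"skew from a {DELTA_T:.0f} degree gradient: {skew_ps:.0f} ps")  # 60 ps

Sixty picoseconds of skew from temperature alone is a meaningful bite out of a nanometer-era clock budget.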

One way to think about design and process variability was outlined in an ISPD presentation. That talk distinguished between operational, or global, variability, which stems from the different operating modes in which a product might run, and local variability. The latter category includes interconnect variations in thickness and width, as well as cell variations driven by process, temperature and voltage fluctuations and by on-chip variation.

On the manufacturing side, speakers distinguished between random variations, such as those caused by particle defects, and systematic variations, such as shorts or opens caused by printing misalignment. Although often lumped together, the two require different approaches.

Chip designers can manage some of the variability with min/max corner analysis, while other concerns may require a statistical approach. Because their fluctuations are large, temperature and voltage will probably still require corner-case analysis, one speaker said.

But the argument for looking at timing, and ultimately power, in terms of statistical probability distributions is strong. There are only so many corners one can model.
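That last point is easy to quantify. With min/max extremes for each independent variation source, the corner count doubles with every source added; the sketch below uses an illustrative source list, and real sign-off corner sets are of course chosen more selectively.

    # "Only so many corners": min/max extremes for n independent
    # variation sources give 2**n corners to analyze.
    sources = ["process", "supply voltage", "temperature",
               "interconnect width", "interconnect thickness", "Vth"]
    for n in range(1, len(sources) + 1):
        print(f"{n} variation sources -> {2 ** n:3d} corners")

    # A statistical run replaces the exhaustive sweep with a fixed sample
    # budget (say, 1,000 Monte Carlo draws) however many sources are
    # modeled; the cost shifts to building trustworthy distributions.

Six sources already means 64 corners, and every corner is a full timing run. Sampling from distributions keeps the analysis cost flat, which is precisely why the missing statistical models matter so much.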

When chip design gets into the realm of the very small, we must accept uncertainty and start thinking in terms of probabilities. It's a new way of thinking that will reshape the nanometer IC design and manufacturing flow.

- Richard Goering

EE Times




