Hope seen for taming IC process variability at 65nm

Posted: 01 Jun 2006

Keywords: Richard Goering, chip design, International Symposium on Physical Design, 65nm, IC process

Intensive CAD research by universities and semiconductor manufacturers is yielding new solutions for grappling with IC process variability. Presentations at the recent International Symposium on Physical Design (ISPD) offered new hope for chip design at 65nm and below, where temperature, voltage and process variations can have a dramatic impact on chip timing, manufacturability and yield.

Design tools are just beginning to address variability, and statistical process data is generally not available from foundries. One paper proposed a way to close this foundry-to-designer gap. The paper showed how fabs can model process variations based on spatial correlations, providing measured data that can be used by tools such as statistical timing analyzers. Listing authors from UCLA and IBM Research, it received the ISPD 2006 "best paper" award.

Process variability was the largest single theme at the conference, said ISPD 2006 general chair Lou Scheffer. "As processes shrink, variability is getting to be a concern for everyone," said Scheffer, a Cadence Design Systems Inc. fellow.

"Variability is a first-class design concern," said keynote speaker Ted Vucurevich, CTO at Cadence. He noted that gate oxides are so thin that a change of one atom can cause a 25 percent difference in substrate current. Further, Vucurevich said, the modes in which a device operates have become a source of variability. A cellphone chip exhibits different "hot spots" depending on whether it's taking a call, playing a video or displaying pictures.

Handling these challenges, Vucurevich said, will require a next-generation EDA architecture. Contemporary CAD architectures, he said, are still sequential in nature, with synthesis, placement and routing as distinct steps. What's needed is a "peer-to-peer" approach in which physical, electrical and logical perspectives can be presented all at once. To create it, concurrent applications must interact in shared memory and "collaborate at high performance."

Spatial correlation
In presenting the ISPD "best paper" on extraction of spatial correlation, UCLA PhD student Jinjun Xiong said process variations in nanometer manufacturing can have a huge impact on design optimization and signoff. He cited previous work showing that process variations can affect timing yield by 20 percent and leakage power yield by 25 percent, and that they can account for a 20 percent difference in area and a 17 percent difference in power in circuit tuning.

The UCLA work focuses on intradie spatial variation, in which nominally identical devices on the same die may be manufactured differently. Spatial correlation is important, Xiong said, because the closer two devices are located, the higher the probability that they are similar. Earlier work has suggested that failure to take spatial correlation into account can make a 30 percent difference in timing, and that spatial variation may be 40 percent to 65 percent of total variation.
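To make the distance dependence concrete, here is a minimal numerical sketch using an exponential-decay correlation model; the functional form and the 500-micron correlation length are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# Illustrative model: correlation between two devices decays with their
# separation d as rho(d) = exp(-d / L), where L is a characteristic
# correlation length. Both the form and L = 500 um are assumptions here.
def spatial_correlation(d_um: float, corr_length_um: float = 500.0) -> float:
    return float(np.exp(-d_um / corr_length_um))

for d in (10, 100, 500, 2000):  # device separations in microns
    print(f"separation {d:5d} um -> correlation {spatial_correlation(d):.3f}")
```

Under such a model, neighboring devices come out almost perfectly matched while devices at opposite ends of a large die are nearly independent, which is why dropping the spatial term distorts timing estimates.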

Statistical timing-analysis techniques assume the required spatially correlated information is known a priori. But the only accurate way to get this data is to extract it from silicon measurements, the authors contend. "There's a missing link, and that's a technique to extract a valid spatial correlation model," Xiong said. "That missing link is our work."

The paper outlines a "robust" spatial correlation extraction technique that accounts for unavoidable measurement noise. Experimental results based on Monte Carlo analysis show errors of less than 10 percent in the extracted process variations. "EDA tools need data," Cadence's Scheffer said. "Fabs have data, but no models. The idea of extracting a usable model from data the foundry already has is a big contribution."
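The shape of the extraction problem can be sketched generically. The toy below is not the authors' algorithm; the site layout, noise level and underlying correlation model are all assumptions. It simulates noisy measurements at a few die sites, averages sample correlations at each separation distance, then repairs the result, since a distance-binned estimate need not form a valid (positive-semidefinite) correlation matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four measurement sites on a line, 300 um apart (illustrative layout).
pos = np.array([0.0, 300.0, 600.0, 900.0])
dist = np.abs(np.subtract.outer(pos, pos))
true_corr = np.exp(-dist / 500.0)   # assumed "true" spatial correlation

# Simulate silicon measurements: correlated device variation plus
# independent measurement noise, which biases the raw correlations.
chol = np.linalg.cholesky(true_corr)
meas = (chol @ rng.standard_normal((4, 400))).T
meas += 0.3 * rng.standard_normal(meas.shape)
sample_corr = np.corrcoef(meas, rowvar=False)

# Average the sample correlation over all site pairs at each separation
# distance to get a correlation-vs-distance model.
model = np.zeros_like(true_corr)
for d in np.unique(dist):
    model[dist == d] = sample_corr[dist == d].mean()

# A distance-binned estimate need not be a valid (positive-semidefinite)
# correlation matrix; clip negative eigenvalues to enforce validity.
w, v = np.linalg.eigh(model)
valid = v @ np.diag(np.clip(w, 0.0, None)) @ v.T

print("max extraction error vs. true model:", np.abs(valid - true_corr).max())
print("valid model is PSD:", np.linalg.eigvalsh(valid).min() >= -1e-12)
```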

A second UCLA paper described an algorithm for buffer insertion that considers process variations. Compared with the traditional deterministic approach, the authors report a 15 percent improvement in timing yield.

Smoothing distributions
Many consider statistical timing analysis the next big step in IC physical design, but it has some limitations. Existing algorithms generally assume that process parameters have smooth Gaussian distributions and that variable operations are linear. Real silicon isn't that way.
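The limitation is easy to demonstrate numerically. In the sketch below, with invented coefficients, each path delay follows the common first-order "canonical" form, a linear combination of Gaussian parameters, and is itself Gaussian; but the max() taken where paths converge is already skewed, before any non-Gaussian parameter enters.

```python
import numpy as np

rng = np.random.default_rng(2)

def skew(x: np.ndarray) -> float:
    """Third standardized moment; 0 for a perfect Gaussian."""
    z = (x - x.mean()) / x.std()
    return float((z ** 3).mean())

# First-order canonical delay model (illustrative coefficients, in ps):
# delay = nominal + a1 * dL + a2 * dVth, with dL, dVth ~ N(0, 1).
dL, dVth = rng.standard_normal((2, 100_000))
path_a = 500.0 + 20.0 * dL + 10.0 * dVth
path_b = 505.0 - 15.0 * dL + 12.0 * dVth

print("skew of path_a:             ", skew(path_a))  # ~0: still Gaussian
print("skew of max(path_a, path_b):", skew(np.maximum(path_a, path_b)))  # > 0
```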

Charlie Cheng, professor of electrical and computer engineering at the University of Wisconsin-Madison, proposed a quadratic way to model non-Gaussian parameters and a systematic way to analyze "confidence intervals" for the statistical distribution of parameters. A 97 percent confidence interval, for example, indicates a 97 percent probability that the interval contains the "true mean" value. A parameter may be non-Gaussian, but the estimate of its mean can be close to Gaussian.
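For reference, the underlying arithmetic is the textbook z-interval for a mean (not Cheng's quadratic model): with sample mean x-bar, sample deviation s and n samples, the 97 percent interval is x-bar +/- 2.17*s/sqrt(n), and the central limit theorem keeps it honest even when the parameter itself is non-Gaussian.

```python
import numpy as np

rng = np.random.default_rng(3)

# A deliberately non-Gaussian parameter sample (exponential; illustrative).
x = rng.exponential(scale=1.0, size=400)

# By the central limit theorem the sample mean is near-Gaussian, so a
# z-based interval for the true mean is still reasonable.
z_97 = 2.17                                  # two-sided z for 97% confidence
half_width = z_97 * x.std(ddof=1) / np.sqrt(x.size)
print(f"97% confidence interval for the mean: {x.mean():.3f} +/- {half_width:.3f}")
```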

Ann Gattiker, a research staff member at IBM Research, said the ability to detect defects before parts ship is crucial but often overlooked. She presented the notion of shipped-product quality level (SPQL), a metric that tracks the ratio of bad parts to total parts shipped.

"Improving yield improves SPQL, but doesn't get you all the way there. We need to consider the detectability of failure mechanisms at test. Work with us to make things testable," Gattiker told the assembled CAD researchers.

- Richard Goering
EE Times



