EE Times-Asia

Researchers call for fundamental shift in timing analysis

Posted: 09 Dec 2002

Keywords: IC design, interconnect, digital circuits, ACM/IEEE, Tau workshop

Calling for a new approach to the design of digital circuits, researchers at the ACM/IEEE Tau workshop presented strong arguments for a move to statistical, or "probabilistic," timing analysis. Such an approach would mark a significant departure from the static timing analysis that underlies today's IC design flow.

Presenters said that a statistical approach is the only way to accurately account for device, interconnect and process variations. The technique, they said, can reduce the excessive number of analysis runs required for timing closure, and is more accurate. But they acknowledged that the tools are not here yet, and that some technical problems remain.

"It is not just that we need to tune up static timing analysis," said Kurt Keutzer, professor of electrical engineering and computer science at the University of California at Berkeley. "I think we need to fundamentally rethink the way we design circuits." Keutzer urged designers to envision chips not as deterministic devices, but as "stochastic computing mechanisms."

"I believe the era of probabilistic design is here, and that deterministic design is gone," said Chandu Visweswariah, research staff member at IBM Corp.'s Thomas J. Watson Research Center. He called for a statistical approach to modeling and methodology as well as analysis.

But Avi Efrati, system architect for performance verification at Intel Israel, sounded a note of caution. "Static timing analysis is a key component of chip design," Efrati said. "It will evolve to support other things, but I am not sure a revolution is coming so quickly."

Fixed vs. random

Static timing analysis today is deterministic, meaning that the analysis uses fixed delays for gates and wires and does not consider statistical variations in the underlying silicon. In the current methodology, best-case, worst-case, and nominal parameter sets are constructed using Spice simulation. The timing analyzer then runs several times to report the resulting numbers.
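The corner-based flow described above can be sketched in a few lines; the gate delays and corner names below are invented for illustration, not taken from any real library.

```python
# A minimal sketch of corner-based static timing analysis: each gate on a
# path has one fixed delay per process corner, and the path delay is just
# the sum of those fixed numbers. All delay values here are illustrative.

# Per-gate delays (ns) characterized at three corners via Spice-style runs.
gate_delays = {
    "best":    [0.08, 0.10, 0.07, 0.09],
    "nominal": [0.10, 0.12, 0.09, 0.11],
    "worst":   [0.13, 0.15, 0.12, 0.14],
}

def path_delay(corner):
    """Deterministic path delay: a plain sum of fixed gate delays."""
    return sum(gate_delays[corner])

# One timing run per corner, as in the current methodology.
for corner in ("best", "nominal", "worst"):
    print(f"{corner:>7}: {path_delay(corner):.2f} ns")
```

The key limitation is visible in the code: nothing in it models variation, so the analyzer must be re-run once per corner, and every gate on the chip is implicitly assumed to sit at the same corner simultaneously.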

A statistical approach would use random variables, not fixed delays. And rather than lists of best- and worst-case numbers, it would produce a statistical distribution. It could, for instance, tell the designer that 50 percent of his or her circuits will run at 225MHz. In this sense, yield and timing prediction become pretty much the same thing.
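A hedged sketch of that idea: treat each gate delay as a random variable, and the path delay (hence the achievable frequency) becomes a distribution rather than a single number. The nominal delays and the 8 percent sigma below are assumptions chosen purely for illustration.

```python
# Statistical timing sketch: gate delays drawn from distributions, path
# delay reported as a distribution. All parameters here are invented.
import random

random.seed(0)
NOMINAL = [0.10, 0.12, 0.09, 0.11]   # nominal gate delays, ns
SIGMA_FRAC = 0.08                    # assumed 8% per-gate std deviation

def sample_path_delay():
    # Each gate delay drawn independently around its nominal value.
    return sum(random.gauss(d, SIGMA_FRAC * d) for d in NOMINAL)

samples = sorted(sample_path_delay() for _ in range(10000))
median_ns = samples[len(samples) // 2]
median_mhz = 1000.0 / median_ns      # convert ns cycle time to MHz

# The median answers "what speed will 50 percent of circuits reach?" --
# timing prediction and yield prediction collapse into one question.
print(f"50% of circuits run at {median_mhz:.0f} MHz or faster")
```

Any percentile of the same sorted list gives the corresponding yield point, which is exactly the sense in which yield and timing prediction merge.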

While static timing analysis handles die-to-die variations well, it cannot accurately model variations within a single die, researchers said. Process variations are perhaps the most obvious problem, but there is also a "fundamental randomness" in the behavior of silicon structures, said Keutzer.

For example, Keutzer noted, gates vary according to width, length, threshold voltages, oxides and doping. Interconnects vary according to line width, metal thickness, and interlayer dielectric thickness. And parametric delay faults occur due to random particles landing on chips.

"If you try to sweep all these variations under the rug of worst-case models, the result will be overly conservative," he said. "Worse yet, it is not reliable. Static timing analysis cannot catch violations due to uncorrelated path delays." Static timing analysis assumes gate and wire delays are perfectly correlated across a chip, which is not the case, he said.

As a result, Keutzer said, it may take six weeks to close timing on a 200MHz ASIC at 0.13µm design rules - and then, when the chip comes back, designers might find it actually runs at 250MHz.

But Keutzer noted that some technical challenges remain before statistical timing analysis is practical. One is economical delay testing. "The report says that 50 percent of my chips will run at 225MHz, but which 50 percent?" he asked. "You've got to test each chip at speed."

At IBM, Visweswariah said, designers want to predict parametric or "circuit-limited" yield loss. What's needed is a combination of statistical timing, yield prediction, design centering and design-for-manufacturability techniques, he said. Visweswariah used charts and graphs to show that worst-case static timing analysis is "quirky at best" when it comes to shedding light on such comparisons as yield vs. slack.

Static timing analysis, he noted, takes in netlists, assertions, delays and slew models, and produces reports on slack and diagnostics. A statistical analysis would also take information on sources of variability, and provide a yield curve along with diagnostics, he said.
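The yield-curve output Visweswariah describes can be sketched from a sampled cycle-time distribution; the Gaussian parameters below are assumptions for illustration only.

```python
# Sketch of a statistical analyzer's headline output: instead of a single
# slack number, a yield curve -- the fraction of manufactured chips
# expected to meet each target frequency. Distribution is invented.
import random

random.seed(2)
# 10,000 sampled chip cycle times (ns), mean 4.0 ns, sigma 0.2 ns.
samples_ns = sorted(random.gauss(4.0, 0.2) for _ in range(10000))

def yield_at(freq_mhz):
    """Fraction of chips whose cycle time fits within one clock period."""
    budget_ns = 1000.0 / freq_mhz
    meets = sum(1 for t in samples_ns if t <= budget_ns)
    return meets / len(samples_ns)

for f in (225, 240, 250, 260):
    print(f"{f} MHz: {100 * yield_at(f):.1f}% yield")
```

Sweeping the target frequency traces out the full yield curve, making trade-offs like yield vs. slack explicit rather than "quirky at best."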

ASIC designers do not want a huge methodology change, but they will warm to statistical techniques once it is shown that they can substantially reduce timing runs, Visweswariah said.

Intel's Efrati spoke of other concerns. He said Intel wants better modeling for the impact of crosstalk on noise and timing, with an approach that allows a trade-off between accuracy and run-time. He said timing analysis needs to consider cells, devices and abstract models in the same run, with a single timing graph.

Efrati also called for better support for multiple-input switching, and for the sleep transistors used for leakage control. "Crosstalk, multiple-input switching and variability all need better solutions," he said. "We will require more dynamic, device-level analysis in timing tools."

Getting a grip on process

In a session on process variations, papers called for various kinds of statistical or probabilistic analysis. Lou Scheffer, fellow at Cadence Design Systems Inc., presented an approach that computes performance as a function of process variation. It considers interchip variations as well as intrachip deterministic and statistical variations.

A paper from Ghent University in Belgium outlined a probabilistic approach to clock cycle prediction that captures the impact of parallelism. It uses system-level interconnect prediction techniques, a topic of growing interest in the research community.

One of several papers from the University of Michigan, in cooperation with Motorola Inc., described a new method for path-based statistical timing analysis. Another outlined an approach for evaluating worst-case skew in light of power supply variations. A third paper described a statistical timing technique that uses bounds and selective enumeration in an attempt to reduce the exponential run-times that have plagued some statistical techniques.

"Probabilistic design is not new," noted IBM's Visweswariah. "Analog designers have been doing it for years. It now applies to digital, and statistical considerations must influence all stages of design. We need to encompass methodology, modeling, analysis, synthesis and test."

The annual Tau workshop focuses on timing issues in the specification and synthesis of digital systems. In addition to statistical timing analysis and process variations, the workshop considered crosstalk, optimization, and wireless interconnects.

- Richard Goering

EE Times



