EE Times-Asia

OLA speakers explain new library approach

Posted: 14 Feb 2002

Keywords: open library API, OLA, IBM, cell libraries

Tracing the Open Library API initiative back to "a heritage at IBM in which it's unacceptable to fail," IBM Microelectronics vice president Bruce Beers opened the first OLA Conference with an explanation of OLA's roots and its aspirations. His talk was followed by technical sessions and a luncheon keynote from Silicon Metrics founder and CTO John Croix.

In his talk, Beers explained how the key concepts for the OLA began. The fundamental concept, he said, was a shift from component and cell libraries that contained only static data for delay and power estimation to dynamic libraries that included executable algorithms as well as data. He cited both enormous changes in underlying silicon technology and increasing challenges in design closure that, in IBM's view, rendered traditional delay and power approaches insufficient. "To really get access to a new technology node, a tool chain must be able to resolve design issues early and must avoid major iterations," Beers said. "A key requirement for achieving this is a method of getting consistent delay and power information into different tools at different stages in the design flow."

Technical papers from IBM, LSI Logic, Philips Semiconductors and Cadence Design Systems then documented OLA at a deeper level. IBM researcher John Beatty described the fundamental partitioning imposed by OLA: "Once, every point tool did its own network calculations. The key to OLA is that the libraries contain the technology information, whatever is necessary to provide a sufficiently accurate answer to the application tools, and the applications contain the design information." Beatty discussed this division of labor in practice at IBM, and described what the company had learned about managing performance in an OLA tool chain.

Dan Moritz, formerly of LSI Logic, described how LSI was driven toward the OLA model by its struggles to correlate tool vendors' timing engines with its own golden timing analysis, and by the explosive growth in the size of SDF files. "Placing the delay modeling within the libraries eliminated the problem of correlation," he said. "In effect, we embedded our timing analysis in the third-party tools."

Timothy Ehrler of Philips Semiconductors and Mark Hahn of Cadence described similar processes of moving from text-based timing files to the open APIs, claiming reduced file sizes, reduced guard banding and eliminated iterations as possible benefits.

At lunch, Croix documented, based on Silicon Metrics' experience, that OLA-based tools can be either faster or slower than traditional tools in run-time. But the accuracy and consistency of results can be far superior, he said. That translates into reduced design iterations, and that reduces time to tapeout, a far more important metric than execution time.

He also emphasized the major shift in responsibility inherent in the OLA model: instead of tool vendors being responsible for modeling each new technology to estimate timing and power, that task now rests solely on the shoulders of the library vendors, who must embed the necessary algorithms in their libraries. This will significantly increase the need for computer science expertise among library vendors, he suggested. But the result of eliminating redundant parsing, error handling and data evaluation tasks from each tool involved in timing or power closure was well worth this major shift, he maintained.

- Ron Wilson

EE Times



