EE Times-Asia > EDA/IP

Rhines on EDA: End 'endless verification'

Posted: 26 Feb 2008

Keywords: Walden Rhines, EDA industry, transaction-level modeling

If it hopes to cut the cost of design verification, the EDA industry must embrace a combination of formal methods, transaction-level modeling (TLM) techniques and intelligent testbenches, according to Walden Rhines, chairman and CEO of Mentor Graphics Corp.

Rhines made the comments at the DVCon functional design and verification conference, where verification methods for low-power designs were another focus.

Speaking on the topic "Ending Endless Verification," Rhines compared the evolution of IC test with that of verification methods. The challenge, he said, is the same in both cases: the increased complexity of what must be tested vs. the efficiency and sophistication of the tools and methods available for the task.

Optimal design
As test methods have evolved from functional to built-in-test and then to methods for avoiding redundancies in test vectors, engineers have been successful in keeping pace with the exponential growth in design complexity. Moves to eliminate test-vector redundancies have resulted in improvements of two orders of magnitude between 2001 and 2007, Rhines said.

And the International Technology Roadmap for Semiconductors sees the opportunity for another order-of-magnitude improvement by 2013, potentially resulting in tests that are a thousand times more efficient in a span of just over 20 years.

Rhines also pointed out, however, that verification methods based solely on functional tests are compute-intensive, and that without a change in methods it will soon be impractical, if not impossible, to verify a design fully. To date, emulation technology, advancements in simulation technology and the adoption of formal verification tools have together provided only about an order-of-magnitude improvement in verification efficiency. It is thus necessary, he said, to adopt new verification methods.

New era for verification
Rhines said the industry must find a method to stop reverifying what has already been verified by using a combination of intelligent testbenches, formal methods and transaction-level modeling. Early results in the use of intelligent testbenches have shown improvements of around two orders of magnitude over traditional methods, and the two other methods promise similar results.

What is exciting is that the three methods are not mutually exclusive and in fact can, and should, be used together, Rhines said. A verification strategy that takes advantage of all three techniques results in a higher level of test coverage in a shorter period, making full coverage possible in a practical amount of time using economically feasible compute resources.

Although a few of the leading companies are using various combinations of all three techniques, EDA tools have not quite matured to the extent that they can form an integrated system that design verification engineers can acquire today. Progress toward that goal is being made, however. Earlier this week, Mentor Graphics introduced a new version of its Questa verification platform that will support all three methods.

While the integration of all three verification methods into a single, comprehensive package is still a work in progress, the three methods, or components, of Rhines' strategy continue to evolve.

Enabling technologies
Intelligent testbenches are the newest component of the technology described by Rhines, and development work on tools implementing the technology is still under way. The just-introduced version of Mentor's Questa verification platform includes the inFact module, which supports the creation and deployment of intelligent testbenches. The product allows blocks implemented at different levels of abstraction to be automatically integrated into one simulation unit. Engineers using the tool will no longer need to write model wrappers that equalize the abstraction levels of all the components in a design in order to verify it, saving time and eliminating a potential source of errors.

Cadence Design Systems and Synopsys Inc., like Mentor, have offered design and verification tools that support formal methods for a couple of years, and some smaller companies are also active in the segment. In an indication of the technology's maturity, a number of international standards covering formal methods have been developed and are supported in available tools.

TLM, the third prong of Rhines' verification cost-reduction strategy, is the focus of a modeling standard supported by the Open SystemC Initiative (OSCI), a consortium of vendors and users of SystemC tools. The consortium has just announced that the comment period for its TLM-2 draft standard has closed and that the completed standard is expected by June. All vendors of SystemC tools support TLM, although the technique is specific to system-level design, so TLM models cannot be directly synthesized into hardware circuits.

Low-power support rising
Verification methods targeted for low-power designs were another focus at DVCon. The trend will be underscored by an announcement this week from Synopsys that its entire design flow, from simulation and synthesis through placement, routing and circuit verification, now supports the Unified Power Format (UPF), a low-power design standard developed in conjunction with Mentor and Magma Design Automation.

At DVCon, low-power methods were the subject of a tutorial sponsored by Cadence that attracted more than 100 logic designers interested in using assertions and SystemVerilog in designing and verifying low-power circuits. Technical papers presented by engineers from STMicroelectronics, Mentor and Cadence discussed issues related to low-power design.

In a paper titled "Upping Verification Productivity for Low Power Designs," engineers from ST and Mentor described how the UPF format was used in a test design to verify the format's utility for specifying design intent for power distribution and management in a system-on-chip. "Assertion-Based Verification for Power-Cycled System-on-chip (PC-SoC) Verification," authored by engineers from Cadence, explored new power-cycling design methods to lower the overall power signature of a PC system-on-chip.
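To give a flavor of the "design intent" UPF captures, here is a small UPF 1.0-style fragment. All instance, net and signal names are invented for illustration, and real flows require additional commands; it shows only the shape of the format: a switchable power domain plus the isolation behavior its outputs need when the domain powers down.

```tcl
# Hypothetical UPF fragment (names are invented for illustration):
# declare a switchable power domain for the core logic block.
create_power_domain PD_core -elements {u_core}

create_supply_net VDD_sw -domain PD_core
create_supply_net VSS    -domain PD_core
set_domain_supply_net PD_core \
    -primary_power_net VDD_sw -primary_ground_net VSS

# Clamp the domain's outputs to 0 while it is powered down,
# driven from an always-on supply and an isolation-enable signal.
set_isolation iso_core -domain PD_core \
    -isolation_power_net VDD_aon -clamp_value 0 -applies_to outputs
set_isolation_control iso_core -domain PD_core \
    -isolation_signal iso_en -isolation_sense high -location parent
```

Because this intent lives in a side file rather than the RTL, the same commands can drive simulation, synthesis and physical implementation, which is the point of the unified flow Synopsys announced.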

The announcement from Synopsys this week directly addresses some of the issues described in the paper by ST and Mentor. With the sponsorship of Accellera, the UPF standard was transferred to the IEEE for official standardization. The IEEE-1801 ("Standard for the Design & Verification of Low Power ICs") Working Group is making strides toward that goal.

Cadence has its own format: the Common Power Format (CPF), developed under the auspices of the Si2 consortium. But there are indications that the industry is working toward merging UPF and CPF under the IEEE-1801 Working Group.

Gabe Moretti
EE Times



