Wrestling functional verification

Posted: 16 Dec 2005

Keywords: Richard Goering, functional verification, logical design flow, acceleration, emulation

If functional verification already consumes most of the IC logical design flow, as some studies suggest, what's going to happen as chip complexity reaches 10 million or 100 million gates?

The answer is sheer chaos, unless the functional-verification process can be made more manageable. Coverage-driven verification can help today, but the long-range answer lies in rethinking both verification and design, some experts say.

As chip complexity grows, there's an exponential increase in the number of things that could potentially go wrong and thus need verifying. Can that be done without hiring armies of verification engineers to churn out directed tests?

Formal verification can provide targeted, exhaustive tests, but it doesn't cover everything. Acceleration and emulation speed up the process, but you've still got to generate and monitor the tests. Faced with a multitude of design styles, tools and verification techniques, one point is clear: Designers must have a plan.

A verification plan starts by identifying what portions of the design are going to be tested. It identifies input scenarios to apply to the design under test and calls out tough corner cases that might not be found by simulation. It also does, or should, set forth a plan to measure progress by applying coverage-driven verification.
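
For illustration only, a fragment of such a plan for a simple FIFO block might read as follows; every feature and target here is hypothetical:

    Feature:      FIFO flow control
    Stimulus:     back-to-back writes with stalled reads; simultaneous read/write
    Corner cases: write when full, read when empty, pointer wrap-around
    Coverage:     fill levels 0 and DEPTH both reached; no overflow or underflow
    Done when:    all listed coverage points are hit and all corner-case tests pass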

Engineers today are most familiar with code coverage, which checks to see if there are unexecuted areas of code. An emerging and more difficult technique is functional coverage, which can be used to check various corner cases, ensuring, for instance, that a FIFO actually empties and fills.
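
In SystemVerilog, that FIFO check can be written as a covergroup. The following is a minimal sketch, not a production checker; the clk and count signals and the DEPTH parameter are assumed names for illustration:

    // Minimal functional-coverage sketch: did the FIFO actually
    // empty and fill? 'clk', 'count' and DEPTH are assumed names.
    module fifo_cov #(parameter int DEPTH = 16)
                    (input logic clk,
                     input logic [7:0] count);  // occupancy: 0..DEPTH
      covergroup levels_cg @(posedge clk);
        coverpoint count {
          bins empty = {0};      // FIFO drained completely
          bins full  = {DEPTH};  // FIFO filled completely
        }
      endgroup
      levels_cg cov = new();
    endmodule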

With functional coverage to provide feedback, random test generation becomes more practical and tougher bugs can be found. Finally, the growth of assertion-based verification has given rise to assertion coverage, which checks to see if assertions actually fired.

Higher abstraction
But that's just a start. To really manage the verification process, some observers say, it will be necessary to move to higher levels of abstraction and start with a system-level rather than a block-level view. Others say the design process itself will have to be improved so there are fewer bugs in the first place.

As described in Janick Bergeron's book Writing Testbenches: Functional Verification of HDL Models, a verification plan is a written specification that outlines the verification effort. It partitions the design, defines levels of granularity for verification, sets forth strategies and tools, and answers what may be the toughest question of all: How do you know when verification is done?

Attacking 'risky' elements
The verification plan "should include everything you suspect is risky about your design," said Bergeron, scientist at Synopsys Inc. and moderator of the online Verification Guild discussion forum. This makes it possible to identify corner cases and the coverage points that verify them. Right now, Bergeron noted, creating a verification plan is "pretty much a manual process."

Such plans are typically text documents, said Jay Lawrence, lead architect at Cadence Design Systems Inc. "There's very little automation of that process. I think there's a great opportunity for tools to help," he said.

Verisity Design Inc., which Cadence is in the process of acquiring, has taken a step in that direction with its vManager product. Ziv Binyamini, senior VP of verification process automation at Verisity, noted that vManager supports an executable verification plan format. This allows tools to read and write to the machine-readable plan.

"VManager collects and controls the runs of many different verification tasks, including simulation, formal verification and acceleration, and then collects coverage data from different sources, including functional, code and assertion," said Binyamini.

There is little point in having a verification plan without having a coverage-driven methodology to monitor it, experts say.

Binyamini put it bluntly: "Doing verification without coverage is like driving a car with your eyes blind. You cannot go beyond a certain point without having a way to know where you are going."

Coverage-driven verification, said Bergeron, is the answer to the primary challenge he sees facing verification teams: ensuring the quality of the design. "Customers run thousands of tests, but the quality is not there," he said. "They're unable to exercise corner cases. The only way around that is to close the loop with functional coverage to verify that you've exercised every feature and corner case you've set out to do."

Most people today are using code-coverage tools, Bergeron said. These tools instrument the code with checkpoints to determine if constructs were executed. They commonly offer statement, path, branch and expression coverage.
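
The distinctions matter even in fragments as small as the following generic sketch, with assumed signal names:

    module cov_kinds (input logic clk, valid, stall,
                      input logic [7:0] d,
                      output logic [7:0] q);
      // Statement coverage: was the assignment ever executed?
      // Branch coverage: was the 'if' both taken and not taken?
      // Expression coverage: did 'valid' and '!stall' each
      // independently decide the branch at least once?
      always_ff @(posedge clk)
        if (valid && !stall)
          q <= d;
    endmodule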

But code coverage can't identify missing code or functionality. That's why some verification teams are turning to functional coverage, which involves the insertion of "coverage points" to ensure that various things do or don't happen: that a FIFO empties and fills, that a particular set of instructions goes through a pipeline, that packets are transmitted across all channels in the design.
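
A coverage point for the packet example might look like this in SystemVerilog; pkt_valid and the four-channel chan field are assumptions for illustration:

    module chan_cov (input logic clk, pkt_valid,
                     input logic [1:0] chan);
      covergroup chan_cg @(posedge clk);
        // One bin per channel: all four must be hit before this
        // coverage point reports 100 percent.
        coverpoint chan iff (pkt_valid) {
          bins channel[] = {[0:3]};
        }
      endgroup
      chan_cg cov = new();
    endmodule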

Without functional coverage, said Richard Ho, principal engineer at Mentor Graphics Corp., you're pretty much stuck with directed tests. And those can only look for what you think you're going to find. Functional coverage enables a "constrained random-driven" simulation methodology in which random tests can be applied and then interactively modified, given feedback from coverage metrics.
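
A minimal sketch of that flow, built around a hypothetical packet class and driver task:

    module tb;
      class Packet;
        rand bit [1:0]  chan;  // target channel
        rand bit [10:0] len;   // payload length in bytes
        // Constraints keep random stimulus legal; they are then
        // tightened or redirected as coverage feedback comes in.
        constraint legal_len { len inside {[1:1500]}; }
      endclass

      initial begin
        Packet p = new();
        repeat (1000) begin
          if (!p.randomize()) $error("randomize failed");
          // drive_packet(p);  // hypothetical driver task
        end
      end
    endmodule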

Yet both code and functional coverage have flaws, said verification consultant Brian Bailey. "Code coverage is telling you if you can control your design, but it doesn't tell you you've got that line of code for the right reason," he said. "Functional coverage is a measure of observability. It tells you that you saw this event, but doesn't tell you it happens for the right reason."

Difficult metric
The problem with functional coverage is that it's a manual process. Someone must insert the coverage points using constructs found in languages such as SystemVerilog, Property Specification Language (PSL) or "e" from Verisity. You then need a simulator that can interpret those constructs.

"It's a tremendous job," Bergeron of Synopsys said. "You can simplify it using templates, but the implementation is difficult."

There's also simulation overhead, typically in the 10 percent to 30 percent range, said Ho of Mentor Graphics. "But it's a question of taking the overhead vs. not knowing what you're doing," he said. "In the macro view, you're saving time."

Functional coverage is usually the job of verification engineers, Ho said, but they often don't have the expertise to insert the coverage points accurately. Designers may not be very motivated. In a paper at the recent DesignCon conference, Ho suggested an approach that relied on the use of assertions, which are typically inserted by designers, to identify functional coverage points in a design.

"Assertions and coverage are two sides of the same coin," Ho said. While assertions check for bad behaviors, Ho said, functional coverage points look for desired behaviors. The paper shows how designers can insert coverage points automatically using an assertion library, derive transactional coverage points and use properties in PSL and SystemVerilog for both assertions and coverage.

Beyond using assertions for coverage is the notion of "assertion coverage," which appears to have differing definitions depending on whom you talk to. Ideally, said consultant Bailey, it would tell you that you've successfully verified something from a starting point to an ending point. In practice, he said, tools pretty much just tell you whether an assertion triggered.

As Ho pointed out, an assertion that never fires is "vacuously" true. What's needed is a way to ensure the assertion actually got into a state where it was checking something relevant. That means verifying that a complex set of preconditions actually happened.
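
A sketch of the problem and one common remedy, with assumed req/gnt handshake signals and an assumed three-cycle grant window:

    module vacuity (input logic clk, req, gnt);
      // Vacuously true on every cycle where 'req' is low: the
      // implication never gets past its antecedent.
      req_granted: assert property (@(posedge clk) req |-> ##[1:3] gnt);
      // Cover the precondition to confirm the checker actually
      // reached a state where it was checking something.
      req_seen: cover property (@(posedge clk) req);
    endmodule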

Cadence's Incisive verification suite, said Lawrence, can report the number of times an assertion was one state away from failing. It can also determine whether an assertion triggered and whether preconditions were hit. But assertion coverage typically works only for "positive" assertions, he said, not those that state a given behavior should never happen.

Going formal
Harry Foster, chief methodologist at formal-verification startup Jasper Design Automation, sees it this way: "An assertion checks the 'what,' while coverage checks the 'how.'" While assertions specify what must be checked, coverage points describe what input stimulus is needed to properly activate key functionality.

Foster, one of the pioneers of assertion-based verification, says he's not sure what "assertion coverage" means. But one possible meaning, he said, is whether there are enough assertions to check a design. This is sometimes called "assertion density."

Formal verification, he said, takes away the need for functional coverage because there's no input stimulus. It's built on a mathematical model that tries all possible combinations. "What you do need is coverage for writing assertions," he said.
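
In a formal tool, the stimulus is replaced by assumptions that define legal inputs, and the assertions become proof obligations. A minimal sketch, with assumed signal names and timing:

    module formal_sketch (input logic clk, rst_n, start, done);
      // Assumption: constrain the input space the tool explores
      // (no 'start' pulses while in reset).
      assume property (@(posedge clk) !rst_n |-> !start);
      // Obligation: the tool tries to prove this for every
      // reachable state rather than for sampled test vectors.
      assert property (@(posedge clk) start |-> ##[1:8] done);
    endmodule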

More abstract
In the long run, some observers believe, raising the abstraction level is the true key to facilitating verification management. Bailey, an advocate of ESL verification, argues that the industry has got it all backward by starting with block-based verification. "It's a chicken-before-the-egg situation, where we're trying to verify something that we don't know we want," he said.

Gary Smith, chief EDA analyst at Dataquest, has long advocated the "intelligent testbench," a kind of supertool that would automatically partition a design and invoke the verification tool best suited for a particular unit. He thinks such a beast will emerge in the ESL verification space, although the architecture of the tool is not yet defined.

But the real solution may involve the design process rather than verification. At several DesignCon panels, participants and audience members said it's time to build better designs in the first place.

"We focus on finding bugs vs. doing it correctly to begin with," said Jasper Design's Foster. "We should ask how many bugs we can prevent during the design process, not how many were found in verification."

There are differing views on whether, and how, the design process can be improved so as to diminish the need for verification. Foster believes that formal methods hold the key to a "prove as you design" methodology. If he's right, the ultimate solution to the verification management problem may be something very different from a better verification tool.

- Richard Goering
EE Times



