
Take distinctive approach to IC verification

Posted: 16 Mar 2007

Keywords: co-modeling verification, C/C++ testbench, Hooman Moshar, Broadcom, functional verification

Moshar: I believe that co-modeling will eventually replace in-circuit emulation.

Functional verification is a big concern for Hooman Moshar, senior director of engineering for the broadband division at Broadcom Corp. His division, which employs some 1,000 engineers, has developed a "co-modeling" approach to verification centered on a C/C++ testbench. Broadcom worked with Mentor Graphics Corp. to develop an accelerator that supports the methodology. It is also a key backer of Accellera's Standard Co-Emulation Modeling Interface (SCE-MI) standard. EE Times spoke with Moshar about Broadcom's distinctive approach to IC verification.

EE Times: What kinds of chips is your business unit designing?
Hooman Moshar: This unit deals with cable and satellite STB receivers, cable and DSL modems, DTV and consumer products like HDTV. Our chips are highly integrated, and there's a very large analog content. We believe that we were one of the first companies to go to 65nm. We are routinely designing chips of 10 million to 100 million gates.

What are your greatest challenges with functional verification?
All of our products have an enormous amount of man-machine analog interface content, including A/V, voice and telephony. We have high-level signal-processing and communications algorithms. An enormous amount of embedded processing takes place in these chips, so layers of software need to be managed.

Because the entire system is on a chip, there is no target environment for it. That lack of a target, of a representation you can build a testbench for, makes it very difficult. It takes away some of the advantages that an in-circuit emulation environment can provide. Also, because the life cycles of our products are short and the intellectual property (IP) that's integrated into them changes, we can never enjoy the concept of golden IP. Everything has to be reverified as we go.

How would you describe your verification setup?
At a high level, it is composed of a C/C++ object-oriented testbench. It's entirely transaction-based, and it can connect seamlessly to either a hardware representation of the device-under-test (DUT) or a software representation of the DUT. One key attribute is that the system testbench is untimed, and the way it interacts with the simulator or the hardware accelerator is untimed. But it is 100 percent in control. This is the infrastructure we put in place more than a decade ago.

We can think of the high-level testbench environment as a superset of what Cadence Design Systems' Specman does today. Specman allows you to create various traffic scenarios at the clock or transaction level to cover corner cases. Our environment does basically the same thing, but at a much higher level of traffic. A high-level simulation engine running C/C++ takes care of all the scheduling, traffic generation, monitoring, determination of the time, extraction of the data and sorting of errors; it is a very elaborate environment.
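As a rough illustration of the structure Moshar describes (not Broadcom's actual code), an untimed, transaction-level testbench might be organized along these lines. The names here (Transaction, DutChannel, SoftwareModelChannel) are assumptions for the sketch; the point is that the testbench only ever sees an abstract channel, so a simulator-backed or accelerator-backed implementation can be swapped in behind it.

// Minimal sketch of an untimed, transaction-based testbench abstraction.
// All class and method names are illustrative assumptions.
#include <cstdint>
#include <iostream>
#include <memory>
#include <vector>

struct Transaction {                 // untimed unit of traffic
    uint32_t address;
    std::vector<uint8_t> payload;
};

// Abstract view of the DUT: the testbench never knows whether the other
// side is an RTL simulator or a hardware accelerator.
class DutChannel {
public:
    virtual ~DutChannel() = default;
    virtual void send(const Transaction& t) = 0;      // drive stimulus
    virtual Transaction receive() = 0;                // collect response
};

// One possible back end: a purely software model used for early bring-up.
class SoftwareModelChannel : public DutChannel {
public:
    void send(const Transaction& t) override { last_ = t; }
    Transaction receive() override { return last_; }  // loopback stand-in
private:
    Transaction last_;
};

int main() {
    std::unique_ptr<DutChannel> dut = std::make_unique<SoftwareModelChannel>();
    Transaction stim{0x1000, {0xDE, 0xAD}};
    dut->send(stim);                                   // testbench stays in control
    Transaction resp = dut->receive();
    std::cout << "response bytes: " << resp.payload.size() << "\n";
}

In practice, an accelerator-backed channel would map send and receive onto transactors in the hardware box; the software stand-in above simply loops traffic back so the sketch is self-contained.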

You've called this methodology co-modeling. How does it compare with co-simulation?
Co-simulation means the software simulator is running on the workstation, and through a programming language interface (PLI), you're talking to a device or bus-functional model that is working with the simulator. Co-modeling is the other way around. A C/C++ testbench is in control, and it communicates to a simulator or a hardware box via a well-defined set of APIs with transactors.
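The inversion of control Moshar describes can be sketched as follows, using hypothetical names rather than any real PLI or SCE-MI API: in co-simulation the simulator owns time and calls back into C, while in co-modeling the C/C++ testbench owns the flow and calls into a transactor proxy whenever it wants to exchange messages with the DUT.

// Hedged sketch of the two control models; bfm_callback, TransactorProxy,
// put and get are invented names for illustration only.
#include <cstdint>
#include <deque>
#include <iostream>
#include <vector>

// Co-simulation style: the HDL simulator owns time and invokes this callback
// through a PLI/VPI registration (the registration itself is not shown).
extern "C" int bfm_callback(void* /*user_data*/) {
    // The C side only reacts; the simulator decides when this runs.
    return 0;
}

// Co-modeling style: the C/C++ testbench calls into a transactor proxy
// whenever it chooses to exchange messages with the DUT.
class TransactorProxy {
public:
    void put(const std::vector<uint8_t>& msg) { fifo_.push_back(msg); }
    std::vector<uint8_t> get() {
        auto m = fifo_.front();      // stand-in: loop the message back
        fifo_.pop_front();
        return m;
    }
private:
    std::deque<std::vector<uint8_t>> fifo_;
};

int main() {
    TransactorProxy xtor;
    xtor.put({0x01, 0x02, 0x03});    // the testbench decides when traffic flows
    std::cout << "got " << xtor.get().size() << " bytes back\n";
}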

You've recently made a major investment in hardware acceleration. Why?
We routinely calculate what it takes to use a commercial simulator. If you use all the server platforms available at Broadcom, it would take hundreds of years to verify every single one of our chips. That is not manageable. We need to go through an enormous number of test cases to verify all the traffic. That's where the acceleration comes in.

How is hardware-based verification changing?
A shift is definitely happening. In the past, in-circuit emulation has been the primary usage of emulators. That lends itself to certain products where platforms exist, where you have the emulator plugged into the target and there's a well-defined set of interfaces.

Acceleration in the context of co-modeling is really new, and it lends itself to a better way of doing things. When you do not have a target, you have a sheer number of cycles to run and possibly every interface known to mankind to verify. The difficulty was that the EDA folks were not really aware of the best way of approaching it.

What did your work with Mentor Graphics involve?
In the mid-1990s, we started to work with Ikos, which was acquired by Mentor. We have since been working with them to come up with the next generation of hardware. We defined many of the requirements for the latest Mentor Graphics Veloce machines. We have the new Veloce machines in-house, and we are bringing them online.

We believe that the co-modeling methodology is going to augment the chip-level simulation environment in such a way that 20 or 30 percent or more of software simulators could be replaced. All of our SoC-level verification is done without any simulators. It's a C testbench talking to hardware.

Acceleration doesn't help you with analog circuitry, though. How do you verify the analog content on your SoCs?
Analog modules usually have a Matlab or bit-accurate model. That model is used to design both the module itself and the first level of the digital block that talks to it. But when it comes to the chip level, that module testbench is of no use to me. At that level, an abstract C model is designed to create massive amounts of traffic and corner cases in the system.
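Purely as a hedged sketch, and with an invented ADC example rather than Broadcom's actual model, such an abstract C model might be a small behavioral class that quantizes stimulus and can churn out bursts of traffic for chip-level corner cases.

// Illustrative abstract model of an analog block (a 10-bit ADC is assumed
// here); all names are hypothetical.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <iostream>
#include <vector>

class AbstractAdc {
public:
    explicit AbstractAdc(int bits) : levels_(1 << bits) {}
    // Quantize a normalized analog value in [-1, 1] to a digital code.
    uint16_t sample(double v) const {
        double clipped = std::max(-1.0, std::min(1.0, v));
        return static_cast<uint16_t>((clipped + 1.0) * 0.5 * (levels_ - 1));
    }
private:
    int levels_;
};

int main() {
    AbstractAdc adc(10);                       // behavioral stand-in for the analog block
    std::vector<uint16_t> burst;
    for (int i = 0; i < 8; ++i)                // generate a burst of traffic
        burst.push_back(adc.sample(std::sin(0.25 * i)));
    for (auto code : burst) std::cout << code << " ";
    std::cout << "\n";
}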

Is embedded-software verification a challenge?
That's a very good point. There are 1,000 engineers in this business unit. Of those 1,000 engineers, 700 are system and software folks. What we develop in terms of hardware platforms is not only used for verification of hardware but can also be used by the software guys to develop drivers early on.

Are you using formal verification, and does it reduce the need for simulation and acceleration?
We are using formal verification, but I don't believe it is reducing the scope of the work we need to do. It will help you make sure that your IP is golden, but formal verification really does not apply at the SoC level. You have to go through all the traffic scenarios you need to cover.

What's your involvement in the Accellera SCE-MI effort to provide a standard modeling interface for emulators and accelerators?
We are working with the Accellera committee on SCE-MI 2.0. This interface alleviates many of the ease-of-use problems the design community had. We can hide the infrastructure under the hood to make it look more like an RTL simulation environment. That's the next generation.

Do you see other companies adopting co-modeling?
This is a standard methodology, and other companies have started to use it. I believe that co-modeling will eventually replace in-circuit emulation.

- Richard Goering
EE Times



