EE Times-Asia > EDA/IP

What makes hardware emulation so compelling

Posted: 23 Nov 2015

Keywords: hardware emulation, system-on-chip (SoC), hardware description language (HDL)

Allowing a bit of slack around the actual starting point, this year marks the 30th anniversary of hardware emulation. Points of reference are the founding of Zycad (1981), IKOS Systems (1984), and Quickturn (1987): three legendary firms that pioneered hardware-based verification solutions and became commercially successful enterprises.

Hardware emulation has become the centerpiece of the verification toolbox. In fact, it is offered by all three of today's main EDA vendors: Cadence Design Systems, Mentor Graphics, and Synopsys.

It was not always so. For well over 20 years, emulation was considered an expensive aggravation to be avoided like the plague. All that has changed in the past 10 years, driven by two main factors:

1. The overwhelming presence of embedded software in the majority of modern system-on-chip (SoC) designs.

2. Improved usability and expanded usage modes of the hardware emulators themselves.

In today's semiconductor design community, no other verification engine offers the versatility of hardware emulation. Hardware description language (HDL) simulators and formal verification tools, effective as they are at verifying hardware designs, run out of steam when the design-under-test (DUT) reaches a few hundred million gates. Neither can handle embedded software. Electronic system level (ESL) simulators, successful for early software validation, lack the accuracy required to verify hardware designs. FPGA prototypes, another popular solution for system validation and embedded software validation, have poor hardware design debugging capabilities.

We could paraphrase the famous retail sizing approach and say that the hardware emulator has a one-tool-fits-all methodology.

How is this possible, and what happened to this verification technology that has made it so compelling? Let's make a quick assessment of the evolution of hardware emulators over the past two decades, comparing where things stood in the 90s to where they are today. My arbitrary, but comprehensive, criteria include: Technological Foundation, Deployment Modes, and Verification Objectives.

Technological foundation
Circa 1990
Initially conceived using the field-programmable gate arrays (FPGAs) of the day, the early hardware emulators were constructed from large arrays of such devices. They came in huge chassis, were heavy and power hungry, and were plagued by rather poor reliability, with a mean time between failures (MTBF) of less than one day.

Although they could handle the largest designs of the time (the driving factor for devising them), getting the DUT ready for emulation could take several months. Also, due to the long place-and-route times for the FPGAs, compilation of the DUT stretched into days. Hence the popular refrain of the era: "time to emulation."

Targeting two-state functional verification with no timing behaviour, the original emulators executed between 100,000 and 1,000,000 times (that is, five to six orders of magnitude) faster than HDL simulators.
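
To put those orders of magnitude in perspective, a back-of-envelope calculation helps. The throughput figures below are illustrative assumptions, not vendor measurements: an HDL simulator on a large design might sustain on the order of 10 design cycles per second, while a 100,000x-faster emulator would sustain about 1 MHz.

```python
# Back-of-envelope: wall-clock time for a long run in simulation vs.
# emulation. SIM_HZ and EMU_HZ are assumed throughputs for illustration.

SIM_HZ = 10          # assumed simulator throughput (design cycles/second)
EMU_HZ = 1_000_000   # assumed emulator throughput: 100,000x faster

def wall_clock_days(cycles, hz):
    """Days of wall-clock time to execute `cycles` design cycles at `hz`."""
    return cycles / hz / 86_400  # 86,400 seconds per day

# Booting an OS can take on the order of a billion design cycles.
cycles = 1_000_000_000
print(f"Simulator: {wall_clock_days(cycles, SIM_HZ):,.0f} days")
print(f"Emulator:  {wall_clock_days(cycles, EMU_HZ) * 24:.1f} hours")
```

Under these assumed numbers, a billion-cycle workload that would occupy a simulator for over three years completes on an emulator in well under an hour, which is why emulation became the only practical engine for running embedded software against real hardware.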

Due to the limited internal visibility of the FPGAs, design debug was unfriendly and cumbersome, and, at about $5 per gate in today's inflation-adjusted dollars, these were the most expensive verification tools of the time.

Circa 2015
Today, two competing schools of thought drive emulator development. One, adopted by two of the three vendors, uses custom devices designed specifically for emulation; although the resulting emulators are rather different, they share some characteristics. The alternative, embraced by the third vendor, uses commercial FPGAs in vast arrays.

While physical dimensions, weight, and power consumption may still appear large when normalized per gate capacity, their improvement over earlier generations is vast. The MTBF, for example, has been improved by two orders of magnitude.

The DUT capacity has grown by a factor of 1,000, now reaching a few hundred million gates per single box and a couple of billion gates in multi-box configurations. The setup time consumes a few days, while the compilation time, at least for the custom-device-based emulators, has been reduced to a few hours or less.

Design visibility in modern custom-device-based emulators achieves 100% of all internal gates/nets. This is also true for Xilinx Virtex-based emulators, but it comes with a steep drop in execution speed.

Perhaps most important, the cost of these systems has plunged by two orders of magnitude, reaching down to only a few cents per gate.
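
A quick sanity check on that "two orders of magnitude" claim, using the article's figures (~$5 per gate then, a few cents per gate now). The design sizes are illustrative assumptions chosen to match the capacities mentioned above.

```python
# Cost-per-gate comparison, 1990s vs. 2015, per the article's figures.
# The exact "few cents" value and design sizes are assumptions.

COST_1990 = 5.00   # USD per gate, inflation-adjusted (from the article)
COST_2015 = 0.03   # USD per gate, "a few cents" (assumed value)

gates_1990 = 1_000_000      # assumed ~1M-gate design of the early era
gates_2015 = 100_000_000    # a few hundred million gates per box today

print(f"1990s: ${COST_1990 * gates_1990:,.0f} for {gates_1990:,} gates")
print(f"2015:  ${COST_2015 * gates_2015:,.0f} for {gates_2015:,} gates")
print(f"Per-gate cost ratio: {COST_1990 / COST_2015:.0f}x cheaper")
```

At these assumed values the per-gate cost drops by a factor of roughly 170, consistent with the article's two-orders-of-magnitude characterization even as total system capacity grew a thousandfold.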

Deployment modes
Circa 1990
When first introduced, the hardware emulator was intended to be deployed in only one mode, that of in-circuit-emulation (ICE). In this mode, the DUT is mapped inside the emulator and wired to a socket in the physical target system where the manufactured chip will ultimately reside.

