Braving software-to-silicon verification challenges at 45nm

Posted: 05 Oct 2009

Keywords: software to silicon, verification challenges, system design tools

Chip and system developers once considered verification a secondary activity that followed the main challenge of design, with the "Designer" playing the central role in a design's success. That notion has been firmly turned on its head: today, verification is the biggest component of chip hardware development budgets, schedules, staffing and risk.

With verification complexity growing faster than Moore's Law, compounded by increasing mixed-signal content and advanced low-power design techniques, the importance of verification in the chip hardware development process is certain to increase. In fact, venture capitalists have started focusing on verification costs as a factor in determining which chip startups to fund.

Similarly, embedded software used to be a minor or non-existent deliverable for typical semiconductor devices. At 45nm and beyond, software accounts for a full 60 percent of total chip-development cost, with major implications for how chips and systems are verified. It is no surprise, then, that the International Technology Roadmap for Semiconductors (ITRS) predicts that, "Without major breakthroughs, verification will be a non-scalable, show-stopping barrier to further progress in the semiconductor industry."

The growing complexity of verification and the expanding role of embedded software in the chip-development process present many challenges to tool vendors and system designers alike. The four areas that vendors and designers must focus on to address key software-to-silicon verification challenges are:

- Verification efficiency;
- Verification performance;
- Early software development;
- Cross-domain verification.

Figure: Embedded software and verification dominate chip design cost at 45nm and beyond.

Efficiency, performance
Chips have become almost unimaginably complex, with dozens of interacting embedded processor cores, accelerators, complex on-chip buses and high-speed external interfaces. This underlying complexity poses a major problem for system developers: how to sufficiently exercise an effectively infinite state space within a reasonable time and find the bugs that would cause costly respins or potentially kill their chips. The reasons for the exploding verification state space are well known: these designs make pervasive use of advanced power management techniques, integrate analog/mixed-signal components more tightly with the digital functions and carry large amounts of software.

Another associated development is that the combination of huge design costs, larger available silicon real estate and device functionality convergence is pushing more designs toward a flexible SoC-style architecture, even in "traditionally non-SoC" design teams. This creates many problems, both obvious and subtle, for successfully verifying these monster chips and systems.

The huge state space of an SoC means that verification can never really be complete; it just ends. Verification requirements for high coverage (exposing intricate corner cases and cross-domain functionality with full debug visibility) ensure that simulation will continue as a workhorse for block, cluster and basic chip-level verification for years to come. New technologies that increase the efficiency of the verification process will become a key ingredient in successful verification efforts. This will force innovation and investment in many areas, including better techniques and tools to simplify the design process and to prevent bugs from getting into designs in the first place; standard methods propagating best practices and enabling reuse; innovative bug-finding technologies; more automated flows for block-level verification and coverage convergence; enhanced visualization and failure analysis capabilities; integrated low-power and mixed-signal verification flows; and domain-specific verification automation.

Another often-overlooked area where innovative technology and automation should reduce the workload of the engineer is the debug, diagnosis and triage effort in front-end verification, which typically consumes over 30 percent of engineering time and resources.

The performance and capacity required to successfully verify chips are now straining verification tools and IT infrastructure to the limit, with an impact on both the cost and the productivity of the verification process. Until recently, tool performance benefited from the seemingly inexorable march of single-threaded microprocessor performance. However, single-threaded improvements are now a thing of the past, and multicore (and many-core) throughput-based computing is the "new normal"; this is a major paradigm shift for verification tool providers and system developers. For one, not all verification algorithms lend themselves well to a multicore architecture, so a huge amount of algorithmic innovation is required to harness the power of the underlying hardware. Another challenge is the relative novelty of the tools, languages and frameworks for software development on multicore architectures. A third problem is the large amount of legacy code in current widely deployed EDA tools. It will be a major challenge for vendors to retool their software, and for system designers to architect their verification processes, to take full advantage of multicore architectures for maximum throughput.
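To make the shift concrete, the sketch below (plain C++; the Job type and run_job() body are placeholders invented for illustration) shows the easy half of the problem: farming independent regression runs across however many hardware threads the host offers. The hard half, parallelizing the simulation engines themselves, is where the algorithmic innovation described above is needed.

// Hypothetical sketch: distributing independent regression jobs across cores.
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

struct Job { int id; };                       // placeholder for one regression test

static void run_job(const Job& j) {           // stands in for launching one simulation
    std::printf("running test %d\n", j.id);
}

int main() {
    std::vector<Job> queue;
    for (int i = 0; i < 64; ++i) queue.push_back({i});

    std::atomic<size_t> next{0};              // shared work-stealing index
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t)
        workers.emplace_back([&] {            // each worker pulls jobs until the queue is empty
            for (size_t i = next++; i < queue.size(); i = next++)
                run_job(queue[i]);
        });
    for (auto& w : workers) w.join();
    return 0;
}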

As more chip features are implemented in software, it will become increasingly important to enable large software teams to start development and validation earlier in the chip development process, prior to silicon availability. The key requirement for software developers is development platform performance (tens of megahertz is needed to run the millions of cycles required to boot a mobile OS in a few seconds or to validate a USB interface against a real device), combined with model accuracy and fidelity. Today's SoC designs not only have a lot of internal functionality but are also equipped with a wide variety of external interfaces (video, audio, PCIe, USB etc.); verifying the design with real interfaces is also essential for full chip validation. However, between software development, architecture exploration and system validation, designers have many conflicting model requirements, such as time of availability, accuracy, development cost and debug insight. It turns out that no single representation satisfies all of these requirements.

Thus, the verification team is often tasked with developing a system prototype as a platform for early software development and system validation, combining the advantages of software- and hardware-based execution. System prototypes enable software teams to start coding and testing up to 12 months prior to silicon availability against an abstract model, and then to validate their code against the actual RTL and real-world interfaces prior to tape-out. System prototypes can be built using virtual platforms, consisting of transaction-level models and intellectual property (IP) of system components based on the OSCI SystemC TLM2.0 standard; an FPGA-based rapid prototype built using the actual design RTL and real I/O interfaces; or a hybrid of the two.

Figure: System prototypes combine virtual platforms, IP and rapid prototypes for early embedded software development and system validation.
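To make the virtual-platform half of this more concrete, below is a minimal sketch of the kind of building block such a platform is assembled from: a test initiator (standing in for an instruction-set simulator running embedded code) driving a simple memory model over the standard SystemC TLM-2.0 blocking-transport interface. The module names, address, data value and 10ns latency are illustrative assumptions only, not part of any particular product or flow.

// Minimal SystemC TLM-2.0 sketch: initiator and memory target over b_transport.
// Names and numbers are invented for illustration.
#include <cstring>
#include <iostream>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/simple_target_socket.h>

using namespace sc_core;

// Target: a 1Kbyte memory model responding to blocking transport calls.
struct SimpleMemory : sc_module {
    tlm_utils::simple_target_socket<SimpleMemory> socket;
    unsigned char storage[1024];

    SC_CTOR(SimpleMemory) : socket("socket") {
        std::memset(storage, 0, sizeof(storage));
        socket.register_b_transport(this, &SimpleMemory::b_transport);
    }

    void b_transport(tlm::tlm_generic_payload& trans, sc_time& delay) {
        sc_dt::uint64 addr = trans.get_address();
        unsigned int  len  = trans.get_data_length();
        if (addr + len > sizeof(storage)) {
            trans.set_response_status(tlm::TLM_ADDRESS_ERROR_RESPONSE);
            return;
        }
        if (trans.is_read())
            std::memcpy(trans.get_data_ptr(), &storage[addr], len);
        else if (trans.is_write())
            std::memcpy(&storage[addr], trans.get_data_ptr(), len);
        delay += sc_time(10, SC_NS);          // notional memory latency
        trans.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};

// Initiator: stands in for an ISS or test program issuing reads and writes.
struct TestInitiator : sc_module {
    tlm_utils::simple_initiator_socket<TestInitiator> socket;

    SC_CTOR(TestInitiator) : socket("socket") { SC_THREAD(run); }

    void access(tlm::tlm_command cmd, sc_dt::uint64 addr, unsigned int* data) {
        tlm::tlm_generic_payload trans;
        sc_time delay = SC_ZERO_TIME;
        trans.set_command(cmd);
        trans.set_address(addr);
        trans.set_data_ptr(reinterpret_cast<unsigned char*>(data));
        trans.set_data_length(sizeof(*data));
        trans.set_streaming_width(sizeof(*data));
        trans.set_byte_enable_ptr(nullptr);
        trans.set_response_status(tlm::TLM_INCOMPLETE_RESPONSE);
        socket->b_transport(trans, delay);    // loosely timed blocking call
        sc_assert(trans.is_response_ok());
        wait(delay);                          // consume the annotated delay
    }

    void run() {
        unsigned int wdata = 0xCAFEF00D, rdata = 0;
        access(tlm::TLM_WRITE_COMMAND, 0x40, &wdata);
        access(tlm::TLM_READ_COMMAND,  0x40, &rdata);
        std::cout << "read back 0x" << std::hex << rdata
                  << " at " << sc_time_stamp() << std::endl;
    }
};

int sc_main(int, char*[]) {
    TestInitiator init("init");
    SimpleMemory  mem("mem");
    init.socket.bind(mem.socket);             // point-to-point binding; a real platform uses an interconnect
    sc_start();
    return 0;
}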

Finally, as design complexity and integration increase, the number of different verification domains to be considered in a single system grows. Almost all chips have some mixed-signal content, requiring not only detailed circuit simulation to verify performance, but also chip-level mixed-signal verification at the boundaries. As more design IP becomes available as transaction-level models, system developers will push to perform mixed TLM/RTL verification. As more processors are embedded in SoCs, software-driven verification will play an important role in validating on-chip connectivity and compiler tool chains by using the embedded processor to drive hardware tests. As more chip features are implemented in software, complex hardware/software interactions will require increased attention to validate not only functionality, but also chip performance and power characteristics.
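The flavor of software-driven verification mentioned above can be as simple as a bare-metal routine, cross-compiled for the embedded processor and run against the RTL in simulation or emulation, that touches every block over the on-chip bus. The sketch below is one hypothetical example; the peripheral base addresses and scratch-register offset are invented for illustration and would come from the chip's real address map.

// Hypothetical software-driven connectivity test run on the embedded core.
#include <stdint.h>

#define UART0_BASE   0x40001000u   /* invented peripheral base addresses */
#define DMA0_BASE    0x40002000u
#define SCRATCH_OFF  0x0Cu         /* invented read/write scratch-register offset */

static volatile uint32_t* reg(uint32_t base, uint32_t off) {
    return (volatile uint32_t*)(uintptr_t)(base + off);
}

/* Write a distinct pattern to each block's scratch register over the on-chip
 * bus and read it back; return the number of blocks that failed to respond. */
int connectivity_test(void) {
    const uint32_t bases[] = { UART0_BASE, DMA0_BASE };
    int failures = 0;
    for (unsigned i = 0; i < sizeof(bases) / sizeof(bases[0]); ++i) {
        uint32_t pattern = 0xA5A50000u | (1u << i);
        *reg(bases[i], SCRATCH_OFF) = pattern;        /* bus write through the fabric */
        if (*reg(bases[i], SCRATCH_OFF) != pattern)   /* bus read-back */
            ++failures;
    }
    return failures;   /* 0 means every block answered over the interconnect */
}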

For tool vendors and end-users, these cross-domain verification requirements imply the need for a comprehensive and coherent tool flow, IP and methodology with tight links between verification domains to minimize information loss and maintain productivity. From an expertise perspective, cross-domain verification tasks will drive the need for more verification generalists who understand enough about multiple verification domains to catch bugs that lie at the boundaries. Having architectural, functional verification, circuit simulation and embedded-software experts will still be necessary, but not sufficient, for ensuring the full system is verified on schedule and under budget. The "tall and thin" engineer will need to work closely with the "medium and broad" engineer (for lack of better terms) to deal with the increased variety and complexity in design techniques and verification tools.

Some say that verification will be the barrier to future progress in the semiconductor industry, but this does not have to be the case. Software-to-silicon verification indeed holds immense challenges at 45nm and beyond for system designers and tool vendors, with multiple paradigm shifts converging at the same time. Close collaboration between designers and vendors can uncover the key bottlenecks in the verification process and enable breakthroughs in tools, methodologies, IP, services and standards.

Companies that recognize and respond to these economic and technical trends in verification will gain important competitive advantages. Engineers who recognize and respond to these trends will also position themselves well for the future. "Designer" may continue to be the most coveted title in a system-development project, but it is the "Verifier" who will increasingly determine the project's ultimate success.

- Tom Borgstrom
Director of Marketing

- Badri Gopalan
Principal Engineer
Verification Group
Synopsys




