Cadence CTO mulls over CAD architecture, EDA

Posted: 18 Sep 2006

Keywords: Cadence Design, EDA, CAD architecture, EDA tools, nanotechnology

Vucurevich: I'm hopeful that the innovative energy in the industry is still alive and well.

As senior VP and CTO of Cadence Design Systems Inc., Ted Vucurevich is paid to think about the future of EDA. Vucurevich sat down recently with EE Times' Richard Goering and discussed developments ranging from new CAD architectures to EDA tools for nanotechnology-based systems like labs-on-a-chip.

EE Times: What concerns are foremost on the minds of your customers?
Ted Vucurevich: The stability of the design environment, the predictability with which engineering organizations can deliver products on schedule and the overall costs of the programs that get designs into the market.

What will help design environments become more stable?
People ask us, 'Why am I worrying about tool integration? Isn't that your job? How do we team up on this problem in a different way than we have in the past?' You have to understand what a customer environment looks like, with all the issues and challenges of design teams, and provide an integration platform that allows them to be more productive.

Customers must say, 'You are responsible for the backbone of my design technology offering. To get my business, you have to allow me to differentiate in some way, whether through my own internal capability or another piece of technology.' It has to be a differentiation interface rather than an integration interface.

If we're going to take on more of that role, we have to have better ways of interacting in 'coopetition,' if you will. We need a way of OEMing to each other while maintaining competition at different levels.

Will the OpenAccess database play a role?
Without a data model for the information that needs to be shared, other discussions about higher-level integrations and APIs will not happen. OpenAccess was an early part of the strategy to make at least a portion of the industry more capable of integration. We've reached a point where it's stable enough technologically. And it has been adopted by enough third parties that there will be more utilization.

You talked about the need for a next-generation CAD architecture. What do you mean by that?
Today's model is a very structured decomposition: synthesize, place and route. This waterfall model of optimization has advantages. But when the complexity of what you're trying to optimize rises, the waterfall model starts to fall apart. In my opinion, we'll be able to get through one or two more process nodes with that kind of decomposition.

We need to move to an approach where technologies like synthesis, placement and routing are refined together throughout the process. It's almost like a peer-to-peer relationship: Synthesis technology is a peer to placement technology, which is a peer to routing technology. I can be inside any of these technologies, asking questions of the others at any point. So I could be in the detailed routing phase and discover that I have a placement problem, and that I need to re-invoke placement and repair the problem. Today, I can't undo a presumption from an earlier state.
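
To make the peer-to-peer idea concrete, here is a minimal sketch in Python; the class and method names are entirely hypothetical and not drawn from any Cadence tool. Each engine works against a shared design database, and the router can call back into the placer when detailed routing uncovers a placement problem, instead of handing results down a one-way waterfall.

```python
class DesignDB:
    """Shared, incrementally updatable design state."""
    def __init__(self):
        self.netlist = {}     # net  -> list of connected gates
        self.placement = {}   # gate -> (x, y) site
        self.routes = {}      # net  -> list of route points

class Placer:
    def place(self, db, gates):
        # (Re)place only the requested gates; a real placer does far more.
        for i, g in enumerate(sorted(set(gates))):
            db.placement[g] = (i % 10, i // 10)

class Router:
    def __init__(self, placer):
        self.placer = placer  # peer relationship: the router can call the placer

    def route(self, db):
        for net, gates in db.netlist.items():
            if self._congested(db, gates):
                # Found a placement problem during detailed routing: re-invoke
                # placement for the offending gates and continue, instead of
                # failing and restarting the whole waterfall from the top.
                self.placer.place(db, gates)
            db.routes[net] = [db.placement[g] for g in gates]

    def _congested(self, db, gates):
        # Stand-in congestion check: two gates assigned to the same site.
        sites = [db.placement.get(g) for g in gates]
        return len(sites) != len(set(sites))

db = DesignDB()
db.netlist = {"n1": ["g1", "g2"], "n2": ["g2", "g3"]}
placer = Placer()
placer.place(db, ["g1", "g2", "g3"])   # initial placement of all gates
Router(placer).route(db)               # routing may call back into placement
print(db.routes)
```

The point of the sketch is the calling relationship, not the algorithms: either engine can reopen a decision the other made earlier in the flow.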

Also, if I have an incremental change to the input, do I get an incremental change to the output? Many of the algorithms being used today do not have that characteristic, synthesis being the poster-child example. If I add a few lines, I can end up with an enormous amount of difference in my output.
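
As a rough illustration of the property he is asking for, the sketch below (Python, with hypothetical names and a toy "compile" step) caches per-module results keyed by a hash of each module's source, so editing a few lines re-derives only the module that changed. Real synthesis optimizes across module boundaries, which is exactly why this contract is hard to honor; the sketch only shows the desired input/output behavior.

```python
import hashlib

def compile_module(name, source):
    # Stand-in for an expensive synthesis/mapping step.
    return f"netlist({name}, {len(source)} chars)"

class IncrementalFlow:
    def __init__(self):
        self._cache = {}   # module name -> (source hash, compiled result)

    def run(self, modules):
        """modules: dict of module name -> source text."""
        results, recompiled = {}, []
        for name, src in modules.items():
            key = hashlib.sha256(src.encode()).hexdigest()
            cached = self._cache.get(name)
            if cached and cached[0] == key:
                results[name] = cached[1]          # unchanged: reuse old output
            else:
                out = compile_module(name, src)    # changed: recompute just this
                self._cache[name] = (key, out)
                results[name] = out
                recompiled.append(name)
        return results, recompiled

flow = IncrementalFlow()
_, first = flow.run({"alu": "add/sub logic", "fifo": "queue logic"})
_, second = flow.run({"alu": "add/sub/mul logic", "fifo": "queue logic"})
print(first)    # ['alu', 'fifo']  (everything built the first time)
print(second)   # ['alu']          (only the edited module rebuilt)
```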

Does this architecture extend to verification?
I think it's everything, but it breaks into two parts in terms of the time frame. There's implementation-space integration and aggregation; clearly, that's the one in our future. But there's a similar thing happening around system design. If you look at power today, where is it specified? Where does it say that it's valid to go from a shut-off state to a turn-on state or not? It's arbitrary. That's why with our Power Forward initiative, we said, 'We have to get an organization around the information.' We'll try to drive changes in that space from the bottom up. Top-down, it's being driven by relationships among multiple disciplines, which are being integrated from a system-specification point of view.

What disciplines are coming into integration?
In the near space, it's hardware and software. A lot of effort is already in that space, and it will continue to drive and evolve toward what we see as the right partitioning of the design process. And that's going to cause a change. We saw transaction-level models become real. As they become part of the design step between abstract specification and implementation, you can see how things are restructuring around it.

In the longer term, we will play a role as integrators of multiple disciplines to produce microsystems and nanosystems. We will see additional complexity as additional domains become part of the picture. They'll converge on the electrical/software system as the integration point.

How will hardware and software design move closer together?
The first place is in the intimacy with which the two must be designed together and validated. Validation is probably the bigger part today. We can decompose a system reasonably well, but how do I validate that I've done the right thing?

Tremendous progress has been made on the design side via Alberto Sangiovanni-Vincentelli's work in the Metropolis program at the University of California, Berkeley. You have specifications coming from multiple disciplines, and you have to interpret that somehow. The Metropolis work gives us a formal basis for that.

Transaction-level modeling is probably the right point where hardware guys specify something in enough detail so software guys can have enough performance to validate some primary elements of the interaction. But I also think some of what we've learned to deal with in the formal sense for hardware verification can be applied to the software world. Watch that space with new technologies coming out of the traditional EDA industry.
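
For readers unfamiliar with the term, here is a minimal sketch of transaction-level modeling as a hardware/software meeting point, again in Python with invented register names: the hardware side exposes whole read/write transactions for a hypothetical timer peripheral, and driver-style software validates the interaction against that model without any cycle-level detail.

```python
CTRL, COUNT = 0x00, 0x04   # register offsets in the hypothetical timer

class TimerTLM:
    """Transaction-level model: whole register accesses, no clocks or pins."""
    def __init__(self):
        self.regs = {CTRL: 0, COUNT: 0}

    def write(self, addr, value):          # one bus write = one transaction
        self.regs[addr] = value & 0xFFFFFFFF

    def read(self, addr):                  # one bus read = one transaction
        if addr == COUNT and self.regs[CTRL] & 1:
            self.regs[COUNT] += 1          # crude stand-in for the timer ticking
        return self.regs[addr]

def driver_smoke_test(dev):
    """'Software' validating its view of the hardware contract."""
    dev.write(CTRL, 1)                     # enable the timer
    first, second = dev.read(COUNT), dev.read(COUNT)
    assert second > first, "timer should advance while enabled"

driver_smoke_test(TimerTLM())
print("driver/hardware interaction validated at transaction level")
```

Because every interaction is a whole transaction rather than a signal-by-signal waveform, the model runs fast enough for software teams to exercise it, which is the performance point Vucurevich is making.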

Design challenges for nanosystems also include mechanical, fluidic, chemical and biological components. What's EDA's role?
It will follow the pattern that it followed for electrical systems. The selling point will be models that allow us to move from scientific experimentation to engineering processes. And I don't mean high-level models. High-level models are necessary, but they aren't the models necessary to engineer integrated systems, and that's the thing I think we're in a unique position to provide.

Will EDA vendors have the necessary expertise for that?
Not out of the box. First, we have to understand the different spaces. We're the only industry that has ridden an exponential curve in integration complexity for 20 years. Thus, our techniques and our approaches to modeling in that kind of environment are quite mature. The other guys have phenomenal domain-specific knowledge, but the fundamental question of what you need to scale and integrate is a question we can answer.

What kinds of systems might the EDA community help with?
The lab-on-chip is a great example. A lot of labs-on-chip use DNA microarrays. You have to have something that prints a set of DNA patterns on a substrate. Advanced labs-on-chip might have some kind of microfluidic integration, as well as electronics and electromechanics to control delivery to the reaction site.

From an engineering-process standpoint, you have electromechanical, microfluidic and software design, plus information technology. If I have good software techniques for information mining, I can trade those off against the accuracy and quality required of the biological experimentation. It involves making trade-offs in different domains.

How is EDA doing this year?
There will be separation between people who have claimed DFM capability and those who actually deliver it. More will be shown in the verification and modeling space, and verification management is going to be important.

I'm hopeful that the innovative energy in the industry is still alive and well. There are people out there who have some very novel ideas. The fact that the big guys spend huge amounts of money at Design Automation Conference has somehow overwhelmed the rest of the messages going on. This year, we'll be doing our part to make sure the other guys get some air time.

- Richard Goering
EE Times



