EE Times-Asia

DFM demands holistic approach

Posted: 16 Mar 2007

Keywords: DFM, manufacturing's impact on design, DFM techniques, DFM for 45nm designs, DFM tools for 45nm

Through years of IC process evolution, manufacturing's impact on design could be addressed with simple design rules and basic electrical models. That made it easy for manufacturing to communicate with design: simply follow the design rules, simulate with the models, and chip yield was effectively guaranteed.

Starting with the 90nm node, this concept was put to the test. While traditional design rules are still critical to the success of a 45nm design, they are being augmented by statistical yield models that go well beyond a simple pass/fail check. These methods include techniques such as critical-feature analysis, critical-area analysis, litho-friendly design and advanced simulation, all used within an infrastructure that manages the interaction among the different techniques to holistically improve design quality. By combining these techniques, designers can arrive at a comprehensive picture of design quality, one built on the most advanced rules-based and models-based analysis available today.

With advanced processes, designs that are verified "design-rule-checking clean" can still result in poorly-yielding or even non-functioning silicon. Even though a particular design element or configuration in and of itself can be physically manufactured, in the context of a specific full-chip layout, the same manufacturable element may dramatically increase the probability of failure. For example, many designs today are automatically generated specifically to meet the minimum design constraints, with minimum width and spacing. This is believed to help keep the total design size as small as possible. But although a specific placement of a minimum-spec configuration may be manufacturable, having many such situations increases the chances of failure on that chip.

The next technique is to analyze the layout's widths, spacings and densities to assess where problems are likely to occur, and to determine how much risk is associated with each one. Proper analysis requires comprehensive reporting and visualization to direct the designer to the most important issues, or to drive enhancement tools to make the right changes.
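As a toy illustration of this kind of analysis, the sketch below scans one routing track of parallel wires and counts how many adjacent pairs sit exactly at the minimum-spacing rule or below a recommended spacing. The rule values, wire coordinates and the one-dimensional simplification are all hypothetical, not taken from any real rule deck:

```python
# Illustrative sketch of spacing analysis on one routing track; wires are
# (left_edge, width) pairs in nanometres, and all numbers are hypothetical.
MIN_SPACE = 100        # assumed minimum-spacing design rule (nm)
RECOMMENDED = 140      # assumed recommended spacing (nm)

wires = sorted([(0, 50), (150, 50), (300, 50), (500, 50)])

at_minimum = below_recommended = 0
for (x1, w1), (x2, _) in zip(wires, wires[1:]):
    space = x2 - (x1 + w1)
    assert space >= MIN_SPACE, "design-rule violation"
    if space == MIN_SPACE:
        at_minimum += 1          # manufacturable, but higher failure risk
    if space < RECOMMENDED:
        below_recommended += 1   # flag for the enhancement tools

print(f"{at_minimum} pairs at minimum spacing, "
      f"{below_recommended} below the recommended spacing")
```

A real tool would do this over full-chip polygon data and roll the counts up into the reports and visualizations described above; the point here is only that minimum-spec configurations pass the rule check yet can still be counted and prioritized as risk.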

Manufacturing determines the yield models and the information needed to communicate design-to-silicon interactions. It is how designers use these models that directly affects yield. The difficulty is that each step in the manufacturing process influences yield differently, so you need a system that catches the known challenges of critical area and lithography along with the lesser-known factors of resist and etch.
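Because each step influences yield differently, a common simplification is to treat overall yield as the product of per-step yields, which also shows which step limits the result. The step names and numbers below are purely illustrative:

```python
# Hypothetical sketch: overall yield as the product of per-step yields.
# Step names and values are illustrative, not from the article.
from math import prod

step_yields = {
    "litho": 0.98,
    "etch": 0.97,
    "cmp": 0.99,
    "particle_defects": 0.95,
}

overall = prod(step_yields.values())
worst = min(step_yields, key=step_yields.get)
print(f"overall yield ~ {overall:.3f}, limited most by {worst}")
```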

A generic geometric processing engine that supports both rules and models is critical to identifying yield-limiting structures. This may include mathematical models that describe the variation in fail rates of recommended rules, or defect-density distributions for particle shorts and opens. This information is released in technology files that support recommended-rule and critical-area analysis.

Recommended-rule analysis identifies critical features that determine the impact of all sources of yield fallout: random, systematic and parametric. Besides random particles, these factors can address the impact of resist and etch on the yield.

More advanced critical-area analysis determines the areas of the design where a particle of a certain type and size would cause a short or open in the layout. Because data overload is one of the obstacles in design-for-manufacturing (DFM), critical-feature analysis techniques should automate the process of sifting through the data. Based on this analysis, designers can determine the worst region of the worst-impacting check, and run a drill-down to statistically weigh individual occurrences of the issues within that region to further prioritize the critical features. Through the variety of reports and visualizations, the design team can determine the design practices or automated design tool behaviors that affect the yield of their products most. This helps them make informed trade-offs with yield, area and performance.
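A heavily simplified sketch of critical-area analysis for particle shorts is shown below: for a pair of parallel wires, a particle causes a short only when its size exceeds the spacing, the resulting critical area is averaged over a 1/r^3 defect-size distribution (a common empirical model), and a Poisson model converts it into a yield estimate. All geometry and density numbers are hypothetical:

```python
from math import exp

# Hypothetical numbers throughout: a wire pair of length L at spacing s,
# defect sizes r0..rmax with a 1/r^3 size distribution, defect density D0.
L, s = 100.0, 0.1            # microns
r0, rmax = 0.05, 1.0         # microns
D0 = 0.01                    # defects per square micron

def short_critical_area(r):
    # A particle shorts the pair only when its size exceeds the spacing;
    # in this toy model, the vulnerable band then grows with (r - s).
    return L * max(0.0, r - s)

# Numerically average the critical area over the defect-size distribution.
steps = 10_000
dr = (rmax - r0) / steps
rs = [r0 + (i + 0.5) * dr for i in range(steps)]
weights = [r ** -3 for r in rs]
norm = sum(weights) * dr
avg_ca = sum(short_critical_area(r) * w for r, w in zip(rs, weights)) * dr / norm

yield_est = exp(-D0 * avg_ca)   # Poisson yield model
print(f"average critical area {avg_ca:.2f} um^2, estimated yield {yield_est:.3f}")
```

Production tools compute this over every layer of a full-chip layout and for opens as well as shorts; the value of the statistical weighting is exactly the prioritization described above, since a slightly wider spacing can shrink the average critical area dramatically.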

Predicting the effects
Using litho-friendly design (LFD) techniques, a layout designer can predict the effects of process variation on the printability of a specific chip, which makes it possible to design for improved robustness so that the chip is much less sensitive to process window variations.

Litho-friendly design checking performs three important tasks:

  1. Gathers data about how the design will print at a range of possible dose/focus conditions, not just at the optimal settings;

  2. Predicts specific failures or potential yield inhibitors;

  3. Assigns each portion of the design a "manufacturability score," which reflects how well that portion will manufacture given the specific process window.
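As a toy illustration of task 3, a region's manufacturability score can be computed as the fraction of dose/focus corners at which the printed critical dimension stays within tolerance. The `printed_cd` function below is a made-up stand-in for a real lithography simulator, and every number is hypothetical:

```python
# Hypothetical sketch: score one layout region over a dose/focus grid.
TARGET_CD, TOL = 45.0, 2.3       # nm; assumed critical-dimension tolerance

def printed_cd(drawn_cd, dose, focus):
    # Toy response model, not a real simulator: dose biases the line,
    # defocus narrows it quadratically.
    return drawn_cd * dose - 40.0 * focus ** 2

doses = [0.95, 1.0, 1.05]
foci = [-0.05, 0.0, 0.05]        # microns of defocus

in_spec = sum(
    abs(printed_cd(TARGET_CD, d, f) - TARGET_CD) <= TOL
    for d in doses for f in foci
)
score = in_spec / (len(doses) * len(foci))
print(f"manufacturability score: {score:.2f}")
```

A score of 1.0 means the region prints within spec across the whole assumed process window; lower scores point the designer at the regions that need hardening first.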

Data about the layout and its likely response to manufacturing variations is gathered through advanced techniques that incorporate process variation (PV) bands. A PV band is a geometry that shows how printing responds to process variations: it represents the area within which a feature will print as process conditions vary. The PV band is created by calculating the silicon-printed image at various process conditions and combining the resulting images into a composite. It predicts yield failure in several ways, including the four most common failure configurations: pinching, bridging, area overlap and critical-dimension variability. These predicted configurations are compiled into LFD rules, which the designer uses during the design process to analyze the flagged areas.
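A one-dimensional caricature of PV-band construction is sketched below: a feature's printed width is evaluated at several process corners, the minimum and maximum widths form the band, and the band is checked for pinching and critical-dimension variability. The `printed_width` model and all numbers are hypothetical stand-ins for real litho simulation:

```python
# Simplified 1-D illustration of a PV band: simulate a feature's printed
# width at several process corners and combine the results.
def printed_width(drawn, dose, focus):
    # Toy model (hypothetical): dose scales the line, defocus shrinks it.
    return drawn * dose - 60.0 * focus ** 2

drawn = 50.0                                  # nm
corners = [(d, f) for d in (0.92, 1.0, 1.08) for f in (-0.06, 0.0, 0.06)]
widths = [printed_width(drawn, d, f) for d, f in corners]

inner, outer = min(widths), max(widths)       # the PV band edges
cd_variability = outer - inner
PINCH_LIMIT = 40.0                            # assumed minimum printable width, nm
pinching = inner < PINCH_LIMIT

print(f"PV band: [{inner:.1f}, {outer:.1f}] nm, "
      f"CD variability {cd_variability:.1f} nm, pinch risk: {pinching}")
```

In two dimensions the same idea yields the composite band geometry described above, with bridging checked between neighboring bands rather than within one feature.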

[Figure: DFM tools can improve cross-block gate variation.]

LFD failure rules identify areas where variation in printing in response to process variation is likely to cause critical errors from a manufacturing, timing and power standpoint. Typical design rule violations can be fixed by edge movement or morphological changes (clipping corners).

Several DFM tools are already known to directly affect timing analysis for the 45nm technology node. Among them are those that simulate lithography and etch effects over process windows, add vias to reduce random defects, change wire routing to balance metal densities, and use chemical-mechanical polishing (CMP) models to better predict resistance and capacitance values.

Lithography and etch simulators can now extract transistor parametrics based on manufacturing parameters. They can also be used to highlight design areas where the printed image varies too far from the drawn one.

This enables the designer to adjust the design so that it will be robust and predictable over the process window.

It also gives the advanced designer an opportunity to create context-independent designs. This is extremely important for design teams to understand, especially at the small-block level. It follows the same methodology as density learning in 130nm technologies: the earlier issues can be stabilized in the design flow, the faster the final stages of design will complete.

Transistor and via layout for DFM should follow hierarchy methodologies used for timing closure. This way, little will need to be revalidated at the full-chip level. Experiments and library designs are showing that at 65nm layouts, this is possible. More ideas are still coming forward. As with the density requirements in 130nm processes, the new DFM tools will find creative implementations to make better, more robust designs without breaking the router and timing-closure flows.

Interdependent DFM
What may not be apparent in this discussion of different DFM techniques is that they cannot be applied independently of one another and still be expected to produce the highest-yielding silicon.

The infrastructure required to make trade-offs among the different techniques and to determine the optimal approach should be one where the actual software takes into account the implications of other DFM issues. The idea is to create a holistic approach to DFM for the design and analysis flow.

As the challenges of DFM increase, the industry finds itself needing more techniques. The situation is similar to verifying timing in SoC design: there is a digital part and an analog section at different levels of extraction, but verifying that they all work together demands a mixed-mode, multiple-level simulation environment. Such an environment lets the designer identify the type of analysis needed and allows the different analyses to work together. In the yield scenario, rules-based and models-based approaches coexist, and the analysis software determines the best approach for each unique situation.

- Jeff Wilson
Product Marketing Engineer, Mentor Graphics Corp.



