Scaling debate stalks chip conference

Posted: 12 Feb 2002

Keywords: solid-state circuits conference, logic, memory, transistor, processor

From the wide range of panels and tutorials at the International Solid-State Circuits Conference ending Wednesday (Feb. 6), from the myriad papers, from the conversations in the hotel halls, a single gnawing issue grew from hints and whispers to a groundswell of concern: Will IC technology continue to scale?

From one perspective, the future holds only promise: more logic, more memory, more speed. And the papers were there last week to prove it. Sun Microsystems Inc. reported a 1.1GHz, 87.5 million-transistor Sparc processor in 130nm CMOS. Fujitsu Ltd. described a 533MHz media processor using 10.4 million transistors in a 110nm CMOS process. And, of course, memories grew apace.

But beneath this optimistic veneer lay some curious hints. Many of the advanced digital chips used relatively conservative processes. Intel Corp.'s much-touted paper on the McKinley CPU, for instance, was based on a conservative 180nm process, as were many of the other aggressive digital designs. Clearly, something is going wrong beneath the level of abstraction that presents the designer with ideal logic devices.

Certainly there was no lack of interest in further scaling from the systems side of the equation. Fred Boekhorst, senior vice president of research at Philips, told his audience that power, die size and cost need to be pushed down dramatically for what he called ambient-intelligent systems to become ubiquitous.

"These systems will adapt to the user in a context-aware fashion and will differ substantially from contemporary equipment," Boekhorst said.

The concept seems plausible: billions of tiny devices packed into hundreds of millions of systems handling any number of tasks. But Boekhorst cautioned that such systems will serve many masters at once, straining clever design techniques.

"At one end of the spectrum, low-data-rate sensor and actuator control signals form the interface between environment and system with data rates as low as 1Gbps or less," he said. "At the other end, there will be a need to interface large HDTV displays with data rates as high as 5Gbps."

Ambient intelligence will require subsystems that span nine orders of magnitude in bandwidth or computational load and six orders of magnitude in allowable power dissipation, he added. Many will have to trim power consumption to just 100µW, whereas a contemporary Bluetooth device consumes at least 50mW, he noted.
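A quick back-of-envelope check squares those figures; the endpoint values below are our assumptions read off the quotes above, not Boekhorst's own arithmetic:

```python
import math

# Assumed span endpoints: a ~1bps sensor link vs. a 5Gbps HDTV link.
sensor_rate, hdtv_rate = 1.0, 5e9            # bits/s
print(math.log10(hdtv_rate / sensor_rate))   # ~9.7 -> "nine orders of magnitude"

# Power target vs. a contemporary Bluetooth radio.
bluetooth_p, ambient_p = 50e-3, 100e-6       # watts
print(bluetooth_p / ambient_p)               # 500x below today's Bluetooth
```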

And where Bluetooth designs now are targeting the magic $5 cost point, ambient-intelligent devices must scale to pennies. "Within a five-year time scale, a bill of materials comprising 20 cents for the IC, 10 cents for the battery and 10 cents for the packaging seems plausible," Boekhorst said. That threw down a gauntlet that the chip design community was in no hurry to pick up.

Two perspectives

Both in the opinions they voiced and in the papers they presented, chip design attendees segregated themselves into two irreconcilable camps: optimists and pessimists. But this apparent disagreement reflected not so much differing views on the facts but differing attitudes toward them. The future of the IC, this year's conference suggested, depends greatly on the level of abstraction from which you visualize the technology.

Digital-systems designers are privileged to hold perhaps the highest level of abstraction. To them, even the most complex ICs comprise arrangements of registers and logic, all idealized devices that produce unambiguous digital signals with perfectly predictable time delays. Memories and mixed-signal functions are black boxes that can be relied upon to do their jobs, however messy they might be inside. Yet even here, the nuts and bolts of design are only getting more complex. Wires aren't just wires anymore. As scaling continues, they are wires with capacitive and inductive paths to other parts of the chip. And transistors aren't just transistors.

As ironically stated in the title of a keynote address by Texas Instruments Inc. technology vice president Dennis Buss, MOSFET switches become MOSFET dimmers. Buss and numerous other speakers described a world in which lower and lower operating voltages are gradually turning switching transistors, and the gates, flip-flops and registers made from them, into far-from-ideal devices.

Numerous problems are striking all at once as devices scale. Channels get shorter, gate dielectric layers get thinner and metal lines turn into tall, slender walls. One of the first issues to show up, and the one most discussed at ISSCC, was leakage current in the transistors. The effect stems from two sources. First, as threshold voltages get lower, the off-state current of the transistors increases: you can't ever quite turn them off. This electrical incontinence is what Buss referred to in his title.

Second, as gate dielectrics get thinner, the leakage across the supposed insulator increases as well. The result is a switching transistor that switches between a lot of current in the on state and a little less current in the off state. Across 5 million or 10 million transistors, the impact on energy consumption can be appalling.

Device for device, the leakage current of a 90nm, 1.2V circuit is nearly two orders of magnitude higher than that of a 250nm, 2.5V circuit, Buss warned. The traditional dopants used to enhance manufacturing yields, moreover, only exacerbate the problem.
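The exponential mechanism behind that warning shows up readily in numbers. Here is a minimal sketch assuming a textbook subthreshold model; the threshold voltages and device parameters are illustrative, not TI process data:

```python
# Subthreshold leakage: off-current gains one decade for every S volts of
# threshold voltage given up. All values below are illustrative assumptions.
I_THRESH = 1e-6   # A/um: rule-of-thumb drain current at Vgs = Vt
S = 0.090         # V/decade: subthreshold swing (typically 85-100mV/dec)

def i_off(vt_volts):
    """Off-state (Vgs = 0) drain current per um of width."""
    return I_THRESH * 10 ** (-vt_volts / S)

# Supply scaling drags Vt down with it, so off-current grows exponentially:
ratio = i_off(0.27) / i_off(0.45)      # assumed 90nm-era vs. 250nm-era Vt
print(f"{ratio:.0f}x more leakage")    # ~100x: 'nearly two orders of magnitude'

# Summed across a 10-million-transistor die at Vdd = 1.2V:
p_leak = 10e6 * 1.0 * i_off(0.27) * 1.2   # 1.0um of width per device (assumed)
print(f"{p_leak * 1e3:.0f} mW of standby leakage")   # burned while doing nothing
```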

"I'm here to tell you," said Buss, "the party's over!"

ISSCC sessions were replete with echoes of Buss' warning. Paper after paper described low-voltage, low-power designs with aggressive techniques for controlling leakage current. Among the numerous Intel papers, multiple supply rails and the use of body bias to pinch off the channel current in leaky MOSFETs were the favored techniques. Among other vendors, variable supply voltages and power-gating techniques were used.
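The body-bias technique pays off because the threshold voltage it restores acts exponentially on leakage. A rough sketch of the mechanism follows, with assumed body-effect parameters rather than anything Intel disclosed:

```python
from math import sqrt

# Reverse body bias raises Vt through the body effect; each S volts of
# added Vt cuts off-current by 10x. Parameter values are assumptions.
GAMMA = 0.4    # V**0.5: body-effect coefficient (assumed)
PHI_F2 = 0.7   # V: 2*phi_F surface-potential term (assumed)
S = 0.090      # V/decade: subthreshold swing

def delta_vt(v_sb):
    """Threshold-voltage rise from reverse source-to-body bias (NMOS)."""
    return GAMMA * (sqrt(PHI_F2 + v_sb) - sqrt(PHI_F2))

dvt = delta_vt(0.5)                             # 0.5V of bias in standby
print(f"Vt rises {dvt * 1e3:.0f} mV")           # ~104 mV
print(f"leakage falls {10 ** (dvt / S):.0f}x")  # ~14x in this sketch
```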

But power dissipation was not the only goblin lurking in the darkened meeting rooms of ISSCC. With the reduced voltages and increased leakage current come lower noise margins in the transistors themselves, on top of the increased noise from the interconnect.

To digital designers, even at the transistor level, noise can often be modeled as increased delay. But to analog designers, noise is the inseparable evil twin of the signal: It strikes at the essence of analog functionality. And it is among analog designers that the most objections to further scaling were raised.

"In terms of dynamic range, low-voltage scaling buys you nothing," insisted professor Asad Abidi of the University of California at Los Angeles, an RF expert who participated in both a Tuesday (Feb. 5) panel and the Sunday (Feb. 3) paper presentations. RF circuits designed to ensure the sound operation of RF amplifiers, filters, mixers and oscillators depend on circuit techniques that operate best with long-channel, thick-oxide devices. Those transistors operate best at 2.5 V or higher, Abidi insisted.

Widely touted receiver architectures like direct-conversion zero-IF schemes, moreover, can barely overcome 1/f noise with low-voltage supplies and do not meet GPS specifications, Abidi said. A better receiver architecture is dual-stage indirect downconversion, but such a scheme depends more on traditional filter architectures, which need supply voltages higher than 1.8V, he pointed out. It is likely that the portable RF devices of the future will reflect mixed analog-digital architectures, with some blocks using low-voltage CMOS and others long-channel devices, he conceded.

It was no coincidence, then, that some of the most advanced RF, analog and mixed-signal achievements at the conference relied on process geometries as coarse as 300nm.

Nor was RF the only dissenting discipline. Buss' nemesis and provocateur, Analog Devices Inc.'s Lew Counts, sparked a heated exchange by reminding those attending the Moore's Law panel session that Buss' own integrated cell phone presentation contained "asterisked items" (such as the power amp) whose signal-launch functions could not be performed in CMOS. In no uncertain terms, Counts claimed that Moore's Law is poison for analog designers.

Another group of designers works at the same level of abstraction as analog designers: full-custom and library cell designers. And there were hints among the ISSCC presentations that this group also faces growing challenges. One such indication is the lengths to which custom designers are going to make things work right. In the Intel McKinley paper, for instance, designers reported overcoming capacitive coupling between adjacent metal lines by actively injecting inverted noise into the victim nets. Critical parts of the design relied on domino logic.
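Those lengths make sense given how coupling scales. In a first-order charge-sharing model, the glitch an aggressor line couples onto a quiet neighbor grows with the sidewall share of the victim's capacitance, which is exactly what tall, slender wires increase; injecting inverted noise amounts to cancelling that coupled charge. A sketch with assumed capacitance ratios, not Intel's extracted data:

```python
# First-order charge-sharing estimate of crosstalk on a quiet victim net.
VDD = 1.5  # V: assumed core supply

def glitch(c_couple, c_ground):
    """Peak voltage coupled onto the victim when the aggressor swings Vdd."""
    return VDD * c_couple / (c_couple + c_ground)

# Wide, flat wires: sidewall coupling is a small share of total capacitance.
print(f"{glitch(0.2, 1.0):.2f} V")  # ~0.25V: inside the noise margin
# Tall, slender scaled wires: sidewall coupling dominates.
print(f"{glitch(1.0, 0.4):.2f} V")  # ~1.07V: fatal for a domino node
```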

A memory oasis

One group of designers was surprisingly sanguine about the shrinking geometries. Among memory vendors, the talk was of successfully scaling DRAM, SRAM and even the notoriously stubborn Flash cells to new geometries. For the most part, memory makers cited CMOS scaling as absolutely essential to the advancement of their products.

Tuesday (Feb. 5) panelist Alex Shubat of Virage Logic Corp., for example, saw lower voltage as essential for increasing memory cell density.

Two papers announced Flash memories in the 120nm range. And while there were no major DRAM density breakthroughs, DRAM developers outside the meeting rooms spoke confidently of scaling current bit-cell architectures to at least 90nm.

There was even an interesting hint at an entirely new direction for DRAM, with a group from Toshiba Corp. reporting success at using the floating-body effect in silicon-on-insulator transistors as a charge-storage mechanism in a one-transistor SOI DRAM.

With processes moving steadily toward 90nm production within the next year or so, a more complicated picture of the industry is emerging. Instead of all designs moving in lock step from one process node to the next, we are seeing designs finding a comfortable geometry and sticking with it.

RF and fast, precision analog appear to have hopped off the bandwagon at about 300nm. Many SoC designs seem to be quite content working with the relatively stable environment at 180nm. Yet process engineers are pressing on, with 130nm wafers in production and 100nm to 90nm on the way.

But if the papers and talks at ISSCC are any indication, there will be a further winnowing out of design teams as the industry moves through those process nodes. Increasing leakage and noise, among the issues that caused analog designers to start bailing out several nodes ago, may make it impossible for library developers to hand over a library of ideal digital devices to SoC design teams.

The libraries at these levels may either contain highly complex characteristics, beyond the modeling capability of today's high-level design tools, or may suffer significant reductions in density and performance in their attempt to make 90nm transistors and 200nm metal lines appear to behave normally.

We may, in short, be entering a world of digital haves and have-nots, where huge, superbly staffed design teams produce 1GHz processors in a 180nm process, while COT designers working with commercial libraries develop 333MHz designs in 100nm CMOS.

- Ron Wilson and Stephan Ohr

EE Times




