Global Sources
EE Times-Asia
Processors/DSPs

Intel tips 'next big thing' in supercomputing

Posted: 26 Jun 2007

Keywords: Intel research, terascale technology, 80-core chip

Prototype of Intel's 80-core chip

Intel Corp. last week showcased at its Santa Clara headquarters an array of research projects and prototypes to dazzle reporters and analysts with what it hopes will be the next big thing.

Among the technologies on display was the "first tera-scale silicon prototype." The 80-core processor, 13-by-22mm, was equivalent in power to a teraflop supercomputer that 10 years ago would have filled a 40-by-10ft room. Paolo Aseron, a hardware engineer at Intel's microprocessor lab in Hillsboro, said, "This is a proof of concept, numbers-crunching monster."

The chip was built using a 65nm manufacturing process, and each core has 5Kbytes of cache and two floating-point units. Compared to Intel's quad-core processors today, the prototype has 40 times the processing power, Aseron said.

Tera-scale computing is the future for Intel chips and platforms. The company currently has more than 100 R&D projects worldwide dedicated to addressing hardware and software challenges associated with systems that would be based on processors with dozens of cores.

Justin Rattner, chief technology officer for Intel, said the company's first tera-scale processor, codenamed Larrabee, would be capable of processing "well in excess" of a teraflop of data. The processor is set for release in 2010, but could show up in 2009, he said.

Tera programming and computing
To help software developers deal with tera-scale systems, Intel has developed a programming model called Ct, which extends the programming languages C and C++. In essence, the model deals with the complexity of parallelization, which is spreading the workload of a task among multiple processors to produce faster results.

Ct makes it possible for developers to program as if they were writing applications for one core, Mohan Rajagopalan, a research scientist at Intel's Santa Clara lab, said. Code is optimized for multiple cores at compile time and again at runtime.

Intel plans to release a preview of Ct to the open-source community in the near future, Rajagopalan said. "We're still working out the legal issues in making the whole project open source."

In demonstrating possible uses for tera-scale computing, Intel chose video editing and computer game development. The first involved the use of software smart enough to detect patterns within a 90-minute video of a professional soccer game, and extract some of the highlights.

To do that, Intel researchers had to create a model that enables the computer to learn to recognize important plays, much like a spam filter can learn to separate spam from legitimate e-mail. "We can train the computer to detect the highlights based on the model," said Xiaofeng Tong, researcher at the Intel China Research Center in Beijing.

The demonstration involved highlight-extracting software running on a computer powered by an Intel dual-core chip. The next step, which wasn't demonstrated, would add activity analysis, so the system would know the difference between a foul and a goal. Such a system would need an eight-core processor capable of 100 gigaflops. To perform action analysis on every play, the Intel model would have to run on a 64-core processor, Tong said.

Power efficiency
With tera-scale computing comes the need for power efficiency. For some time, Intel has developed chips that are more powerful, but consume the same amount of energy as previous versions.

To help continue that trend in tera-scale computing, Intel is developing "adaptive circuits" within a processor that would determine the minimum amount of performance required for a task. "We have a brain in the chip," Bryan Casper, principal engineer for Intel's circuit research lab in Hillsboro, said. All power associated with a task is turned down to a "just-enough" level.

A prototype of the technology was demonstrated in a PCIe card with a chip that consumed one-tenth the power of a card with today's chip technology, or 2.7mW versus 20-30mW. Reducing power consumption is critical, given that using today's technology to power a PCIe card with a bandwidth of a terabit per second would require 100W of energy, Casper said.

Cutting power consumption
Outside of supercomputing, Intel is also looking for greater energy efficiency in mobile devices to extend battery life. One area where it is looking to cut power consumption is in wireless communications.

Researchers showed a prototype of a Wi-Fi card with firmware that automatically turned off the power when the card was not in use. The technology also knew when to power up to receive or transmit data packets. Such cards use from 50 to 70 percent less power than standard wireless cards, researchers said.

Intel is also developing technology for server chipsets that would work in conjunction with products from power supply, storage and software management vendors. Combined with that third-party technology, the chipsets would let users cap the power use of individual servers and track their thermal output, so workloads could be dispersed to avoid "hot spots" that require additional cooling, Milan Milenkovic, principal engineer at Intel's systems technology lab in Hillsboro, said.

Another target is the number of antennas needed to support multiple wireless standards. A device that supported Wi-Fi, WiMax, a 3G cellular network and Bluetooth, for example, would require eight antennas, Ross Hodgin, a technology marketing engineer for Intel, said.

To consolidate as many antennas as possible into one, Intel is developing a switching device that would change the antenna's radio pattern depending on which wireless standard was needed. The technology would be made available to device manufacturers. "From the end user perspective, they get flexibility, a smaller form factor, and reduced cost," Hodgin said.

- Antone Gonsalves
InformationWeek



