EE Times-Asia
Controls/MCUs
Engineers work on real-time control of artificial limbs

Posted: 16 Oct 2007

Keywords: control signals, artificial limbs, sensors

Someday, users of artificial hands may be able to play the piano, thanks to implanted devices that read and harness subtle control signals in their brain. Researchers at last August's IEEE bioengineering conference in France reported advances in neural technology as well as the hurdles still ahead for delivering such capabilities.

"A prosthesis revolution is under way, and a lot of the mechanical problems are getting solved," said Nitish Thakor, a professor of biomedical engineering at Johns Hopkins University, speaking at one of two workshops on neural systems. "Now the challenge is linking prosthetic devices to the nervous system to control them in a real-time fashion."

The overall market for neural prosthetics is valued at $2.8 billion, estimates Daryl Kipke, a researcher from the University of Michigan who described work on new microarray electrodes to monitor and control brain functions.

The Defense Advanced Research Projects Agency is funding work to drive rapid improvements in the mechanical aspects of artificial limbs. That is fueling a need for better electrodes, electronics and algorithms to capture control information from the brain to drive the mechanical devices.

"The whole system of a prosthetic hand is a vast research area, with many fields involved," said Thakor. He described advances in algorithms that could let someone control basic functions, such as picking up a glass, with signals from an electroencephalogram (EEG) monitor worn on the scalp. Finer controls, such as playing a piano, would require a high-performance neural microarray sensor implanted in the brain, a development that's on the horizon, Thakor said.

'Brain-computer interface'
"The buzz is that neural technology is today where cardio technology was 20 or 30 years ago," said Kipke. "The brain-computer interface is about to be defined as we come to understand its components."

Nitish Thakor envisions neural implants as drivers of artificial limbs.

Thakor said his group has been able to achieve 99 percent accuracy in correlating the flexing of a single finger with signals from as few as 30 neurons in the M1 area of the brain's motor cortex. "But we will need to monitor many more neurons to manage more-complex tasks" and multiple fingers, he said.

Scalp-worn EEG sensors with as many as 128 electrodes do not appear to be adequate to separate the source material into control signals for an individual finger, Thakor said.

"There appears to be no specific linear correlation of signals to the flexion movement of multiple fingers. The data is all jumbled up," he said. "So my bias is to shift the problem to mathematical algorithms because the electrical problem is so daunting."
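The single-finger result Thakor describes can be illustrated with a toy linear decoder. Everything below is simulated: the neuron count matches the article, but the tuning model, noise levels and decoding method are illustrative assumptions, not the group's actual approach.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 30 M1 neurons, single-finger flexion (binary: flexed or rest).
# Each neuron's firing rate is modulated by flexion, plus noise.
n_neurons, n_trials = 30, 200
flexion = rng.integers(0, 2, n_trials)            # 0 = rest, 1 = flexed
tuning = rng.normal(0.0, 1.0, n_neurons)          # per-neuron modulation depth
rates = np.outer(flexion, tuning) + 0.3 * rng.normal(size=(n_trials, n_neurons))

# Least-squares linear decoder: predict flexion from the population rate vector.
X = np.column_stack([rates, np.ones(n_trials)])   # add bias column
w, *_ = np.linalg.lstsq(X, flexion, rcond=None)
pred = (X @ w) > 0.5

accuracy = (pred == flexion).mean()
print(f"single-finger decoding accuracy: {accuracy:.2f}")
```

With one finger and well-modulated neurons, a linear map suffices; the jumbled, non-linear relationship Thakor describes for multiple fingers is what pushes the problem toward more sophisticated algorithms.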

Researchers are using various worn and implanted devices to read and analyze brain signals at frequencies ranging from 1Hz to 10kHz.
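That 1Hz-to-10kHz range is conventionally carved into sub-bands: slow local field potentials at the low end, spike waveforms at the high end. As an illustration only (the band edges and sampling rate here are common conventions in neural recording, not figures from the article), two band-pass filters can pull the components apart:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 30_000                                        # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
lfp = np.sin(2 * np.pi * 10 * t)                   # 10Hz "field potential" component
spikes = 0.2 * np.sin(2 * np.pi * 3_000 * t)       # 3kHz "spike-band" component
raw = lfp + spikes

# Conventional split: LFP below ~300Hz, spike band from ~300Hz to 10kHz.
sos_lfp = butter(4, [1, 300], btype="bandpass", fs=fs, output="sos")
sos_spk = butter(4, [300, 10_000], btype="bandpass", fs=fs, output="sos")

lfp_out = sosfiltfilt(sos_lfp, raw)                # zero-phase filtering
spk_out = sosfiltfilt(sos_spk, raw)
print(f"LFP-band std: {lfp_out.std():.2f}, spike-band std: {spk_out.std():.2f}")
```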

"It will be an interesting debate over the next several years to determine which of these techniques is best. Perhaps all of them are good," Thakor said. "Ultimately, the decision of what to use will be based on ethical, surgical and clinical issues, although we as engineers may want to decide based on what gives us the best signals."

Key issue
A key issue for implants is how best to capture signals. Electrode arrays are used extensively today for short-term work in animals, but researchers foresee improvements that could make the devices practical for long-term use in humans.

Thin electrode arrays extend deep into brain tissue and link to signal-processing modules just under the scalp.

"I think we can cross that gap," said Kipke. "The move of brain-computer interfaces from academic to commercial technology is critical in transferring this technology to clinical use."

Kipke is chief executive of NeuroNexus Technologies, a startup spun out of his research center three years ago to develop silicon microarrays for use in humans. Eighteen months ago, the startup began developing polymer-based arrays as well.

Products aimed at short-term diagnostic use in humans will go into testing in less than a year, said Kipke, but arrays geared for use in long-term implants will take longer to bring to market.

Other startups, including Northstar Neuroscience and NeuroPace, are developing similar arrays.

Microarray developers need to find ways to reduce both the cell damage and the immune-system response caused when the arrays are inserted into brain tissue.

Kipke's group has developed a fast insertion process that minimizes cell tearing, which otherwise creates cellular debris and triggers reactions to the chemicals released. The center is training neurosurgeons in its array-insertion techniques.

Separately, researchers are making progress creating arrays that can release chemicals to suppress immune-system reactions to the arrays. Those reactions generate cells that typically encapsulate the arrays and thus can degrade their performance.

Kipke's group is also developing more-effective array structures that measure as little as 50µm across and that include lattice structures about 4µm thick. The designs are being prototyped for evaluation.

The length of time array implants can record useful information varies widely, from one day to one year. However, some researchers have been able to get continuous recordings for up to two years in arrays implanted in monkey brains.

Kipke said he thinks minimizing bleeding on the surface of the brain is one key to long-term use. "I am bullish that there is a solution, but I don't know exactly what it will be yet," he said.

Daryl Kipke is developing microarray implants for use in humans.

The signal-processing chips linked to the arrays cannot yet deliver robust real-time performance within very tight power budgets.

Maria Chiara Carrozza, a professor of biomedical engineering at the Scuola Superiore Sant'Anna, a public university in Pisa, Italy, reported on the subsystem requirements for the artificial limbs her group is developing. "We have a window of 70ms to make some decisions, so we need a controller that is really fast," she said.
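The 70ms window Carrozza cites is a hard deadline on the control loop: sensing, deciding and actuating must all fit inside it. A minimal sketch of a deadline check around a (placeholder) decision step:

```python
import time

DEADLINE_S = 0.070   # the 70ms decision window cited by Carrozza

def decide(sensor_frame):
    # Placeholder for the grasp-control computation; the real controller
    # is not described in the article.
    return sum(sensor_frame) / len(sensor_frame)

frame = [0.1] * 1_000                     # illustrative sensor snapshot
start = time.perf_counter()
decision = decide(frame)
elapsed = time.perf_counter() - start

met = elapsed < DEADLINE_S
print(f"decision in {elapsed * 1000:.2f}ms, deadline met: {met}")
```

In an embedded controller the same check would gate fallback behavior (e.g., hold the current grip) whenever a decision misses its window.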

Steeper requirements
Performance requirements will only get steeper, said Shahin Farshchi, a researcher in the electrical engineering department at UCLA. Neural engineers want systems that support as many as 100 channels, sample at rates up to 10,000Hz and offer resolutions up to 12 bits.
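Those three targets multiply into a substantial raw data rate, which is what makes the power and wireless budgets hard:

```python
# Back-of-envelope data rate for the recording targets Farshchi cites:
# 100 channels x 10,000 samples/s x 12 bits/sample.
channels, sample_rate_hz, bits_per_sample = 100, 10_000, 12

bits_per_second = channels * sample_rate_hz * bits_per_sample
print(f"{bits_per_second / 1e6:.0f} Mbit/s raw")
```

Twelve megabits per second of raw samples is far beyond what a low-power telemetry radio can stream, so practical designs must compress or reduce the data (for example, transmitting detected spike events rather than full waveforms) before it leaves the implant.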

Researchers require such performance because they must use statistical methods to analyze reactions from large populations of neurons. That's because any given behavior appears to come in response to what can be a wide variety of patterns in how groups of nerve cells fire signals.

"We may need to capture recordings from several thousand neurons simultaneously, which is beyond our current capabilities," said Justin Sanchez, a University of Florida neural prosthetics professor who helped organize one of the workshops. "The brain doesn't average signals over several trials to determine a response."

Sanchez reported on the use of echo-state networks, a form of neural network, to achieve a 15 percent improvement in the correlation between a perceived neural pattern and actual behaviors.
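An echo-state network keeps a large, fixed random recurrent "reservoir" and trains only a linear readout on top of it, which makes training cheap. The sketch below shows the general technique on a toy signal task; it is not Sanchez's model, data or task:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed random reservoir: input weights and recurrent weights are never trained.
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect its states."""
    x = np.zeros(n_res)
    states = []
    for ut in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(ut))
        states.append(x)
    return np.array(states)

# Toy task: reconstruct a nonlinear function of the input history.
T = 1000
u = rng.uniform(-1, 1, T)
target = np.roll(u, 2) * u                     # delayed product: needs memory

X = run_reservoir(u)
washout = 100                                  # discard the initial transient
A, y = X[washout:], target[washout:]

# Train only the readout, via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ y)
pred = A @ W_out

corr = np.corrcoef(pred, y)[0, 1]
print(f"readout correlation: {corr:.2f}")
```

The appeal for neural decoding is that the reservoir's fading memory captures temporal structure in spike trains while only a single linear layer needs fitting to behavior.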

Farshchi described a sensor network he helped create to process and wirelessly transmit brain signals from research animals. It uses a Chipcon CC1000 916MHz radio and an MCU and flash chip from Atmel.

The UCLA device consumes about 100mW, significantly less than most digital designs but more than competing analog approaches.

'Holy grail'
"The Holy Grail is to use this as a sensor mote, but until we lower power consumption, we can't do that," Farshchi said. The design currently requires an AA battery, but the group wants a component that can be run from a button cell battery.
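Rough arithmetic shows why the AA-to-button-cell step requires the power cut Farshchi describes. The battery figures below are typical datasheet values, not numbers from the article:

```python
# Runtime estimates at the ~100mW draw cited for the UCLA device.
def runtime_hours(capacity_mah, voltage_v, draw_mw):
    return capacity_mah * voltage_v / draw_mw   # mAh * V = mWh; mWh / mW = h

aa_hours = runtime_hours(2500, 1.5, 100)        # alkaline AA: ~2500mAh at 1.5V
coin_hours = runtime_hours(225, 3.0, 100)       # CR2032 coin cell: ~225mAh at 3V
print(f"AA: ~{aa_hours:.0f}h, coin cell: ~{coin_hours:.1f}h")
```

Even ignoring runtime, a coin cell cannot source 100mW continuously without heavy voltage sag, so moving to a button cell means cutting the draw by an order of magnitude or more, not just shrinking the package.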

The fundamental issue of what information neural systems need to analyze is still a matter of debate. Some researchers called for processing of sensor feedback from an artificial limb to help determine its actions.

Carrozza said today's researchers lack sufficient data on the sense of touch, which defies quantification. Today's sensors provide only force data, she said.

The debate over data sources underscores a fundamental paradox in neural engineering, said Sanchez. He noted the field consists of both neural scientists taking a bottom-up approach to gathering data and computer scientists taking a top-down approach to finding data abstractions that help them design useful systems.

The two groups look at similar problems from different perspectives that vary from the level of molecules, synapses and neurons to nerve clusters, networks and mappings.

"The challenge is to find a way to unify these various levels of abstraction," Sanchez said.

- Rick Merritt
EE Times



