EE Times-Asia

Smart toys revel in mobility, speech and control

Posted: 01 Feb 2001

Keywords: ASIC, robotics, interactive device, speech recognition, multimedia

The future of smart toys may be glimpsed in a robotic animal that is billed as "autonomous, sensitive to its environment, able to learn and mature." The latest version of the Sony Aibo, the ERS-210, represents an outrageous use of analog and mixed-signal technology. The little machine has a built-in stereo microphone and will learn and respond to about 50 voice commands. A stereo amplifier and speaker allow it to produce tonal sounds. Sensors on its head, chin and back simulate a sense of touch, while a camera in its nose replicates sight.

Some 20 motorized joints on the dog realistically mimic walking, sitting, lying down, flapping its ears and wagging its tail. Its eyes are blinking LEDs and their expression is described as "soulful." The robot's simulated emotions include anger, fear, surprise, dislike, sadness and joy. A $90 software package allows the robot puppy to "learn," that is, to associate these emotional responses with certain objects, sounds or user responses.

The $2,500 Sony Aibo is undoubtedly the ultimate plaything, a rich man's indulgence. But smart toys (including Furbies, Talking Barbies, radio-controlled blimps and racing cars, pocket games and video consoles) are growing up and morphing into a whole host of consumer electronic entertainment devices, boxes and consoles.

Apart from smart processors and CMOS ASIC integration, which shrinks the size and cost of the electronics, essentially three additions can be made to enhance the value of smart toys. One is speech; another is remote control; and a third is movement.

Designers who focus on speech will tell you that the attention span of any particular child for any particular toy is, unfortunately, limited. Even in these days of fading affluence, a parent doesn't want to spend $70 or $100 for a talking toy that is used only once. But the more realistic the speech capability, the more valuable the toy will be to its user, and the longer it will last as a plaything.

Speech in Toyland nonetheless represents a paradox, according to Larry Gaddy, marketing manager for Winbond Electronics Corp. The big issue in a command-and-control applications environment (even a "you-talk/it-talks-back" environment) is not the processing power of the microprocessor or DSP used to decode speech, but the amount of memory. More memory means more realistic speech, but it also elevates the cost of the electronics. "For speech in toys, the cost issue is always brutal," said Gaddy.
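
The memory-cost squeeze can be made concrete with back-of-the-envelope arithmetic. The ROM size, sample rate and coding figures below are generic illustrations, not Winbond specifications:

```python
# Rough illustration of why memory dominates the cost of toy speech.
# All figures are generic examples, not any vendor's actual specifications.

def rom_seconds(rom_bits, sample_rate_hz, bits_per_sample):
    """Seconds of stored speech that fit in a ROM of the given size."""
    return rom_bits / (sample_rate_hz * bits_per_sample)

ROM = 4 * 1024 * 1024  # a hypothetical 4 Mbit speech ROM

# Raw 8-bit PCM at 8 kHz costs 64 kbit per second of speech.
pcm = rom_seconds(ROM, 8000, 8)
# 4-bit ADPCM halves the storage for similar perceived quality.
adpcm = rom_seconds(ROM, 8000, 4)

print(f"PCM:   {pcm:.1f} s of speech")    # ~65.5 s
print(f"ADPCM: {adpcm:.1f} s of speech")  # ~131.1 s
```

Doubling the playable speech thus means either doubling the ROM, and with it the silicon cost, or accepting a lossier coding scheme.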

Gaddy believes the key to conserving expensive memory is to build recognizers that rely on acoustic models, not word models. Word-based recognition, under the best of circumstances, is only about 95 percent accurate; acoustic models, particularly for digits, are 99 percent accurate, he said. Recognizing the speech of young children, whose pronunciations can be as challenging as they are cute, remains a problem that may limit the accuracy of these smart toys.
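
The memory argument for acoustic models is easy to sketch: a handful of shared phoneme models can cover an entire digit vocabulary, whereas word models must be stored one per word. The pronunciations below are hypothetical, and a real recognizer would score audio against statistical phoneme models rather than match symbols exactly:

```python
# Toy sketch of why acoustic (phoneme) models conserve memory compared with
# whole-word templates: ten digit words share one small phoneme inventory,
# so only the phoneme models need to be stored. Pronunciations are invented.

DIGIT_PHONEMES = {
    "zero":  ["z", "ih", "r", "ow"],
    "one":   ["w", "ah", "n"],
    "two":   ["t", "uw"],
    "three": ["th", "r", "iy"],
    "four":  ["f", "ao", "r"],
    "five":  ["f", "ay", "v"],
    "six":   ["s", "ih", "k", "s"],
    "seven": ["s", "eh", "v", "ah", "n"],
    "eight": ["ey", "t"],
    "nine":  ["n", "ay", "n"],
}

# The shared inventory is what an acoustic-model recognizer actually stores.
inventory = sorted({p for seq in DIGIT_PHONEMES.values() for p in seq})

def recognize(phoneme_seq):
    """Map a decoded phoneme sequence back to a digit word (exact match)."""
    for word, seq in DIGIT_PHONEMES.items():
        if seq == phoneme_seq:
            return word
    return None

print(len(inventory), "phoneme models cover", len(DIGIT_PHONEMES), "words")
print(recognize(["f", "ay", "v"]))  # five
```

Adding an eleventh word to a word-model recognizer means storing a whole new template; here it mostly reuses phoneme models that are already in ROM.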

"Bio-toys," anyone?

Iguana Robotics Inc. has a particular slant on the kind of computation required to power a robot, whether it is an extension of a NASA space shuttle arm or a kid's toy. A moving device must effectively "see" where it is going, and this process of relating what it sees to how it moves is essentially a biological relationship. The devices the company has created, for commercial exploitation and for university research, emphasize integrated "brain-style" object recognition.

The chips use embedded biomorphic technology (EBT) algorithms derived from human biology but used to control robotic devices. Embedding these algorithms on-chip provides a more streamlined method of control than the lookup-table sequences used by microprocessors in their instruction-fetch/instruction-execute mode of operation.

Two chips serve as test pieces for EBT. One is an intelligent visual sensor that incorporates mixed-signal "computation on read-out" (COR) circuitry to analyze pixel data from a 128-by-64 array. The color-processing circuits construct a histogram (effectively sizing these images) at a very high frame rate, performing an intelligent, albeit fuzzy, object recognition at very high speed.
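
The idea behind fuzzy, histogram-based recognition can be sketched in a few lines. The bin count, reference objects and pixel values below are invented for illustration; on a COR sensor this would happen in mixed-signal circuitry during readout rather than in software:

```python
# Minimal sketch of histogram-based "fuzzy" object recognition, the kind of
# computation a COR-style sensor performs at readout. References and the
# "scene" pixels here are invented 8-bit intensity values.

def histogram(pixels, bins=4):
    """Bin 8-bit intensity values into a normalized histogram."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels)
    return [h / total for h in hist]

def intersection(h1, h2):
    """Histogram intersection: 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Reference objects stored only as histograms: a tiny memory footprint.
references = {
    "dark_ball":  histogram([20, 30, 40, 50, 60, 200]),
    "bright_toy": histogram([200, 210, 220, 230, 90, 100]),
}

scene = histogram([25, 35, 45, 55, 65, 210])
best = max(references, key=lambda k: intersection(references[k], scene))
print(best)  # dark_ball
```

Because only a few histogram bins per object are compared, not whole images, the match is coarse but extremely fast, which is the point of doing it at the pixel readout.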

A second chip, developed by Iguana with Johns Hopkins University, is a locomotion controller chip, capable of controlling the movement of a four-legged robotic toy animal. The current chip can generate the basic movement pattern for a small running biped robot.
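
Biological locomotion controllers of this kind are usually modeled as central pattern generators: coupled oscillators that settle into a stable rhythm without any stored gait table. The two-oscillator sketch below is a generic illustration of that principle, not the Iguana/Johns Hopkins design; the frequency and coupling constants are arbitrary:

```python
# Sketch of a central-pattern-generator (CPG) controller of the kind a
# locomotion chip emulates: two coupled phase oscillators drive a biped's
# legs and settle into anti-phase stepping. Parameters are illustrative.

import math

def cpg_step(phases, dt=0.02, freq=1.0, coupling=2.0):
    """Advance two oscillators, coupled to stay half a cycle apart."""
    p0, p1 = phases
    dp0 = 2 * math.pi * freq + coupling * math.sin(p1 - p0 - math.pi)
    dp1 = 2 * math.pi * freq + coupling * math.sin(p0 - p1 - math.pi)
    return (p0 + dp0 * dt, p1 + dp1 * dt)

phases = (0.0, 0.5)            # start well away from anti-phase
for _ in range(2000):
    phases = cpg_step(phases)

# The joint command for each leg is simply the sine of its phase; the
# phase offset between the legs converges to pi (anti-phase stepping).
offset = (phases[1] - phases[0]) % (2 * math.pi)
print(f"phase offset = {offset:.2f} rad")  # settles near pi
```

The rhythm is a stable attractor of the dynamics, so a perturbation to either leg decays automatically, with no instruction-fetch/instruction-execute loop consulting a lookup table.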

A number of the breakthroughs in motion control, image processing and speech recognition depend on digital signal processing. The mass-storage industry relies on DSPs to control hard-disk-drive read-head positioners and thus increase drive density. DSPs also decode multimedia streams for digital versatile disk (DVD) players. And they have powered kids' toys such as the Speak & Spell and the Julie doll.

- Stephan Ohr

EE Times
