EE Times-Asia > EDA/IP

When GIGO meets analog

Posted: 01 May 2007

Keywords: garbage in garbage out, analog, GIGO concept, instrumentation-grade system

Not too long ago, the acronym GIGO (garbage in, garbage out) was fairly commonplace. It neatly summarized that no matter how powerful your computer, and no matter how good your algorithms, the output would only be as good as the input data. If your data analysis calculated the current through the sensing resistor in your MP3 player to be 47.1235 A based on your test readings, you have two problems: excess, meaningless precision and a badly misplaced decimal point.
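A quick sanity check makes the point concrete. In this sketch (all component values and tolerances are assumed for illustration, not taken from any real MP3 player), propagating the tolerances of the sense resistor and the ADC shows how few digits of the computed current are actually meaningful:

```python
# Hypothetical sense-resistor current check (values assumed): propagate the
# component tolerances to see how many digits of the result mean anything.
r_sense = 0.100   # ohms, assumed +/-1% resistor tolerance
r_tol = 0.01
v_meas = 0.0185   # volts across the resistor, assumed +/-0.5% ADC accuracy
v_tol = 0.005

i = v_meas / r_sense       # nominal current, in amps
i_tol = v_tol + r_tol      # worst-case relative error for a quotient
i_err = i * i_tol          # absolute uncertainty, in amps

print(f"I = {i:.4f} A +/- {i_err:.4f} A")
# Roughly 0.185 A give or take about 0.003 A: three meaningful digits at best,
# and nowhere near 47 A. Quoting 47.1235 A from such a chain is GIGO twice over.
```

With a worst-case uncertainty near 1.5 percent, anything past the third significant figure is noise, before you even consider the misplaced decimal point.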

The GIGO concept seemed to fade a few years ago. After all, processors were getting very powerful, and data collection and analysis software were improving and including better bounds checking. Thus, many problems were trapped and flagged early in the analysis cycle.

The trend of using faster processors and advanced software to overcome inherent system defects also spread to basic analog-channel design. It didn't matter that the basic analog components (voltage reference, amplifier and ADC) were not perfect. Just take that pretty-good-but-not-great reference, calibrate it at the factory across various temperature and supply-rail values, then store the correction values in memory. The broad availability of flash memory made this approach all too easy. The theory was that you could use a lower-cost, moderate-performance analog part plus the virtually free memory for a net savings.
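The digital-correction scheme described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual firmware: the calibration points, error values and function names are all assumed. The factory characterizes the reference at a handful of temperatures, stores the errors in nonvolatile memory, and the firmware interpolates a correction at run time:

```python
# Minimal sketch of factory digital correction (all values illustrative):
# stored characterization data plus run-time linear interpolation.
from bisect import bisect_left

# Factory characterization: (temperature_C, measured_ref_error_V) pairs
# that would be burned into flash alongside the firmware.
CAL_TABLE = [(-40, +0.0031), (0, +0.0012), (25, 0.0), (85, -0.0027)]

def correction(temp_c):
    """Linearly interpolate the stored reference error at temp_c."""
    temps = [t for t, _ in CAL_TABLE]
    if temp_c <= temps[0]:
        return CAL_TABLE[0][1]          # clamp below the characterized range
    if temp_c >= temps[-1]:
        return CAL_TABLE[-1][1]         # clamp above the characterized range
    idx = bisect_left(temps, temp_c)
    (t0, e0), (t1, e1) = CAL_TABLE[idx - 1], CAL_TABLE[idx]
    frac = (temp_c - t0) / (t1 - t0)
    return e0 + frac * (e1 - e0)

def corrected_reading(raw_volts, temp_c):
    """Subtract the interpolated reference error from a raw ADC-derived value."""
    return raw_volts - correction(temp_c)
```

Note what the sketch quietly assumes: that temperature is the only error axis that matters, and that the part's drift between the stored points is linear. Those assumptions are exactly where this approach starts to leak.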

In the past few years, though, I have seen a trend away from this digital-correction approach for several reasons, despite its apparent attractiveness. First, it takes production time and specialized test equipment. Second, it marries the calibration of a unique analog component to the product, until "death do you part". Unfortunately, such deaths do happen, requiring a factory- or depot-level recalibration.

But the other reasons to avoid this approach are both tangible and philosophical. First, analog vendors have pushed the price and performance factors even harder, and come out with significantly better components. In the past year, I have seen an uptick in the number of low-power, low-cost, higher-performance op amps, instrumentation amps, voltage references and converters from most of the top analog vendors.

At the same time, designers realized that the digitally calibrated approach does not make sense for mainstream products. It may be the right thing to do for a calibration lab or an instrumentation-grade system (for example, to suppress second- and third-order errors due to drift, aging and other ills). But it is a production time- and compute-intensive solution that assumes you can clearly anticipate all the dimensions of future error sources (temperature, humidity, supplies, signal noise, vibration, stress and many other factors), and calibrate against them in advance. Worse, it often doesn't help when the problem is internal component noise or external system noise.
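A back-of-envelope count shows why anticipating every error dimension gets expensive fast. The axes and point counts here are assumed purely for illustration, but the multiplication is the point: each new dimension multiplies the factory test matrix.

```python
# Illustrative calibration-matrix count (numbers assumed): the factory
# measurement points multiply across every error dimension you calibrate.
points_per_axis = {"temperature": 7, "supply": 3, "humidity": 3, "vibration": 2}

total = 1
for axis, n in points_per_axis.items():
    total *= n

print(total)  # 126 factory measurements per unit, before adding any new axis
```

And all 126 of those points buy you nothing against random noise, which no stored correction table can remove.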

Instead, engineers now see that it's generally wiser to choose better components, so that system accuracy and overall performance are assured by design, analysis and guaranteed specifications, rather than trying to bypass reality by applying more external correction factors. GIGO is still a very valid summary of the real-world signal-processing chain.

- Bill Schweber
Site Editor, Planet Analog



