Identifying the most favored embedded tool
I vote for the oscilloscope. It's the universal electronic engineering tool, beloved by hardware and software people. It's also the most underused tool in the firmware realm.
A recent Embedded Market Study conducted by TechInsights showed that just 37 percent of respondents rated the scope as their "favorite/most important hardware/software tool." Number one, at 55 percent, was the compiler and assembler.
"Favorite" and "most important" probably don't go well together. The compiler is certainly one of the most important tools we use since it's impossible to build any bit of firmware without it. But it's hardly my favorite. In fact, the ideal compiler should be non-intrusive and do its quite boring work well that we should barely notice it's there. It's like a bilge pump that silently and constantly does a hugely important job but is unnoticed and under-appreciated.
[Figure: Which of the following are your favorite/most important software/hardware tools?]
Why did the scope fare so poorly? According to Electronics.ca, the global market for scopes will be about $1.25 billion in 2010. That's pretty small compared with many categories of electronics products (microprocessors, for instance) but huge for the embedded systems tools market.
Transformation
Various studies peg the entire embedded tool market at around $3 billion, a number I just don't believe. Often analysts inscrutably lump RTOS sales in their tool figures, which makes little sense. But summing up sales of the biggest players in the industry, with real-time operating systems and all, gives a number of well under $1 billion. So $1.25 billion in oscilloscope sales overwhelms the entire size of the embedded tool market. One would think a large percentage of those scope sales would be to us, the embedded hardware and software designers.
This industry went through a radical tools transformation over the last 15 years. In the 1980s and 1990s, most of us used in-circuit emulators (ICEs). These tools let us debug in the procedural domain (single-stepping, examining variables, and so forth) as well as in the time domain. The ICE supported the latter via real-time trace, performance analysis, timers and other features essential to managing microseconds.
But processor speeds increased to rates that made it impractical to bring signals to an ICE pod. Surface mount technology shrank packages to sizes that couldn't be probed. And sophisticated on-chip features like caches, pipelines and MMUs removed any correlation between what the CPU was doing and the signals on the pins. The ICE market all but disappeared, though a few companies, for instance Lauterbach and Signum, continue to provide such tools.
BDM and JTAG debuggers replaced the ICE in most development shops. Cheap and easy to set up, these devices used logic on the target CPU to move debugging data to a PC. They worked regardless of target clock rate and used a simple dedicated connector, which ameliorated all of the surface mount issues.
But BDM and JTAG debuggers provided nothing for handling the time domain. They gave us a Visual Studio-like interface. Scopes, therefore, became much more important. Twiddle a bit in the code and the scope will instantly show execution time, latency and pretty much anything else you need to see.
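The bit-twiddling trick above is simple to sketch. The register names and addresses below are assumptions (they vary by microcontroller); the idea is just to raise a spare pin at the entry of a routine and drop it at the exit, so the pulse width on the scope is the execution time. Plain variables stand in for the memory-mapped set/clear registers so the sketch compiles anywhere:

```c
#include <stdint.h>

/* On real hardware these would be memory-mapped GPIO set/clear registers;
   here they are plain variables so the sketch compiles on any host.
   Names and the pin assignment are illustrative, not from any one MCU. */
static volatile uint32_t gpio_set, gpio_clr;
#define DEBUG_PIN (1u << 5)      /* spare pin wired to the scope probe */

static void filter_update(void)
{
    gpio_set = DEBUG_PIN;        /* rising edge marks routine entry */
    /* ... the time-critical work being measured goes here ... */
    gpio_clr = DEBUG_PIN;        /* falling edge marks routine exit */
}
```

Trigger the scope on the rising edge and the pulse width reads out execution time directly; with infinite persistence the display also reveals jitter and worst-case latency.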
The good news is that the BDM/JTAG tools have improved. More vendors are throwing transistors at the debugging problem, adding all sorts of wonderful on-chip resources like hardware breakpoints, trace and more. Real-time debugging has returned.
Ironically, people involved with designing CPUs tell me they are being squeezed by management to remove as many of these capabilities as possible. The boss wants to reduce transistor counts, or devote them to cranking out more performance, or to add additional peripherals. Just as in Jules Verne's From the Earth to the Moon, in which there was an essential tension between armor makers and cannon builders, in the embedded world there's a never-ending tension between debugging resources and reducing the cost of goods. Alas, few follow the implications of that debate to the logical conclusion: cruddier debuggers lead inescapably to higher engineering costs. Higher engineering costs drive the sell-price up, since those NRE dollars must be amortized over the number of units shipped. At least, if one wishes to make a profit over the long term. But perhaps long-term thinking is merely a quaint notion that has grown obsolete.
I suspect this tension between chipmakers and developers is part of the fabric of the marketplace and will never go away. That's a shame, but it's also an opportunity for other types of clever add-on tools. For instance, augment your scope with Micrium's µC/Probe, one of the most interesting tools for developers of real-time systems that I have seen in some time. It links a bit of communications code in your embedded system to a PC-hosted app that is aware of your ELF or similar debugging file. Drag a handful of widgets onto the screen (such as meters and spreadsheets) and link them to variables in your code. µC/Probe periodically samples these variables and updates the screen. It's pretty cool to see a virtual analog gauge display, say, system idle time.
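The target-side half of such a tool can be sketched in a few lines. This is only an illustration of the general idea (a table of watched addresses, snapshotted periodically and shipped to the host); the names, framing and table format are my assumptions, not Micrium's actual protocol:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Variables under observation. In a real system these would live
   elsewhere in the application; the host finds their addresses and
   sizes by reading the ELF file's symbol table. */
volatile uint32_t idle_ticks;
volatile uint16_t adc_reading;

typedef struct {
    const volatile void *addr;   /* address of the watched variable */
    uint8_t              size;   /* 1, 2 or 4 bytes */
} probe_slot_t;

static const probe_slot_t slots[] = {
    { &idle_ticks,  sizeof idle_ticks  },
    { &adc_reading, sizeof adc_reading },
};

/* Called from a periodic timer tick: snapshot every watched variable
   into 'out' and return the byte count for the comms driver to send. */
size_t probe_sample(uint8_t *out, size_t max)
{
    size_t n = 0;
    for (size_t i = 0; i < sizeof slots / sizeof slots[0]; i++) {
        if (n + slots[i].size > max)
            break;
        memcpy(&out[n], (const void *)slots[i].addr, slots[i].size);
        n += slots[i].size;
    }
    return n;
}
```

The heavy lifting (symbol lookup, widget rendering, protocol framing) stays on the PC, which is why the target-side footprint of this class of tool can be so small.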
Looking at the study's results on debugging tools in more depth, we see 53 percent of us call the debugger our favorite/most important tool. Presumably "the debugger" is the software application that we use to drive the hardware tools. I agree that this is a hugely important tool. Most projects consume 50 percent of the development effort in debug and test, so we're slaving away in front of this application for months or years on end. Important? You betcha. Favorite? I sure hope so, and hope it is so well designed that it seamlessly enables our work and doesn't hinder our efforts.