Site editor Bernard Cole tells you about things embedded.

Posted: 06:18:32 PM, 15/01/2014

The 8051: ARM's nemesis in the IoT space?


At present, 8- and 16bit microcontrollers dominate the market for wirelessly connected, untethered embedded devices and sensors (i.e., the Internet of Things), and I am not convinced that 32bit MCUs, no matter how cheap they become, will ever match the ubiquity of their smaller brethren in such designs.

In a recent blog, Jack Ganssle reviews the reasons why 32bit CPU (and ARM) dominance is likely but not certain, and warns that although ARM Ltd. is in the top-dog position at this time, it may not always be so.

He outlines a number of reasons why ARM could be blindsided by some unforeseen nemesis, probably an open source microcontroller architecture. One reason, he suggests, is that ARM collects a royalty on each part sold; even if all other costs were zeroed out, that fee means ARM-based devices can't compete with 8- and 16bitters in the most price-sensitive applications.

I agree with him on this, and I think the challenge to ARM's dominance is most likely to occur in the context of the design challenges for adding Internet Protocol to IoT devices.

8051: IoT's dark horse ARM nemesis?
I suggest that ARM's nemesis in the IoT space could be the 8051 microcontroller, which has been "hiding in plain sight" for at least five years. Although Intel discontinued it in 2007, other companies have kept it alive in various incarnations, and it is now widely supported.

In the Internet of Things segment, low power requirements are forcing ARM and its 32bit MCU licensees to try out things such as near-threshold-voltage Cortex-M0 MCUs to allow 32bitters to compete with existing 8- and 16bit alternatives. Intel, with its new Atom-based X1000 Quark SoC, seems embarked on a similar strategy.

Frankly, though, these efforts remind me of American auto makers in the 1970s when the various Arab oil crises hit. Fuel-efficient autos from Japan and Europe were stealing the market, and all U.S. auto firms could do was try to re-engineer their behemoths. They got more fuel efficient, but only at the cost of performance.

But for an alternative CPU to challenge ARM's dominance, I believe it must meet several conditions:
* As Jack emphasizes, it must not be saddled with the licensing fees that ARM imposes,
* it must be well-known and used widely,
* it must already have a base of developers who at some stage became familiar with the architecture,
* if not free, it must be broadly available from a number of suppliers in chip form,
* its features and capabilities must be continually upgraded, and,
* there should be a wealth of hardware and software development tools available.

The 8051 meets all these conditions.

The first argument raised against my thesis is not against the 8051 alone, but against all 8- and 16bit MCUs in the IP-driven Internet of Things: full support of the IPv6 protocol, the argument goes, requires a 32bit architecture, especially if each of the 'things' has its own unique IPv6 address.

Not true. An article written in 2003, shortly after IPv6 became available ("IPv6 on a microcontroller," by Robert Muchsel), illustrated that it was possible to support IPv6 on an 8bit MCU. Adam Dunkels, in a later article, "Towards TCP/IP for wireless networks," put more nails in the coffin of the idea that only 32bit CPUs can handle the requirements of a full IP-enabled "Internet of Things" implementation.
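To make the point concrete: an IPv6 address is only 16 bytes of state, and a link-local address can be derived on-device from the MAC address with a handful of byte operations, well within an 8bit MCU's reach (and this derivable redundancy is exactly what 6LoWPAN later compresses away). Here is a minimal sketch in desktop Python; the function name and the example MAC value are my own, not from either article.

```python
import ipaddress

def link_local_from_mac(mac: bytes) -> str:
    """Derive an IPv6 link-local address from a 48bit MAC using the
    EUI-64 rule: flip the universal/local bit of the first byte,
    insert FF:FE in the middle, and prefix with fe80::/64."""
    assert len(mac) == 6
    eui64 = bytes([mac[0] ^ 0x02]) + mac[1:3] + b"\xff\xfe" + mac[3:6]
    addr = b"\xfe\x80" + b"\x00" * 6 + eui64   # fe80::/64 prefix + interface ID
    return str(ipaddress.IPv6Address(addr))

# link_local_from_mac(bytes.fromhex("001122334455"))
# -> 'fe80::211:22ff:fe33:4455'
```

The whole computation is a XOR and a few byte copies; nothing about it needs a 32bit register file.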

For some IoT applications, full support of the TCP/IP stack is not necessary and often gets in the way when deterministic responses are required. For those, UDP, the IP suite's lightweight connectionless transport, requires far fewer resources and does not interfere with the MCU's real-time duties. Indeed, if you look closely at some implementations of the 6LoWPAN adaptation added to IPv6 in 2007, they rely exclusively on UDP.
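As an illustration of how little machinery a UDP exchange involves (no handshake, no connection state, no retransmission bookkeeping: one datagram out, one in), here is a loopback sketch in desktop Python. A constrained 8051-class stack does the same thing directly against raw IP headers; the function name and payload here are my own.

```python
import socket

def udp_roundtrip(payload: bytes) -> bytes:
    """Send one datagram over loopback and read it back: the entire
    exchange is a single fire-and-forget packet, with no TCP handshake
    or stream state to maintain."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))                 # ephemeral port on loopback
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.sendto(payload, rx.getsockname())      # one packet, no connection setup
    data, _sender = rx.recvfrom(1024)
    tx.close()
    rx.close()
    return data

# udp_roundtrip(b"temp=21.5") -> b"temp=21.5"
```

Contrast this with TCP, where even a one-byte exchange costs a three-way handshake, sequence-number tracking, and a teardown: state an 8bit sensor node would rather not carry.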

A second argument against my hypothesis, which I hear when I bring it up in conversations with developers, is that the 8051 is an OLD architecture. In its original form it lacks instruction set and hardware features that more modern MCUs have: not only 32bitters, but more recent 8- and 16bit devices from Microchip, Texas Instruments, Freescale, ST Micro, and several Japanese offerings as well.

One thing that makes some embedded developers uncomfortable is that it uses the old-style complex instruction set (CISC) architecture, diametrically opposed to the reduced instruction set (RISC) approach used in most recent MCUs. Also, the 8051 was designed with a strict Harvard architecture, so it can only execute code fetched from program memory.

Nor does it have any instruction to write to program memory, so it is unable to download and directly execute new programs. But in the Internet of Things, where security will be paramount, this limitation and the strict Harvard architecture have the advantage of making designs immune to most forms of malware.
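The security argument can be seen in a toy model (an illustrative sketch of my own, not 8051 code): with instructions and data in separate memories and no store-to-code instruction, anything written into data RAM at run time, however attacker-controlled, can never be fetched and executed as an instruction.

```python
class HarvardToy:
    """Toy Harvard machine: instructions live in a separate, read-only
    code memory; STORE can only touch data RAM, so injected bytes in
    data memory are never executed."""

    def __init__(self, program):
        self._code = tuple(program)  # fixed at "programming time", like 8051 ROM/flash
        self.data = {}               # ordinary read/write data RAM
        self.acc = 0

    def run(self):
        for op, arg in self._code:   # instruction fetches come only from code memory
            if op == "LOAD":
                self.acc = arg
            elif op == "ADD":
                self.acc += self.data.get(arg, 0)
            elif op == "STORE":
                self.data[arg] = self.acc   # writes land in data RAM only
            else:
                raise ValueError(f"unknown opcode {op!r}")
        return self.acc

m = HarvardToy([("LOAD", 2), ("STORE", "x"), ("ADD", "x")])
# m.run() -> 4; stuffing opcode-like tuples into m.data changes nothing executed
```

On a von Neumann machine, by contrast, code and data share one address space, which is precisely what buffer-overflow code-injection attacks exploit.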

But despite its "limitations," the 8051 is alive and well. Current vendors of MCS-51-compatible processors include more than 20 independent manufacturers. Several derivatives integrate a digital signal processor (DSP), and several companies offer MCS-51 derivatives as IP cores for use in FPGA and ASIC designs.

Even if the 8051 has features that most embedded developers would disdain, that is not necessarily a barrier to its acceptance, if the recent history of Linux in embedded development can be taken as a guide. When it was originally introduced, Linux was judged by most embedded developers as totally inappropriate for microcontroller applications that require minimal memory footprint and real time, deterministic interrupt response times.

But as time went on, real-time enhancements were made and, even more important, as noted in "The work of Linus Torvalds," Linux was (and continues to be) adopted by universities and technical schools around the world as the platform they use to introduce engineering and computer science students to the principles of operating system design and use.

The reason: as an open source OS without commercial licensing fees, it could be used at no cost in courses. For that reason, embedded developers who learned about operating systems with Linux try to make it fit their designs before they consider a proprietary OS.

The same is true of the 8051, which in its original form carries none of the licence fees associated with the ARM architecture, or with more than one of ARM's more recent 8- and 16bit MCU competitors. Do a search with Google's advanced search and you will find that, as with Linux, in courses for engineering students on microcontroller basics, the 8051 is the platform of choice by at least a 3-to-1 ratio (my own estimate) compared to other 8- and 16bit MCUs. And when you compare its use as an MCU teaching platform in universities with that of the ARM 32bit MCUs, the ratio is even more lopsided.

Another argument is that there is no clear migration path from the 8051 to the new ARM and MIPS 32bit architectures. Not so. Many of the 8051 suppliers are also 32bit ARM (or MIPS) MCU licensees. In addition to migration paths from their own proprietary 8- and 16bit MCUs to the ARM, they have created paths for the 8051.

And don't forget the MCS-251 upgrade introduced by Intel before it abandoned the architecture, which is also supported and enhanced by a number of other chip vendors. It is a multipurpose 8/16/32bit MCU architecture with a 16MB (24bit) address space, and in its original form had an instruction cycle about six times faster than the 8051's. It can operate as an 8bit 8051 or as a 32bit ALU with 8/16/32bit-wide data instructions.

As to tools, there again the 8051, both in its original form and in its more recent enhanced forms, seems to have a wealth of choices available from the chip vendors who continue to support it. There are several compilers and other tools available from major software tool suppliers such as IAR Systems, Keil, and Altium Tasking, who continuously release updates.

Finally, what gives me the feeling that the various M2M and wireless applications could be where the 8051 re-emerges as a major player, and as a candidate for Ganssle's ARM nemesis, is the increasing number of articles on these topics, presented at professional technical conferences and published in various journals, in which the 8051 is the basic building block.

In various web surveys I have done of wireless sensor, machine to machine and IoT design activity, the 8051 ranks right up there with designs based on an ARM CPU or one of the several other 8-/16bit MCUs. Some of these articles and papers are from the same universities and technical institutes at which the 8051 is used by introductory engineering classes on microcontroller architectures.

Despite my arguments so far, I have to admit that the 8051, even in its MCS-251 enhanced configuration, is a long shot to be competitive with either the ARM 32bit Cortex-M0 or any of the various more recent 8- and 16bit MCUs in the IoT space. But given the embedded industry's experience with Linux, it is not something that should be ruled out.

About the only scenario that is more of a long shot is ARM Ltd. developing a 16bit version of its Cortex-M0 MCU family based on the very CISC-like Thumb and Thumb-2 instruction set subsets it introduced to make its 32bitters more attractive to 16bit MCU developers.

On the face of it, this would be an elegantly simple solution. But who would be interested in licensing it from ARM? Certainly not any of the many ARM licensees who also have their own 8- and 16bit MCU families, the 8051, or both.


Label: ARM 8051 MCU
[Last update: 06:26:14 PM, 15/01/2014]


Visitor 12:24:16 PM, 28/01/2014
I can add a very big screen (from an 8051 point of view), compile a complete OS, and still have enough power left over for anything I would do in the 8051 world.
It's true that we are still teaching the 8051. It's simple: WE HAVE 20 YEARS OF EXPERTISE ON IT.
When I began to use the 8051, my teachers used more primitive cores and processors (Z80, 6500, etc.) and thought the 8051 was a waste of money (the tools were VERY expensive) and time.
Today I see the same arguments made against the ARM core in favour of 8bit cores. But I see differences: ARM tools are almost free, using any language on it is almost free, and a development board costs less than $10!
Please, let us stay young, think ahead and remember what happened when we started to play with silicon. I do not want to grow old with the 8051; I prefer to take the new path and evolve.


Visitor 12:20:52 PM, 28/01/2014
I agree with the bare idea, but even as a developer in the 8051 family for 25 years I need to disagree strongly.
I converted my old projects from 8051 to ARM last year. Why? Chip makers pay a fee for the ARM core, yes, this is absolutely true, but I now see bigger costs in the 8051 derivatives than the ARM fee.
For example, it is by far cheaper to add USB or Ethernet to an ARM than to an 8051. Of course, the simplest devices don't require an ARM, but everybody builds those on PIC, not on 8051! I hate PIC, but it has a good piece of the cake in the simplest circuits.
Programming in C without optimisation is fine on ARM; that is not always true on the 8051, only on devices with lots of memory, and those units cost more.
I don't need to write asm for intensive loops anymore; the ARM has a lot of memory and, more important, a lot of clock cycles to spare.


WJR 03:11:15 AM, 24/01/2014

Back in the good ol' days I used the 8051 in aerospace designs (early '80s) and later that decade in other industries. It's a great processor: easy to work with, very flexible, cheap tools, and not hard to program. I'd love to see it come back. As for IoT, I did automation and controls design for many years. In the "old" days security was not an issue. Today, with so many controls connecting to apps, the cloud, etc., the idea of a more secure architecture that is also inexpensive is very attractive. Heck, if it were to make a comeback, my old skills would become current once again, kind of like my closet full of narrow ties. I'm feeling younger already.



BY 10:15:55 AM, 22/01/2014

Long live 8051 !



Visitor 12:23:49 PM, 21/01/2014

Certainly the 8051 is enjoying strong growth in Asia, where price rules.

The 8051 pretty much dominates the metering market, as well as the sub GHz transceiver market.

The lowest-cost micro+USB peripheral parts are 8bitters.

SiLabs offers both the 8051 (in single-cycle form) and ARM, and has chosen the 8051 for the smallest packages and the lowest-power sub-GHz RF networks.

Nemesis? Not really. The 8051 will never displace all ARMs, but it has certainly already outlasted many ARM cores and vendor variants.

