The stage is set for microcontrollers' big role

Summary: The tiny chips needed for the next generation of internet-enabled devices are finally getting standards, says Reinhard Keil

TOPICS: Tech Industry

A lack of standards in the tiny chips embedded in devices has hampered development and pushed up costs, but that state of affairs is changing, says Reinhard Keil.

Smart metering will soon be commonplace. Our connected homes will have washing machines and microwaves with web interfaces that can be controlled remotely by smartphone, helping to reduce energy bills and carbon footprints. Central to the realisation of this digital life are microcontrollers (MCUs) — tiny low-power processors.

But the MCU market is one of the most fragmented in the electronics industry today, with at least 100 different architectures available from more than 20 silicon vendors.

Increasing demands
Many of these 8-bit and 16-bit architectures were introduced more than 20 years ago. Consequently, most of them will not be capable of satisfying the increasing demands of future applications.

To meet connectivity needs, the industry is now developing 32-bit MCUs, which are equipped with more processing power than 8-bit and 16-bit MCUs, but use considerably less electricity.

Embedded engineers often start from scratch when developing new applications, but this practice can send the software development costs for the end-products through the roof. To keep costs down, improve product quality, enable component sharing across projects and future-proof technology innovations, some sectors use standardised platforms for hardware and software development.

Such standardisation generally achieves wide acceptance since it delivers significant benefits to the user community and reduces costs across the board.

Lack of standards
In the computer industry, before the introduction of the PC, developers had to create programs time and again for similar computing challenges and adapt existing software algorithms to new hardware since no peripheral or interface standards existed.

Yet generic software components that are now commonplace in the PC world are not yet available for MCUs, and the lack of programming standards limits software reuse. Consequently, MCU vendors have to offer free software frameworks for new devices that are then tailored to specific applications. This situation slows down the introduction of new MCU devices and increases development costs significantly.

The PC hardware and software industry is full of standards, but, in the deeply embedded MCU market, the use of various proprietary architectures has prevented the introduction of software standards that would cut costs across the value chain.

Need for standardisation
As consumers' expectations for connected end-products grow, software development for deeply embedded applications is becoming more and more expensive, increasing the need for standardisation.

And there are signs that this standardisation is starting to happen. Many of these next-generation MCUs, set to become the industry standard in the coming years, are based on the low-power and low-cost 32-bit ARM Cortex-M series processors.

Third-party intellectual property of this type helps MCU makers bring down development costs — and using the same architecture for different markets also reduces development overheads significantly.

Consistent software
On top of that, to achieve a consistent software platform and cut costs further, ARM and members of the ARM ecosystem — including Atmel, IAR Systems, Luminary Micro, Micrium, NXP, Segger and STMicroelectronics — have introduced the Cortex Microcontroller Software Interface Standard, or CMSIS, which enables silicon vendors and middleware providers to create software that can be easily integrated.

That collaboration has resulted in a programming interface for Cortex-M processor-based devices, which is easy to learn and use.

There are 23 licensees for the Cortex-M3 and 15 for the Cortex-M0, the smallest and lowest-power MCU core in the market. The widespread adoption of the Cortex-M processor family means that the industry is moving towards a more standardised and future-proofed approach to MCU development.

This shift will facilitate hardware and software development, reduce device costs and power consumption, and drive innovation for the technologies at the heart of the next generation of digital life.

Reinhard Keil is director of microcontroller tools at microprocessor design company ARM.


Talkback

  • Standardization is helpful..

    and a must, so I can't fault them, but do we really need remote-controlled toasters?

    I wonder if ARM will ever enter the desktop market, and if they did, would it be RISC- or CISC-based?
    CA-aba1d
  • Actually...

    ARM came from the desktop market :)

    Acorn
    Risc
    Machine


    They are also finally nibbling at the netbook market... possibly!
    Tezzer-5cae2
  • Oh aye..

    I didn't realise the Acorn 3000 was an ARM chip, like; I thought it was CISC-based. Mind you, back then I was a Commodore chap, but even then, when they brought out the 32-bit machines, they were way stronger than Amiga A1200s once those finally came out.

    Yup, many a fond memory with the Acorns. They were decent machines, just a bit lacklustre on software.

    It would be great to see ARM bring out a beefy multicore chip. =D
    CA-aba1d