The IoT and the return of 8-bit computing

It's time to brush off those 1980s programming skills, if we're to have a cheap and flexible Internet of Things.

The BBC Micro and ZX Spectrum. Images: Public domain/Gamespot
Back in the 1980s we all used 8-bit computers, getting every bit of horsepower out of our BBC Micros, our Apple IIs, and our Commodore 64s. Then came 16-bit, swiftly followed by 32-bit, and today's 64-bit multi-core processors. Compared to a 32KB BBC Micro or a 48KB ZX Spectrum, today's quad-core Core i7s with 16GB of RAM give us a lot more space to write our code.

That's a problem when it comes to the Internet of Things (IoT). We're not going to put a modern PC in every device out there -- they're too expensive and far too power hungry. Even phone hardware will be too expensive; while Microsoft has turned its low-cost Lumias into virtual shields for Arduino devices, at over $40 each they're still too expensive to put everywhere. So what will power the IoT?


For a look at what we're going to need, we'll have to go back in time a few decades, to my teenage years and the birth of the first personal computers -- simple devices with low-power processors and very little memory.

My first code was written on a KIM-1, an early 8-bit single-board computer with just 1KB of memory and 2KB of firmware in EPROM. There was no high-level language, just a hexadecimal calculator keypad and a single-line LED numeric display. You had to write and assemble code by hand, before tapping it in on the keypad (and woe betide you if you made a mistake!).

When we think of the IoT we usually think of the familiar devices around us -- the PCs and smartphones that form the backbone of our digital lives. That's a mistake. The devices that go into our lightbulbs, door locks, and the arrays of sensors we're building are much more like that single-board KIM-1. They're not smart in the way we usually think of smart: they're devices with no operating system, very little processing power, and even less memory.

Back to the future

That means we're going to need to go back in time, and bring back skills developed on old hardware. Code for the IoT will need to be designed to take advantage of low-power, low-cost devices. You won't have a lot of memory, and you probably won't have an operating system -- just firmware.

Working in constrained environments is very different from working with a set of modern development tools. There isn't the memory space to install debuggers or run monitor code. You'll need to be prepared to write, build, and test code by hand. You'll be going back to the future.

CodeBug is a programmable wearable device with a 5x5 red LED display and touch-sensitive inputs, powered by a watch battery. Image: Codebug
That doesn't mean leaving everything we've learned behind. You can get a feel for developing on an IoT platform by exploring some of the educational hardware out there. Take, for example, the CodeBug. Built around an 8-bit PIC microcontroller, it's a simple platform for building basic wearable electronics, mixing sensors with an LED-matrix display. Fire up the online development tools, and a visual programming environment makes it easy to quickly build new firmware and test it on a browser-hosted simulator.

Once it's ready to run, your code is compiled, and you can download it and install it in the CodeBug's 40KB of non-volatile memory -- it runs as soon as the device is reset. There's no messing about with operating systems or networks. It's all as old-school as it gets, although here you're writing code in a browser rather than tapping it in on a set of calculator keys.

As small as it is, CodeBug is still relatively inefficient. The code we put in a lightbulb or a doorbell needs to take full advantage of its PIC processor -- or of even smaller devices. A clockless ARM M0 needs you to go even deeper into the place where hardware and software meet, writing code that takes as little space as possible and, more importantly, as few cycles as possible. That goes double for devices intended to run on batteries, where you want to eke out as much battery life as possible. Those old 8-bit skills also make it easier to extend the life of your hardware, with simple low-bandwidth firmware updates -- if updates are needed at all.
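What does memory-poor, cycle-conscious code look like in practice? One classic 8-bit habit is packing all of a device's boolean state into the bits of a single byte, rather than spending a byte (or more) per flag. The sketch below is illustrative only -- the flag names are hypothetical, invented for a notional door sensor:

```c
#include <stdint.h>

/* Hypothetical device state flags packed into one byte --
 * eight booleans in the space a single padded struct member
 * might otherwise occupy on a bigger machine. */
enum {
    FLAG_DOOR_OPEN   = 1u << 0,
    FLAG_LIGHT_ON    = 1u << 1,
    FLAG_LOW_BATTERY = 1u << 2,
    FLAG_RADIO_BUSY  = 1u << 3,
};

static uint8_t status; /* one byte of RAM holds all device state */

/* Set, clear, and test flags with single AND/OR operations --
 * one instruction each on most 8-bit cores. */
static inline void flag_set(uint8_t f)   { status |= f; }
static inline void flag_clear(uint8_t f) { status &= (uint8_t)~f; }
static inline int  flag_test(uint8_t f)  { return (status & f) != 0; }
```

On a part with only a few dozen bytes of RAM, tricks like this are the difference between firmware that fits and firmware that doesn't.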

Dumbing down the IoT

While there's a lot to be said for devices like the Raspberry Pi and Arduino as IoT prototypes, they're still vastly over-powered for most scenarios. There's no need for even an embedded OS on IoT hardware, as it adds complexity and cost to a device. Instead we should rely on physical hardware for much of what we now do in software. Need a wireless connection? Use a wireless SoC with a built-in TCP/IP stack, so all you have to do is deliver a signal and it will do the rest -- over Wi-Fi or Bluetooth.
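This is how ESP8266-style Wi-Fi modules already work: the module runs the TCP/IP stack, and your firmware just sends it short AT command strings over a serial line. A minimal sketch, assuming a module speaking the common AT command set; here we only build the command strings, since a real firmware would push each one out of a UART:

```c
#include <stdio.h>

/* Build the AT command that joins a Wi-Fi network on an
 * ESP8266-style module. The module's firmware handles DHCP,
 * the TCP/IP stack, and the radio -- we just ask. */
static int at_join(char *buf, size_t len,
                   const char *ssid, const char *pass)
{
    return snprintf(buf, len, "AT+CWJAP=\"%s\",\"%s\"", ssid, pass);
}

/* Build the AT command that opens a TCP connection to a host. */
static int at_connect(char *buf, size_t len,
                      const char *host, unsigned port)
{
    return snprintf(buf, len, "AT+CIPSTART=\"TCP\",\"%s\",%u",
                    host, port);
}
```

A few dozen bytes of string handling replace an entire networking stack on the host microcontroller -- exactly the kind of hardware offload that keeps the device itself dumb and cheap.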

We talk a lot about smart devices, but the key to an effective and affordable IoT world isn't making things smart: it's keeping things dumb. That way we keep the complexity of updating hardware to a minimum, while maximising the life of our devices and keeping costs down. With dumb connected devices we can move the smarts further up the stack, into gateways and on into the cloud.

That's why working with 8-bit devices makes sense. They consume very little power, they have simple memory management schemes, and they're cheap to manufacture. The only problem is regaining the skills needed to program memory-poor, low-power hardware. If you worked on embedded systems in the 1980s and early 90s, maybe it's time to brush off that résumé, as those old skills could be worth a lot more than you think.
