The IoT and the return of 8-bit computing


Modern computing hardware is a problem when it comes to the Internet of Things (IoT). We're not going to put a modern PC in every device out there -- they're too expensive and far too power-hungry. Even phone hardware will be too expensive: while Microsoft turned its low-cost Lumias into virtual shields for Arduino devices, at over $40 they're still too pricey to put everywhere. So what will power the IoT?
For a look at what we're going to need, we'll have to go back in time a few decades, to my teenage years and the birth of the first personal computers -- simple devices with low-power processors and very little memory.
My first code was written on a KIM-1, an early 8-bit single-board computer with just 1KB of memory and 2KB of firmware in EPROM. There was no high-level language, just a hexadecimal calculator keypad and a single-line LED numeric display. You had to write and assemble code by hand, before keying it in on the keypad (and woe betide you if you made a mistake!).
When we think of the IoT we usually think of the familiar devices around us -- the PCs and smartphones that form the backbone of our digital lives. That's a mistake. The devices that go into our lightbulbs, door locks, and the arrays of sensors we're building are much more like that single-board KIM-1. They're not smart as we think of smart: they're devices that don't have operating systems, have very little processing power, and even less memory.
Back to the future
That means we're going to need to go back in time, and bring back skills developed on old hardware. Code for the IoT will need to be designed to take advantage of low-power, low-cost devices. You won't have a lot of memory, and you probably won't have an operating system -- just firmware.
Working in constrained environments is very different from working with a set of modern development tools. There isn't the memory space to install debuggers or run monitor code. You're going to need to be prepared to write, build, and test code by hand. You'll be going back to the future.
Take the CodeBug, a tiny PIC-based board that's programmed in the browser. Once your code is ready to run it's compiled, and you can download it and install it in the CodeBug's 40KB of non-volatile memory -- it runs as soon as the device is reset. There's no messing about with operating systems or with networks. It's all as old-school as it gets, although here you're writing code in a browser rather than tapping it in on a set of calculator keys.
As small as it is, CodeBug is still relatively inefficient. The code we put in a lightbulb or in a doorbell needs to take full advantage of its PIC processor -- or of even smaller devices. A clockless ARM Cortex-M0 needs you to go even deeper into the place where hardware and software meet, writing code that takes as little space as possible and, more importantly, as few cycles as possible. That goes double for devices that are intended to run on batteries, where you want to eke out as much battery life as possible. Using those old 8-bit skills also makes it easier to extend the life of your hardware, with simple low-bandwidth firmware updates -- if updates are needed at all.
Dumbing down the IoT
While there's a lot to be said for devices like the Raspberry Pi and Arduino as prototypes for the IoT, they're still vastly over-powered for most scenarios. There's absolutely no need for even an embedded OS on IoT hardware, as it adds complexity and cost to a device. Instead we should rely on dedicated hardware for much of what we do now in software. Need a wireless connection? Use a wireless SoC with a built-in TCP/IP stack, so all you have to do is deliver a signal, and it will do the rest -- over Wi-Fi or Bluetooth.
We talk a lot about smart devices, but the key to an effective and affordable IoT world isn't making things smart: it's keeping things dumb. That way we keep the complexity of updating devices to a minimum, while maximising the life of our hardware and keeping costs down. With dumb connected devices we can move the smarts further up the stack, into gateways and on into the cloud.
That's why working with 8-bit devices makes sense. They consume very little power, they have simple memory management schemes, and they're cheap to manufacture. The only problem is regaining the skills needed to program memory-poor, low-power hardware. If you worked on embedded systems in the 1980s and early 90s, maybe it's time to brush off that résumé, as those old skills could be worth a lot more than you think.