Building cloud services for a thin-client world

Can the problems users are having with Windows 10's OneDrive changes tell us anything about how we need to build cloud applications?

If you thought Windows 8.1 was as much about the cloud as it was about your PC, then you won't be surprised by the emphasis on cloud services in Windows 10, with the launch of Cortana and a new update process. It's a connected Windows -- one that's diminished when away from Ethernet or wi-fi, and accentuated when you're using it on more than one machine, where settings flow from one setup to another.

But you might be surprised by one big change in the way Windows 10 works with the cloud. There's a new version of the Windows OneDrive sync engine that, as noted by my colleague Mary Branscombe, is missing many of the cloud-centric features of Windows 8.1's OneDrive -- particularly its ability to show cloud-hosted directories and files in your file system, even when you didn't have a local copy.

While a new version of OneDrive that should bring back some of these missing elements is due in one of the many Windows 10 updates to be delivered over the next few months, it's worth considering whether those changes could have been avoided. Certainly there were issues with placeholder files on small devices, where a busy OneDrive could overwhelm their storage. Then there were the differences between two similarly named services that worked in very different ways -- OneDrive for Business building on SharePoint and Groove, and the consumer OneDrive blending features from Windows Live Folders and Live Mesh.

Making changes has certainly given Microsoft the opportunity to merge two very different cloud sync engines, bringing OneDrive and OneDrive for Business together. But right now, the resulting service is a step backwards, dropping not just the ability to see the files and directory structure you stored in the cloud from a client PC, but also the ability to save directly into the cloud store. Those were tools that people had built workflows around, and moving that cheese really did cause them issues.

It didn't affect me, as I'd been working with those tools in a very different way, only using the cloud as a way of managing files that synced from a master file system on my desktop. For me, OneDrive is simply a live copy of my desktop PC's file system, and I just keep a working set of files on laptops and other machines. Outside that limited set of directories, everything is managed on and by my desktop.

Mary works very differently, using only a laptop. Instead of my 3TB disk drive, she's got just 128GB of SSD. That's nowhere near enough to hold more than 14 years of writing and email, let alone the apps needed to deal with it all.

Comparing our workflows made me think: Microsoft developers tend to have high-end desktop machines and hefty laptops. They're not using the services they're building on small devices or systems with limited storage. That means they're likely to have workflows more like mine than Mary's.

So here's a suggestion for anyone building cloud services: why not make the product management team all use low-cost devices with minimal storage? You're going to get a very different view of how you'll use the cloud if you're using an entry-level Surface 3 or a budget Chromebook as your daily driver. Suddenly the cloud becomes the master copy of all your files and all your data.

Inverse Conway required

In the past I've written about the concept of Conway's Law, coined in 1968 by Melvin Conway, which states that "organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations" (often shortened to the more succinct "don't ship your org chart"). But Conway's Law has an inverse: that by designing software to drive specific ways of working, we can change the way a company works.

Implementing an 'Inverse Conway' can be immensely painful, and it's an approach that fails more often than not. But when it succeeds, it can rewrite corporate DNA in a way that no other technique can. Here, however, we need to implement it in order to deliver the type of service that end users want.

There's a huge difference between a developer's desktop PC, with several terabytes of disk storage and a hefty Core i7 processor, and a consumer tablet that's using 16GB of eMMC flash and a low-power Atom chip. Software designed and developed on the former needs to be tailored to the needs of the latter -- and to the use cases that come with the smaller device.

With a desktop PC and plenty of storage, you're far more likely to think of your PC as the master source of all your data. It's where everything lives, and the cloud is just a backup and a sync hub that links to other devices. You don't need to know what's in the cloud, as it's all on your PC. Meanwhile, secondary devices subscribe to the files and folders you want to share with them -- files and folders that can be managed from the master PC.

That model is turned on its head for small devices. With so little local storage, the cloud becomes the repository for all files, with only a subset held on the device itself. That means users need a way of seeing all their files -- even if they don't have them on their PC or tablet. File management becomes as much about working with cloud-hosted content as local files, and cloud files need to have the same visibility as their locally synchronized copies.
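The cloud-as-master model can be sketched in a few lines of code. This is a hypothetical illustration, not how OneDrive is actually implemented: the `CloudFile` record, the `is_local` flag and the helper functions are all invented names. The point it makes is the one above -- a placeholder entry lets a device list every cloud-hosted file while consuming local storage only for the files it has actually synced.

```python
from dataclasses import dataclass


@dataclass
class CloudFile:
    """One entry in the cloud index, which is the source of truth."""
    path: str
    size: int        # size in bytes of the cloud copy
    is_local: bool   # True if a synced copy exists on this device


# A small example catalogue: most files live only in the cloud.
catalogue = [
    CloudFile("Documents/report.docx", 48_000, True),
    CloudFile("Photos/2014/holiday.jpg", 3_200_000, False),
    CloudFile("Archive/old-mail.pst", 900_000_000, False),
]


def list_all(files):
    """Show every file the user owns, flagging cloud-only
    placeholders alongside locally synced copies."""
    return [
        f"{f.path} [{'local' if f.is_local else 'cloud-only'}]"
        for f in files
    ]


def local_footprint(files):
    """Bytes actually consumed on the device; placeholders
    cost effectively nothing."""
    return sum(f.size for f in files if f.is_local)


for line in list_all(catalogue):
    print(line)
print("Local storage used:", local_footprint(catalogue))
```

In this sketch the full directory structure is always visible, so file management works the same whether a file is hydrated locally or not -- which is exactly the property the Windows 8.1 placeholders gave small-storage devices.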

It's a model that's completely different to that encouraged by a large desktop PC. Now that the cloud storage wars mean that a terabyte or so of cloud storage costs virtually nothing, it's also one that's becoming increasingly common.

Cloud development needs to be a variant of the inverse Conway's Law. Developers who look at the world from their PCs need to look at it from the cloud instead. Only then will they be able to deliver the type of services their users actually want -- not the services the developers think they want. It's a way of thinking that's going to be critically important over the next decade.