It's a term we're hearing a lot, as device and OS vendors start to move away from the old metaphors of the desktop computing world, switching to one that's shaped by ubiquitous computing and the power of the artificial intelligences that run in the cloud.
So what is "authentically digital"? It's not just the flat design style we've come to expect from Microsoft, Apple, and Google; it's also the new ways we do things on our devices.
It's certainly a change that's been needed for a long time, possibly since the birth of the modern personal computing model with the early graphical user interfaces at Xerox's Palo Alto Research Center (PARC). Ted Nelson, the man who did much of the early work on hypertext, wrote a damning critique of the way we use computers - noting the corrosive effects of the paper and file metaphors on hypertext. Nelson made an interesting point: by clinging to a set of metaphors, we limit the choices we make.
Treating a PC as a desktop or a tablet as a slate constrains the metaphors around the device. What do we do with a desktop? We put things on it. What do we do with a slate? We take quick notes or we read. The screens around us have powerful processors behind them; they shouldn't be limited by older ways of working and of thinking.
Being authentically digital is about going beyond the desktop, beyond the slate, and understanding the new ways people are using their devices - and how they're being limited by the ways they work.
The part of London where I live is being significantly redeveloped, with 1970s office blocks being demolished or reworked to turn them into massive apartment blocks. It's been fascinating watching the engineers and surveyors work, seeing how they use the technology they have. One of the more important tools in their pockets is the smartphone, and the better the camera, the more use it gets. Instead of taking notes, they're taking photographs and quickly sharing them with co-workers.
It's a workflow that's being shoehorned into existing technologies: photos are being stored in those same old folders, and they're being sent by email. It's useful, if not particularly efficient. What's needed is something that can take those images, use the metadata that's been captured with them (for example their GPS location, the time, and the camera angle) and automatically build and share a report.
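The core of that idea is simple enough to sketch. Here's a minimal illustration in Python, assuming each photo's EXIF metadata (timestamp and GPS coordinates) has already been extracted into a small record; the `Photo` type and `build_report` function are hypothetical, not any shipping API:

```python
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

@dataclass
class Photo:
    path: str
    taken: datetime   # from EXIF DateTimeOriginal
    lat: float        # from EXIF GPSLatitude
    lon: float        # from EXIF GPSLongitude

def build_report(photos):
    """Group photos into a simple text report, one section per site.

    Sites are bucketed by rounding coordinates to three decimal places
    (roughly 100 m); a real tool would reverse-geocode each cluster to
    a street address and attach the images themselves.
    """
    sites = defaultdict(list)
    for p in photos:
        sites[(round(p.lat, 3), round(p.lon, 3))].append(p)
    lines = []
    for (lat, lon), group in sorted(sites.items()):
        group.sort(key=lambda p: p.taken)
        lines.append(f"Site at {lat}, {lon} ({len(group)} photos):")
        for p in group:
            lines.append(f"  {p.taken:%H:%M} {p.path}")
    return "\n".join(lines)
```

The point isn't the code, it's that everything the report needs is already sitting in the photos' metadata, waiting to be used.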
We're starting to see some of the tools that are needed come together. Lumia Storyteller uses GPS data and time to build a narrative around your cameraphone pictures - grouping them by time and place, and letting you explore images in the context of a map. Then there's the Office team's new app, Sway, which lets you collate and curate digital content, turning it into a navigable hypertext document.
Put the two together and you've got a new digital workflow, one that automates the creation of reports from visual information. But that's only the start of what we could do with our devices, and with our content.
We're already starting to use elements of machine learning in our day-to-day lives, with cloud scale AIs adding context to our device interactions. Both Google Now and Microsoft's Cortana are able to use location as a tool for adding context to a query - so what if we could use that context in a digital workflow?
It's easy to imagine a near future hybrid of Storyteller and Sway, where our building site surveyor is photographing work in progress on a building retro-fit. He takes a series of photographs, which are automatically wrapped as a report using real-time speech recognition to convert his spoken notes into captions. As soon as he leaves the site, the report is sent to his colleagues and to his client.
This isn't science fiction, it's something we could do today - at least as a research project - and certainly something that could become a product in the next couple of years. We've built the context engines that can be given simple natural language rules. So we're not far from being able to take a spoken command like "Hey Cortana, send a report of all the pictures I take this morning at 120 The High Street to John Smith at Smith Building Engineers" and turn it into the rules that drive an automated report generator.
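Once an assistant has parsed that command, the output is just structured data - an intent - that a rule engine can apply. A minimal sketch, assuming the speech recognition and parsing have already happened (the intent fields and email address here are illustrative, not any real Cortana output):

```python
from datetime import datetime, time

# A hypothetical intent, as a context engine might emit it after parsing
# "send a report of all the pictures I take this morning at
#  120 The High Street to John Smith at Smith Building Engineers".
intent = {
    "action": "send_report",
    "window": (time(6, 0), time(12, 0)),   # "this morning"
    "place": "120 The High Street",        # in practice, resolved to a geofence
    "recipient": "john.smith@smithbuildeng.example",  # made-up address
}

def matches(intent, photo_time, photo_place):
    """Rule derived from the intent: does this photo belong in the report?"""
    start, end = intent["window"]
    return start <= photo_time.time() <= end and photo_place == intent["place"]
```

Every photo taken that morning at that address satisfies the rule; when the window closes, the matching set is bundled up and sent to the recipient.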
Throw in a version of the Skype Translator as a transcription engine, and you've got a whole new workflow that blends device and cloud to deliver contextual information where it's meant to go, in a readable format. Add another layer of information from the meshed networks of Internet of Things sensors, and you've started to build something that's truly different from the old ways of working. A photograph of a crack in a wall could be matched with data from stress sensors, taking it from a case of "patch now" to one of "urgently brace and rebuild".
In this context, "authentically digital" means "goodbye paperwork". You can quickly imagine how revolutionary these new workflows can be. No more police spending more time in the station than on the beat; arrest records and reports are generated on the fly as the event happens (so no more relying on fallible memories either). The same goes for hospitals: doctors will be able to gather digital patient health data from sensors, add dictated notes, and quickly have the information attached to the patient's records.
Microsoft CEO Satya Nadella talks about a world of "ubiquitous computing and ambient intelligence". It's a very different world from one where PCs live on desks, and pretend to be paper in decades-old workflows. We need to rethink the way we work and the way we use computers, building the authentically digital workflows that are going to power the next three decades of business.
Don't expect change overnight - we're still only feeling our way into the world of digital work, slowly finding new things we can do and new ways of working. But we're starting to get a picture of what it could be like - and what it will mean to be truly authentically digital.