There's a conversation I've been having with many different people over the years. It revolves around trying to understand how we use context to make information easier to consume. We keep approaching answers from different directions: from the worlds of search, knowledge management and business analytics, from the smartphone platforms to the tools that power web search. The aim is simple – how do we make sure the right people get the right piece of information at the right time to make the right decision? Call it "right time information", for want of a better phrase.
Science fiction writer friends of mine think of it as "intelligence amplification", tools that make us smarter. There's an aspect of that in our relationship with our smartphones. I often think of mine as a "memory prosthesis", a note taking and search tool in my pocket. Instead of remembering where the pictures hung on a wall we recently painted, I just photographed them with my iPhone and uploaded them into the cloud using Evernote. How often do you quickly look up something on Google or Wikipedia or IMDB in the middle of a conversation? Getting that little bit of an intelligence boost is a big win, especially with such a small device.
But there's also a downside. How many telephone numbers do you remember these days? Or do you (like me) just look them up in your phone's copious address book? And do you find yourself missing that pocket intelligence booster when confronted by rapacious data roaming charges?
BlackBerry Super Apps take this one step further, blending information from searches across the phone's many applications to give a coherent look at your world – while pulling in relevant real time information to add an extra spin (like information from the upcoming Dash-powered ETA API which uses real-time and predicted traffic information to let you and your apps know just when you're likely to arrive at a destination).
That's all very fine in a world where information comes from searches, but things are changing. To steal a phrase from Barak Hachamov, the founder of My6sense, we're moving from a web of documents to a web of streams – where everything gets pushed to us. Like the old web, that's a mix of good and bad. Information will become more and more real time, but it will also be harder to extract the signal from the noise – and harder still to determine just what's going to be both relevant and useful in the future. Starting with yesterday's RSS feeds and today's social network status tools, the web of streams is getting more and more complex and more and more integrated into the world.
Barak was the latest person I've had this conversation with, over a coffee in London when he was in town launching a version of his company's app in conjunction with business social network Ecademy.
My6sense's iPhone application hopes to provide users with a way of getting to grips with the web of streams, with the aim of learning just what's relevant to its users, and presenting a filtered view based on a mix of content and of trusted sources (along with a little dose of serendipity to make sure you don't get trapped in an echo chamber of like minds). Barak calls it a first step on the way to "devices that bring you information without you asking for it". It's something that needs to be effortless, and needs to be based on your behaviours – understanding your intentions based on what you pay attention to. The resulting digital model "you" is built from implicit feedback: what you click on, how much time you spend, and when you spend it. It's what Barak describes as a "human ranking function", by analogy with Google's PageRank – but there's no single human rank; it's going to be different for each one of us, which is why My6sense follows and mimics the process of association.
It starts with content, using combinations of words to infer relationships between elements, before adding in context (time and location and the like) and community relevance (itself a hard thing to infer as communities are dynamic and ephemeral). The result is something Barak calls an "Intuition Box" for every user, which aims to make you more productive and more efficient, finding what is important to you and delivering it – including things you wouldn't have found any other way. It's not something that happens overnight, and like all good knowledge-based AI systems it takes time to train My6sense to work with your personal feeds.
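To make the idea of a personal ranking function built from implicit feedback a little more concrete, here's a deliberately minimal sketch in Python. It is not My6sense's actual algorithm – the class, the word-overlap scoring, and the dwell-time weighting are all my own illustrative assumptions – but it shows the shape of the approach: clicks and time spent build up a profile of word weights, and new stream items are ranked by how well they match that profile.

```python
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into word tokens, stripping punctuation."""
    return [w.strip(".,!?").lower() for w in text.split() if w.strip(".,!?")]

class StreamRanker:
    """Toy personal ranker (illustrative only): builds a word-weight profile
    from implicit feedback, then scores new stream items against it."""

    def __init__(self):
        self.profile = Counter()  # word -> accumulated interest weight

    def record_feedback(self, text, dwell_seconds):
        """A click plus dwell time boosts the weight of the item's words."""
        weight = 1.0 + dwell_seconds / 60.0  # longer reads count for more
        for word in tokenize(text):
            self.profile[word] += weight

    def score(self, text):
        """Sum the profile weights of the item's words, normalised by length."""
        words = tokenize(text)
        if not words:
            return 0.0
        return sum(self.profile[w] for w in words) / len(words)

    def rank(self, items):
        """Return stream items ordered most-relevant first."""
        return sorted(items, key=self.score, reverse=True)

ranker = StreamRanker()
ranker.record_feedback("real-time traffic data for mobile apps", dwell_seconds=120)
ranker.record_feedback("traffic prediction APIs", dwell_seconds=90)
stream = ["celebrity gossip roundup", "new traffic prediction service launches"]
print(ranker.rank(stream))
```

A real system would layer on exactly the things the paragraph above mentions and this sketch leaves out: context such as time and location, the relevance of your (shifting) communities, and enough training time for the model to settle in.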
There's something fascinating under the covers here. The iPhone app is an interesting stealth route to building a set of both general intuition rules and specific class-based rules that can bootstrap the process of getting personalised information streams up and running more and more quickly. But that's only part of the story. What can you do with a contextual, right-time, digital model of me? If you're a Google, it gives you a better way of targeting advertising, testing out what I'll pay attention to before you even deliver it to my screens. If you're a Microsoft or a RIM, it's a way of knowing just how to deliver reports and information so I respond to them in the most productive way. Give it an API and it can do more than just scan my choice of feeds: it becomes a pre-emptive research engine, using its knowledge of what I'm interested in and working on to deliver me the information I need right now.
My6sense's tools are one of the engines that'll power that right time world. Talking to Barak is a glimpse into a future that's turning out to be just around the corner – and on our phones.
A digital me in the cloud, making me smarter and more informed? Sign me up right now!