
Wearable devices to usher in context-aware computing

In this guest post, Joe Burton, CTO at Plantronics, lays out a vision for intelligent wearable devices and sensors that will redefine relevance and greatly simplify and automate the lives of users.
Written by Chris Jablonski

Headset maker Plantronics is embedding smart sensors into its devices and inviting developers to build line-of-business applications that are contextually aware, incorporating presence, availability, proximity, and caller information. In this guest perspective, Joe Burton, CTO at Plantronics, describes the transformation context-awareness could unleash.

Joe Burton, CTO, Plantronics

We’ve entered a new era of computing that promises to provide users with a rich and seamless experience across all of their connected devices. Up to now, humans interacting with their computing environments have depended on keystroke instructions or a mouse click to perform a task. Now, smart devices will be able to monitor and even anticipate behavior. They will automatically perform what needs to be done or make a user’s environment more personal. Welcome to the world of context-aware computing.

In this new era, emerging applications will integrate and interoperate with the smart, connected sensing objects around them, and garner data about what the user is doing and how they are doing it. Data such as a user’s location, presence, devices in range, and soft sensing data, including preferences and social networks, can be harnessed by new applications via sensing capabilities embedded in smart wearable devices. Contextual data can be derived from any number of systems or connected devices, including a user's GPS coordinates, search query logs, or the transaction history of an account. Using this information, smart applications, which will rely on increasingly complex algorithms to understand and predict their environments, will proactively perform automated tasks that enrich or personalize the user’s experience by removing friction points in day-to-day activities.
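To make the idea concrete, here is one way an application might assemble such a context record in code. This is a minimal sketch; the field names and values are purely illustrative and do not correspond to any defined schema or product API.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class UserContext:
        location: Tuple[float, float]          # e.g. GPS coordinates from the phone
        presence: str                          # "available", "in a call", "away", ...
        devices_in_range: List[str] = field(default_factory=list)
        ambient_noise_db: float = 0.0          # hard sensing data from the headset
        preferences: dict = field(default_factory=dict)  # "soft" sensing data

    # Assemble a snapshot from whatever sensors and systems are available.
    ctx = UserContext(location=(37.40, -122.03),
                      presence="available",
                      devices_in_range=["headset", "laptop"],
                      ambient_noise_db=72.0,
                      preferences={"auto_volume": True})
    print(ctx.presence, ctx.devices_in_range)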

As an example, let’s look at the interaction between a user’s sensor-enabled headset and their smartphone. Our scenario begins with a person having lunch at their favorite restaurant. It is noon and the restaurant is particularly popular and noisy. When a call comes in, the headset knows the user’s location and the degree of ambient sound in the room, so it automatically adjusts its noise reduction algorithm to enhance the caller’s experience and raises the headset volume so the user can hear. Context-aware applications will also be able to predict or infer a user’s intentions by detecting or interpreting the environment. Because the smartphone knows the user is already in a conversation, for instance, it could hold incoming calls.
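A minimal sketch of that lunch-hour scenario follows. The Headset and Phone classes and their method names are hypothetical stand-ins for device APIs, not an actual Plantronics interface.

    NOISY_THRESHOLD_DB = 70  # assumed ambient level at which a room counts as "noisy"

    class Headset:
        def __init__(self):
            self.volume = 5
            self.noise_reduction = "standard"

        def ambient_noise_db(self):
            return 78  # pretend reading from the headset's microphones

        def adjust_for_environment(self):
            # Raise noise reduction and volume when the surroundings are loud.
            if self.ambient_noise_db() > NOISY_THRESHOLD_DB:
                self.noise_reduction = "aggressive"
                self.volume = min(self.volume + 2, 10)

    class Phone:
        def __init__(self, headset):
            self.headset = headset
            self.in_conversation = False

        def incoming_call(self, caller):
            # Hold the call if the user is already talking to someone.
            if self.in_conversation:
                print(f"Holding call from {caller}: user is in a conversation")
                return
            self.headset.adjust_for_environment()
            print(f"Ringing {caller} (noise reduction: {self.headset.noise_reduction}, "
                  f"volume: {self.headset.volume})")

    phone = Phone(Headset())
    phone.incoming_call("+1-555-0100")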

Among the new wearable devices is a new generation of smart headsets. By combining the incredible processing power now available in small, low-power form factors with flexible sensor technology, these headsets can deliver information about a user’s actions and preferences. By exposing the smart headset’s capabilities through an API, developers can integrate contextual information from the headset, such as caller ID, proximity, and wear state, to enhance presence and availability within their applications.
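As a sketch of what consuming such an API might look like, the example below subscribes an application to wear-state and proximity events; the event names and handler logic are assumptions made for illustration, not the actual Plantronics developer interface.

    class Headset:
        """Tiny event emitter standing in for a sensor-enabled headset."""
        def __init__(self):
            self._handlers = {}

        def on(self, event, handler):
            self._handlers.setdefault(event, []).append(handler)

        def emit(self, event, **data):
            for handler in self._handlers.get(event, []):
                handler(data)

    def update_presence(data):
        # Donning the headset signals availability; taking it off signals away.
        print("presence ->", "available" if data["worn"] else "away")

    def lock_when_out_of_range(data):
        # The user walking away from the desk (headset out of range) locks the laptop.
        if not data["in_range"]:
            print("laptop -> locked")

    headset = Headset()
    headset.on("wear_state", update_presence)
    headset.on("proximity", lock_when_out_of_range)

    headset.emit("wear_state", worn=True)      # user puts the headset on
    headset.emit("proximity", in_range=False)  # user walks away from the desk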

The potential for these new context-aware applications is huge. Gartner projects that within a few years we will be spending $96 billion on applications that are contextually aware. In fact, the analyst firm expects that by 2015, “context will be as influential in mobile consumer services and relationships as search engines are to the Web.”

There are countless applications for business in the office or out in the field. For example, when a call comes in on a user’s smartphone, a smart headset would be able to capture the caller’s number and propagate the information to an application running on the user’s laptop. The application would use the contextual data to automatically pop up information about the caller, preparing the user for the call before they even say hello. This type of application can strengthen a customer relationship or improve an opportunity with a new business prospect. In a contact center environment, calls can be routed more intelligently based on the availability of subject matter experts.
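A rough sketch of that screen-pop flow, with a small in-memory dictionary standing in for a real CRM or directory lookup; the contact data is invented for the example.

    # Caller records keyed by phone number; a stand-in for a CRM lookup.
    CRM = {
        "+1-555-0100": {"name": "Dana Lee", "company": "Acme Corp",
                        "note": "Open opportunity: Q3 renewal"},
    }

    def screen_pop(caller_number):
        # Pop the caller's details on the laptop before the user says hello.
        record = CRM.get(caller_number)
        if record is None:
            print(f"Incoming call from unknown number {caller_number}")
            return
        print(f"Incoming call: {record['name']} ({record['company']})")
        print(f"  {record['note']}")

    # The headset/softphone would pass the captured caller ID to the laptop app.
    screen_pop("+1-555-0100")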

Context-aware computing makes it possible to give employees the information they need to make a particular decision or take the appropriate action. Business applications would be able to take advantage of the contextual information provided by the smart objects around the user to search for and retrieve background information on participants coming together for a video conference, and even send documents to the team for review before the meeting takes place.

Many vertical industries will also benefit from context-aware computing, healthcare being a primary one. Imagine a situation where a doctor visits a patient in a hospital room. A smart device the doctor is wearing can turn on the doctor’s workstation in the room, then authenticate the doctor to the patient management system, detect which patient is near the doctor, and finally pull up the patient’s record. When the doctor leaves the room, all the information is saved and the doctor’s workstation powers down.
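One way that room workflow could be modeled, assuming hypothetical workstation and patient-record interfaces driven by proximity events from the doctor’s wearable:

    # Patient records keyed by room; an invented stand-in for a patient management system.
    PATIENT_RECORDS = {"room-302": {"patient": "J. Rivera", "chart": "post-op, day 2"}}

    class RoomWorkstation:
        def __init__(self, room_id):
            self.room_id = room_id

        def clinician_entered(self, clinician_id):
            # Wake the workstation and authenticate the doctor via the wearable.
            print(f"Workstation on; {clinician_id} authenticated")
            record = PATIENT_RECORDS.get(self.room_id)
            if record:
                print(f"Showing chart for {record['patient']}: {record['chart']}")

        def clinician_left(self, clinician_id):
            # Save open work and power down when the doctor leaves the room.
            print(f"Chart saved; workstation powered down after {clinician_id} left")

    ws = RoomWorkstation("room-302")
    ws.clinician_entered("dr-example")
    ws.clinician_left("dr-example")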

These are just some of the ways context-aware computing is going to change how we interact with our devices. It’s fair to say your information devices will soon have your back.
