Wearable devices to usher in context-aware computing

Summary: In this guest post, Joe Burton, CTO at Plantronics, lays out a vision for intelligent wearable devices and sensors that will redefine relevance and greatly simplify and automate the lives of users.

Headset maker Plantronics is embedding smart sensors into its devices and inviting developers to build line-of-business applications that are contextually aware, incorporating presence, availability, proximity, and caller information. In this guest perspective, Joe Burton, CTO at Plantronics, describes the transformation context-awareness could unleash.

Joe Burton, CTO, Plantronics

We’ve entered a new era of computing that promises to provide users with a rich and seamless experience across all of their connected devices. Up to now, humans interacting with their computing environments have depended on keystroke instructions or a mouse click to perform a task. Now, smart devices will be able to monitor and even anticipate behavior. They will automatically perform what needs to be done or make a user’s environment more personal. Welcome to the world of context-aware computing.

In this new era, emerging applications will integrate and interoperate with the smart, connected sensing objects around them, garnering data about what the user is doing and how they are doing it. Data such as a user's location, presence, and devices in range, along with soft sensing data such as preferences and social networks, can be harnessed by new applications via sensing capabilities embedded in smart wearable devices. Contextual data can be derived from any number of systems or connected devices, including a user's GPS coordinates, search query logs, or the transaction history of an account. Using this information, smart applications, which will rely on increasingly complex algorithms to understand and predict their environments, will proactively perform automated tasks that enrich or personalize the user's experience by removing friction points in day-to-day activities.
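To make that concrete, here is a minimal sketch, in Python, of how an application might represent a context snapshot and map it to a proactive action. All names and rules are invented for illustration and do not correspond to any real product API:

```python
from dataclasses import dataclass, field

@dataclass
class UserContext:
    """A snapshot of hard and soft sensing data for one user."""
    location: str                      # e.g. derived from GPS coordinates
    presence: str                      # "available", "in-call", "away"
    devices_in_range: list = field(default_factory=list)
    preferences: dict = field(default_factory=dict)

def suggest_action(ctx):
    """Toy rule engine: map a context snapshot to one proactive action."""
    if ctx.presence == "in-call":
        return "hold-notifications"    # don't interrupt a conversation
    if "headset" in ctx.devices_in_range and ctx.preferences.get("auto_answer"):
        return "route-audio-to-headset"
    return "no-op"
```

Real systems would replace the if/else rules with the "increasingly complex algorithms" described above, but the shape is the same: sensed context in, automated action out.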

As an example, let's look at the interaction between a user's sensor-enabled headset and smartphone. Our scenario begins with the user having lunch at a favorite restaurant. It is noon and the restaurant is particularly popular and noisy. When a call comes in, the headset knows its location and the degree of ambient sound in the room, so it automatically adjusts its noise-reduction algorithm to enhance the caller's experience and increases the headset volume for the user. Context-aware applications will also be able to predict or infer a user's intentions by detecting and interpreting the environment. Because a smartphone knows that its user is in a conversation, for instance, it could hold incoming calls.
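The restaurant scenario boils down to a simple rule: pick a noise-reduction setting and a volume boost from the measured ambient level. The thresholds and the 0–10 volume scale in this sketch are illustrative assumptions, not any real headset's behavior:

```python
def adjust_for_ambient_noise(ambient_db, base_volume):
    """Choose noise-reduction strength and a volume boost from ambient noise.

    Thresholds and the 0-10 volume scale are made up for illustration.
    """
    if ambient_db >= 75:        # a busy restaurant at noon
        reduction, boost = "aggressive", 4
    elif ambient_db >= 60:
        reduction, boost = "moderate", 2
    else:
        reduction, boost = "off", 0
    return {"noise_reduction": reduction,
            "volume": min(base_volume + boost, 10)}  # clamp to the scale's max
```

The point is that the user never touches a setting: the decision is driven entirely by sensed context.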

Among the new wearable devices is a new generation of smart headsets. Taking the incredible processing power that is now available in small, low power-consuming form factors and combining it with flexible sensor technology, headsets can deliver information about a user’s actions and preferences. By exposing the smart headset capability through an API, developers can integrate contextual information contained within the headset such as caller ID, proximity, and wear state to enhance presence and availability within their applications.
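An event-driven API of this kind might look like the following sketch. The class, method, and event names here are invented for illustration; the actual Plantronics developer SDK differs:

```python
class SmartHeadset:
    """Hypothetical headset object exposing contextual events to applications."""

    def __init__(self):
        self._handlers = {}
        self.wear_state = "doffed"      # "donned" once the user puts it on

    def on(self, event, handler):
        """Subscribe an application callback to a named event."""
        self._handlers.setdefault(event, []).append(handler)

    def _emit(self, event, payload):
        for handler in self._handlers.get(event, []):
            handler(payload)

    def don(self):
        """Simulate the wear-state sensor detecting the headset being put on."""
        self.wear_state = "donned"
        self._emit("wear_state", self.wear_state)

# An application updates the user's presence as soon as the headset is worn.
presence_log = []
headset = SmartHeadset()
headset.on("wear_state", lambda state: presence_log.append(
    "available" if state == "donned" else "away"))
headset.don()
```

Caller ID and proximity would be additional event streams on the same object, letting applications enrich presence and availability without polling the device.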

The potential for these new context-aware applications is huge. Gartner projects that within a few years we will be spending $96 billion on applications that are contextually aware. In fact, the analyst firm expects that by 2015, “context will be as influential in mobile consumer services and relationships as search engines are to the Web.”

There are countless applications for business in the office or out in the field. For example, when a call comes in on a user’s smartphone, a smart headset would be able to capture the caller’s number and propagate the information to an application running on the user’s laptop. The application would use the contextual data to automatically pop up information about the caller, preparing the user for the call before a single hello. This type of application can strengthen a customer relationship or improve an opportunity with a new business prospect. In a contact center environment, calls can be routed more intelligently based on the availability of subject matter experts.
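The screen-pop example reduces to a lookup keyed on the captured caller ID. A toy sketch, with a made-up in-memory record standing in for a real CRM or directory service:

```python
# A made-up in-memory CRM standing in for a real directory or CRM service.
crm = {
    "+15550123": {"name": "Dana Alvarez", "company": "Acme Corp",
                  "last_contact": "2013-01-15"},
}

def screen_pop(caller_id):
    """Build the text a desktop application would pop up for an incoming call."""
    record = crm.get(caller_id)
    if record is None:
        return "Incoming call from %s (no record on file)" % caller_id
    return "%s (%s), last contact %s" % (
        record["name"], record["company"], record["last_contact"])
```

In practice the headset would deliver the caller ID over its API, and the lookup would hit a CRM backend instead of a local dictionary.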

Context-aware computing makes it possible to give employees the information they need to make a particular decision or take the appropriate action. Business applications would be able to take advantage of the contextual information provided by the smart objects around the user to search for and retrieve background information on participants coming together for a video conference, and even send documents to the team for review before the meeting takes place.

Many vertical industries will also benefit from context-aware computing, healthcare being a primary one. Imagine a situation where a doctor visits a patient in a hospital room. A smart device the doctor is wearing can turn on the doctor’s workstation in the room, then authenticate the doctor to the patient management system, detect which patient is near the doctor, and finally pull up the patient’s record. When the doctor leaves the room, all the information is saved and the doctor’s workstation powers down.
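The hospital-room workflow reads like a small state machine driven by proximity events: badge in range, badge out of range. A minimal sketch, with invented identifiers and an in-memory stand-in for the patient management system:

```python
# In-room workstation state; all names here are invented for illustration.
workstation = {"power": "off", "session": None, "chart": None}

def doctor_enters(doctor_id, patient_id):
    """Wearable badge detected in the room: wake the workstation,
    authenticate the doctor, and pull up the nearby patient's record."""
    workstation["power"] = "on"
    workstation["session"] = doctor_id
    workstation["chart"] = patient_id

def doctor_leaves():
    """Badge out of range: save the open chart and lock everything down."""
    saved_chart = workstation["chart"]
    workstation.update(power="off", session=None, chart=None)
    return saved_chart
```

The doctor never logs in or out by hand; presence and proximity do the work, which is exactly the friction removal the article describes.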

These are just some of the ways context-aware computing is going to change how we interact with our devices. It’s fair to say your information devices will soon have your back.

Talkback

  • This is truly pathetic.

    NT
    IT_Fella
  • Context-Aware Binaural Headset

    Aren't there any context-aware binaural headsets that I can make use of? I'm hearing impaired but have no problem using a headset such as <a href="http://www.amazon.com/gp/product/B001E45XT4/ref=oh_details_o00_s00_i01">Plantronics GameCom 367 Closed-Ear Gaming Headset</a> for my Yealink SIP-T22P using a <a href="http://www.amazon.com/gp/product/B003EALLHE/ref=oh_details_o00_s00_i00">PC Headset to RJ9 adapter</a>. For my Samsung Exhibit II 4G, I'm using my <a href="http://www.amazon.com/gp/product/B003VTZPO8/ref=wms_ohs_product">MEElectronics M6P-BK Sports Sound-Isolating In Ear Headphones with Microphone/Remote</a>.

    If just using the left or right ear, I'll have trouble understanding what I hear over the phone. I don't make or receive phone calls a lot, though.
    Grayson Peddie
  • Get on with it

    We already have wearable PCs. They are called smartphones, and they are in more and more pockets and purses. Kinect-like hardware is being embedded in new TVs. If someone would give us a "headset", or just glasses, with Kinect sensors, speakers/earbuds, and a transparent display that connects to our phones over WiFi or Bluetooth, then we could have truly mobile computing with gestures and virtual keyboards. Once we have the hardware, contextual awareness and overlaid info are just software development.

    The tech already exists and lots of companies like MS and Google as well as a number of smaller companies are trying to bring such devices to market.

    So stop talking about it and get on with it ;-)
    tonymcs@...