Android Wear -- wearable devices built around relevance

Summary: With Android Wear, Google is able to offer something that no other player, whether as big as Apple or as small as Pebble, can match. And that's why it'll win...

Topics: Mobility
A lot of people, rather disparagingly, call Google an "advertising company". Sure, advertising is how they monetise what they do, but the real business they are in is this:

They're in the relevance business.

The video Google put out alongside the Android Wear announcement, called "Information that moves with you", clearly demonstrates their intentions with their new wearables platform.

Results

Google's main product has always been their search engine. If we wind back the clock to its introduction, the reason why Google won over competitors like Yahoo! and AltaVista was that their results were considered to be "better". "Better" is a little woolly though -- what people actually meant was "more relevant".

In fact, Google's intention is that the first organic search result exactly matches the intention of the user's search. To put it another way, Google's intention is that the first organic search result has maximum relevance.

If we look at AdWords, that desire for relevance continues. In theory, when you pay for ads on Google you can put any ad that you like against any keywords. In reality, Google rewards advertisers that zero in on relevance.

The price Google charges for AdWords is bound up with a "quality score". In effect, this is a "relevance score". The more relevant the ad is in relation to the keywords, and the more relevant the landing page is in relation to the ad, the higher the quality score, and the lower the cost of the ad.
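That pricing mechanism can be sketched with the commonly cited, simplified version of the AdWords auction: ad rank is bid multiplied by quality score, and the winner pays just enough to beat the next ad down. The figures and function names below are illustrative only, not Google's actual implementation.

```python
# Simplified sketch of the commonly cited AdWords auction maths (illustrative
# numbers, not Google's real implementation).

def ad_rank(bid, quality_score):
    # An ad's position in the auction is its bid weighted by relevance.
    return bid * quality_score

def actual_cpc(next_ad_rank, quality_score):
    # The winner pays just enough to beat the next ad's rank, plus one cent.
    return next_ad_rank / quality_score + 0.01

# Two advertisers bidding on the same keyword:
relevant_ad = ad_rank(bid=1.00, quality_score=8)  # rank 8.0
generic_ad = ad_rank(bid=2.00, quality_score=3)   # rank 6.0

# The more relevant ad wins despite bidding half as much, and pays
# roughly $0.76 per click -- well under its $1.00 bid.
print(actual_cpc(next_ad_rank=generic_ad, quality_score=8))
```

The point the sketch makes is the one in the paragraph above: relevance is literally priced into the system, so advertisers who zero in on relevance pay less for better positions.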

You can see the desire for relevance flowing through into Google's other products. Google+ is about letting others in a user's social network surface relevant content. Google Maps is about getting relevant local information to a user based on where they are. Everything Google does is about relevance.

Relevance is important in everything that happens in the user's online life. There is so much content flying at any given user that the only time they will engage is if it resonates -- i.e. it has to be relevant.

You're only reading this article because you feel it's relevant. Moreover, you most likely made that judgement based solely on the headline. Something about the headline of this article made you click on it because it seemed relevant.

Now

For Google, if you're looking to build products that are more and more relevant, an obvious gap to close is one whereby you take the explicit query from the user out of the equation. Search engines can only return relevant results if the user actually types in a query.

That's where Google Now comes in. The objective of Google Now is to return relevant information without the user having to type in a query.

For example, telling the user that there is traffic on their normal route home just before they start out on their normal route home. That's immensely relevant information to someone who's trying not to get stuck in traffic, and it can happen without the user having to do anything.

When you look at the video for Android Wear presented at the top of this article, what you see is something that is primarily about Google Now. The fact that you wear it on your wrist is secondary.

Here's a look at each scene from the video:

  • 0'13" -- a man's Android Wear watch tells him how many stops remain on his journey without him having to ask. (Implying Google knows his typical route home.)

  • 0'16" -- a man emerging from an airport is recommended a taxi company rather than having to wait in line. (Implying Google knows he's just got off a plane. Perhaps Gmail has seen the booking confirmation from the airline.) He can also confirm the pick-up from the recommended company. (Implying Google can monetize the relationship with the taxi company.)

  • 0'22" -- a woman arrives at the beach and is told that there's a jellyfish warning. She's recommended some other local beaches. (This is an easy one -- easy enough to tell she's at the beach.)

  • 0'35" -- a man is told there is no traffic on his route to work. (Again, implying Google knows his typical commute.)

  • 0'43" -- the man on the tram gets a text message and can reply. This is a rare example in the video of something that's dependent on the local device, not Google's services. It's also the only one that won't work -- we're 40 years away from speech recognition working that well in that environment -- witness the constant disappointment that is Siri.

  • 0'52" -- a woman running through an airport is told how many calories she's burned. This hints at the self-quantification capabilities of the platform.

  • 0'55" -- a woman presents a QR code that acts as a boarding card. (Google knows not only that she's at an airport, but also which flight, and that she's at about the correct location in space and time to present the boarding card. This one seems a bit sketchy to me.)

  • 1'00" -- another text message example, but this one shows the user searching online for structured information through a voice interface.

  • 1'08" -- perhaps the craziest example. The device detects the woman is dancing, and then offers to name the music that she's listening to. Much as I don't think this will work, the important thing here is the relevance of the query: "you're dancing to something, do you want to know what it is?"

  • 1'20" -- it's not clear exactly what's happening here, but the implication is that the device knows the user has arrived home and automatically opens the garage door.

The point of all this is that virtually every example has nothing to do with the device itself. The secret sauce in all of it is the software and data held on Google's servers.

Conclusion

The positioning of Android Wear illustrated in the video highlights a huge problem for Apple. Google is staggeringly good at mining meaning from a user's online activity. This is everything that they have been doing as a business since their inception.

Apple can't do any of this -- at this point in time they simply don't have the data to mine, or the experience to bring to bear on that data even if they did. All they can do is what Pebble does -- i.e. forward notifications from the phone.

Microsoft can't either. The organisation remains stuck in enterprise thinking, where data is about transactions first and meaning second. This is the other way round from Google.

Pundits like talking about the potential of wearables. As I mentioned above, I personally like wearing my Pebble smartwatch. But a general market wearables product has to provide relevant information, and right now only Google is poised to be a source of such information.

Products from other vendors look like they might offer very paltry results in comparison.

What do you think? Post a comment, or talk to me on Twitter: @mbrit.

Image credit: Jason Perlow

Talkback

  • Google's customers are companies that pay for advertising

    Google sells consumer information to companies that want to advertise to Google users, for money. Lots of money.

    Therefore Google is an advertising company.

    You can try to sugar-coat it, cover it up, camouflage it. But in the end, Google makes boatloads of money selling information about consumers to companies, and displaying ads to consumers.
    bathswana
  • Speech recognition

    I just had to comment on the issue below regarding speech recognition.

    Are you serious? You obviously don't use Google's speech recognition service in Google Now. It is spot on. I use it all the time in crowded and noisy environments. It is surprisingly accurate. Saying "I'll be there in 2" is so simple for Google to recognize. In fact, I've had it recognize much more complex sentences. Saying it is 40 years away is simply absurd and completely laughable!

    Just the other day I was in a crowded Irish bar on St. Patrick's Day and asked Google to "define Machiavellian." I don't know about Siri, because I don't use iPhone as my main phone, but Google's speech services are incredible.


    ---
    0'43" -- the man on the tram gets a text message and can reply. This is a rare example in the video of something that's dependent on the local device, not Google's services. It's also the only one that won't work -- we're 40 years away from speech recognition working that well in that environment -- witness the constant disappointment that is Siri.
    qwerty123123123123