Plenty of the 'new' technology we'll see in 2011 is technology we've seen in 2010 (or even 2009), done better. There are some technologies we're excited about that just aren't ready for primetime. One of them is mobile augmented reality, which is just too shaky and imprecise today.
Augmented reality is commonplace on TV - and way ahead of what you can do on mobile, according to Ryan Ismert, the director of engineering at Sportvision, which produces augmented reality for US sports broadcasts. Viewers don't think of it as AR; they think of it as the yellow first-down line on the football field, the national flags that show up as virtual lane markers for speed skaters in the Winter Olympics, or the line that paints in the trajectory of pitches in baseball games.
Sportvision creates the At Bat application for Major League Baseball, but while that gives you the stats from the games, you can't hold your iPhone up in the stadium and get live AR overlays, because what you can do in mobile AR is just too simple today. Layar is the state of the art for mobile AR, but Ismert says it doesn't begin to approach what consumers are used to. "The grid bobs around as you hold the phone. There's no occlusion, so you see everything in a particular direction whether it's visible or not. The orientation doesn't match perfectly." (We agree with Ismert: using the AR view in Yelp or the Lonely Planet Compass Guide makes you feel like you have X-ray vision in the middle of an earthquake.) "On TV we're tracking the baseball at 80, 90 miles per hour, and when we're plotting where it crosses the plate we have half-inch accuracy up and down - and we do it with 30 milliseconds and two frames of latency. We position the yellow line within plus or minus two inches - and we do it from a camera that's 200 yards away from the pitch. People are used to this sort of accuracy."
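To put those figures in perspective, a quick back-of-envelope calculation (ours, not Sportvision's) shows what they imply: two inches of placement accuracy from 200 yards away is a tiny angular tolerance, and 30 milliseconds is long enough for a 90 mph pitch to travel several feet.

```python
import math

yard_in = 36
camera_dist_in = 200 * yard_in     # camera 200 yards away, in inches
tolerance_in = 2                   # yellow line placed within +/- 2 inches

# Angular precision the camera tracking must hold
angle_deg = math.degrees(math.atan(tolerance_in / camera_dist_in))
print(f"{angle_deg:.4f} degrees")  # roughly 0.016 degrees

# Distance a 90 mph pitch covers during 30 ms of latency
ft_per_s = 90 * 5280 / 3600        # 90 mph in feet per second
travel_ft = ft_per_s * 0.030
print(f"{travel_ft:.1f} feet")     # about 4 feet
```

In other words, the broadcast system holds its overlay to a few hundredths of a degree while compensating for a ball that moves four feet inside its latency window - the standard today's wobbly phone AR is being measured against.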
The bar on what you expect a mobile phone to be able to do is artificially low, says Ismert, but he predicts users will soon start to expect what they see on TV: "we have that luxury for another 6-12 months". The technology has to improve in several ways. Better sensors - gyroscopes and accelerometers - are needed for positioning, so the overlays end up in the right place. Network speeds need to be faster, and you need to be able to guarantee a consistent speed: "whether we're doing computation in the cloud and shipping video back for better results, or shipping information to the phone to calculate, we need a better network connection and quality of service." And for the intensive calculations needed to show something useful in the right place fast enough for it to be interesting, he wants hardware acceleration and parallel processing like the GPU computation PC software is starting to rely on: "we desperately need it on the handsets - although I know it's a tradeoff with what we also need, which is energy efficiency," admits Ismert. And while Sportvision has built iPhone and iPad apps so far, when he gets hardware-accelerated handsets, Apple's level of control may be an issue. "We will run our processing over the raw camera input to stabilise the video - so we need open systems."
There's money in sports, so companies like Sportvision have an incentive to push mobile AR like this. And once your phone can lock on to a baseball going at 80 miles an hour on the other side of the stadium, it should be able to show you which building on the other side of the street is the restaurant you're looking for without bouncing around like a platform game.
P.S. We'll be doing some more predictions next year of which technologies will improve from 'almost' to 'actual'; what are you expecting?