Here's the scene: you're traveling, you walk into a little restaurant, and the menu is entirely in a language you don't understand, with no pictures. You've got a few choices. You can leave and try to find a place with English translations. You can try to hack your way through a conversation with the waiter, who also doesn't speak your language. Or you can point randomly at the menu and live with the consequences.
Well, in the future there will be another, better answer: live, real-time translation built into your glasses. Enter: Project Glass. British hacker and DIYer Will Powell has built a pair of glasses that can (albeit roughly) project a translation of your conversation onto the lenses. Here's what it looks like:
Will describes how he put the set together on his blog:
The individual using the glasses wears the Vuzix 1200 Star glasses, which are connected to the s-video connector on the first Raspberry Pi, and the Jawbone Bluetooth microphone, which connects to a device such as a smartphone or tablet to provide a clean, noise-cancelled audio feed. The Bluetooth microphone streams what I say and what it picks up around me across the network. This is then recognised and passed through Microsoft's translation API, with a caching layer to improve performance for regularly used statements. Passing through this API service is the biggest delay in the subtitles. Once translated, the server passes back the text and translations, which are picked up by the Raspberry Pi driving the TV and glasses displays. Elizabeth uses a headset mic but could use her own Raspberry Pi, glasses and Jawbone microphone to have the same experience as I do.
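The caching layer Powell mentions is the kind of thing you could sketch in a few lines: since the remote translation call is the slowest step, repeated phrases should skip the network round-trip entirely. Here's a minimal, hypothetical sketch of that idea; `translate_remote` is a stand-in for the actual Microsoft Translator API call, not Powell's real code.

```python
# Sketch of a caching layer in front of a slow translation API.
# translate_remote is a hypothetical placeholder for the real
# network call (the biggest source of subtitle delay).

def translate_remote(text, target_lang):
    # In the real system this would hit the translation service.
    raise NotImplementedError("replace with an actual API request")

class CachingTranslator:
    def __init__(self, remote=translate_remote):
        self._remote = remote
        self._cache = {}  # (normalized text, language) -> translation

    def translate(self, text, target_lang):
        # Normalize so trivial variations of a common phrase hit the cache.
        key = (text.strip().lower(), target_lang)
        if key not in self._cache:
            self._cache[key] = self._remote(text, target_lang)
        return self._cache[key]
```

Common phrases ("hello", "how much is this?") only pay the API latency once per session, which is exactly the speedup Powell describes for regularly used statements.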
The machine is built from the following components:
2x Raspberry Pi running Debian Squeeze
Vuzix 1200 Star
While the translation isn't good enough for use in, say, the military or diplomacy, where rough translations simply don't cut it, for the rest of us just trying to avoid ordering the pickled shark it could be a lifesaver.