Today, at the Computer History Museum in Silicon Valley, Google is hosting an event to talk about search - specifically, the ways that search continues to evolve as the technology changes, as the devices used to search get smarter and as the content on the Internet grows beyond just Web sites to include things like images, blog posts and video.
At the event, the company talked about different tools that are coming on board, including some cool tools involving voice search, image search and on-the-fly translations. Specifically, the company used Android and iPhone devices to highlight what's coming.
But the company was just setting up for the big news announcement: real-time search. (Techmeme) It's a very cool enhancement that makes search - just your regular everyday queries on the regular Google home page - much richer. In the demo, the company typed a search for "Obama" and came back with a number of results, as one might expect. But within the results is a real-time stream, if you will, that scrolls with the most up-to-date results - blogs being posted, tweets being sent, news articles being published. (See the video at the bottom of this post for a better understanding of how this works.)
As part of this, there's also a new link being added to the results page - called "Latest" - that offers blog posts that were published seconds ago, as well as one called "Updates," which incorporates things like tweets, stuff that's being said right now.
Oh, yeah, and it also works on mobile devices (at least Google Android and iPhone devices for now).
And speaking of mobile, the company announced support for voice search in Japanese, joining English and Mandarin. It also announced Google Goggles - a visual search feature in Labs that lets users point a phone's camera at something - whether a building, a logo, a book cover or more - and find information about it. (Techmeme)
The company also highlighted location-based Google Suggest upgrades, which populate query suggestions based on where you are. For example, start a query with "RE" while you're in Boston and the first suggestion is, of course, Red Sox. If you're in San Francisco, that first suggestion is REI, the retail store.
It also added a "What's Nearby" feature to mobile maps. For example, if you're in an unfamiliar town looking for something to do while waiting for a meeting to start, there's a button for locating nearby businesses - maybe a coffee shop, ATM or a bar. It's really quite a handy feature.
Finally, the company also announced new partnerships with Facebook, which will feed updates from public Facebook pages, and MySpace, which will feed updates from any member who makes his or her profiles public.
Some of the things the company talked about are live now, others will be rolled out in the coming days and some will be rolled out early next year.
The event's host, Marissa Mayer, Google's VP of Search Products and User Experience, spoke about the importance of search evolution - growing beyond URLs and adding in things like images, video and so on. But Mayer says the future of search is even bigger than that. The technology is smart enough to go beyond a few keywords typed into a box - especially in a mobile setting.
A device's microphone becomes the ear of the Internet, the speaker becomes its voice and the camera becomes its eye, while GPS tells the search engine where we are at any given moment, making all of these tools even better.
Sure, Google is compiling more and more information about us - not just as we surf but also while we're mobile. And maybe if I were more paranoid, I might be freaked out about that. But I'm not. Google is tapping more information to make my online life better - saving me time by giving me the information I want, when and where I want it.
For people who want information at their fingertips, there's value there.