Enthusiasts and third-party developers aren't the only ones dabbling with Microsoft's Kinect sensor. Researchers who work for Microsoft are, too.
At Microsoft's annual TechFest research fair in Redmond this week, some of those researchers are showing off their Kinect-centric projects. One of these projects -- the Holoflector projection/augmented-reality mirror -- already made its debut during Microsoft's pre-TechFest media event a week ago.
On March 6, the opening day of TechFest -- when Microsoft allows some of its press pals and invited guests to get a preview before closing the doors and turning the event into an employee-only one -- Microsoft showed off some additional Kinect-centric research projects. (Note: As I am not at the event, I can only link to Microsoft's descriptions and photos.)
Additional Microsoft Research projects using Kinect:
New webcam hardware/software prototypes integrating the Kinect sensor: Microsoft researchers showed off a prototype webcam with a much wider viewing angle than traditional webcams, one that can capture stereoscopic video and high-accuracy depth images simultaneously. "Users can chat with stereoscopic video. Accurate depth-image processing can support not only all Kinect scenarios on a PC, but also a gesture-control user interface without a touch screen," according to Microsoft's write-up. "Besides computer vision, the webcam includes a hardware accelerator and a new image-sensor design. The cost of the design is similar to that of current webcams, and the webcam potentially could be miniaturized as a mobile camera," the Softies added.
Beamatron: Another augmented-reality concept that combines a projector and a Kinect camera on a pan-tilt moving head. The moving head can place the projected image almost anywhere in a room, while the depth camera enables the correct warping of the displayed image for the shape of the projection surface. How could this be used in the real world? "A projected virtual car can be driven on the floor of the room but will bump into obstacles or run over ramps," the Redmondians wrote.
SpatialEase: An Xbox 360 Kinect game for learning the language of space using 'embodied' learning that connects language with thought and action. From the description: "The learner must quickly interpret second-language commands, such as the translation of 'move your left hand right,' and move his or her body accordingly."
Kinect in the dark: "Kinect technology can open up new interactions in the dark, for example helping us to 'feel' an invisible shape through sound feedback," Microsoft researchers explained. There's an accompanying dimly lit video clip that is meant to highlight what's going on with this research.
Shake n' Sense: A Microsoft Cambridge project that looks to mitigate interference when two or more Kinect cameras point at the same scene. It makes use of mechanical augmentation of the Kinect and doesn't require modification of the Kinect's firmware, host software or internal hardware.
While Kinect and natural-user-interface technology are the darlings of Microsoft Research brass these days, Microsoft also is showing off a number of non-Kinect-based projects at TechFest this week. Among them are several projects making use of new search techniques. There also are a few Azure-based demos, including something Microsoft is describing as "Bing-enabled Azure data services for the enterprise."
From Microsoft's write-up of this Bing/Azure/services mash-up:
This project "identifies key Azure data services that have the potential to be widely useful for enterprises by leveraging the combination of Bing data assets, the Microsoft cloud computing infrastructure and deep data analytics. To bring home the opportunities, the project shows how Microsoft’s enterprise software can leverage these data services, and illustrates Bing-enabled enhancements that SharePoint Search and Microsoft Office products and services can potentially leverage."