Most of the projects Microsoft is highlighting at TechFest 2011 focus on gestures, touch, computer vision and speech as ways to interact with PCs and other computing devices.
While the Kinect sensor for Xbox is Microsoft's most celebrated example of commercializing its early NUI work, the company's NUI researchers are also paying quite a bit of attention to the intersection of NUI and healthcare. I blogged earlier this year about how Microsoft is thinking about incorporating Surface 2, Xbox, Xbox Live and Kinect sensors into healthcare applications. A number of Microsoft Research projects are exploring the NUI-health connection.
One of these projects, known as InnerEye, focuses on "the automatic analysis of patients' (medical) scans using modern machine learning techniques," such as semantic navigation and visualization. The Microsoft researchers on the project are working with the Microsoft Amalga team on InnerEye, which is one of the demos being showcased at TechFest 2011.