This useless app knows if you're sad (and that's a game changer)

Researchers are giving away powerful facial image analysis software for free
Written by Greg Nichols, Contributing Writer


One of my favorite SNL fake commercials is for a product called the Home Headache Test. "I feel fine," says a concerned Kevin Nealon, "but what if I have a really bad headache and don't know it?" The handy HHT lets him know where he stands.

I couldn't help thinking of that skit while trying out an app called IntraFace from the Human Sensing Lab at Carnegie Mellon University. The app uses an iOS or Android device's front-facing camera and some nifty facial tracking technology to tell you how you're feeling. A real-time, applause-o-meter-style readout scores you across five categories: sad, disgust, surprise, neutral, and happy.
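To picture how such a readout works, here is a minimal, purely illustrative sketch: it assumes a tracker emits a per-category score between 0 and 1 for each video frame (the function name and score values are hypothetical, not IntraFace's actual API) and renders them as text bars.

```python
# Hypothetical sketch of an applause-o-meter-style readout.
# Assumes an emotion tracker emits one score in [0.0, 1.0] per
# category per frame; the labels mirror the app's five categories.

CATEGORIES = ["sad", "disgust", "surprise", "neutral", "happy"]

def readout(scores, width=20):
    """Render one text bar per category, scaled to its score."""
    lines = []
    for label, score in zip(CATEGORIES, scores):
        bar = "#" * round(score * width)
        lines.append(f"{label:>8} |{bar}")
    return "\n".join(lines)

# Example frame: mostly happy, a little surprised.
print(readout([0.05, 0.02, 0.10, 0.23, 0.60]))
```

A real app would refresh these bars on every camera frame; the sketch only shows the rendering step.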

You can play with IntraFace for free by downloading it from iTunes or Google Play.

The app is useless except as a fun demonstration and savvy PR tool for the Human Sensing Lab. But the facial image analysis technology that powers it, developed by a team led by the awesomely named Fernando De la Torre, an associate professor at CMU's Robotics Institute, is incredibly powerful and broadly applicable in a near future in which machines respond to human emotion and anticipate human needs.


The Human Sensing Lab has made the IntraFace software available for download free of charge, the latest offering in a promising trend keeping robotics open source and collaborative. What developers will make of the technology is anyone's guess, but potential applications include monitoring the emotional state of patients and detecting when a speaker is losing an audience's attention. Further down the line, the technology could enable personal assistant robots to respond accurately to users' needs at home or in care facilities.

"IntraFace provides a breakthrough in facial feature tracking that simplifies the problem of facial image analysis, working rapidly, accurately and with such efficiency that it can run on most smartphones," says De la Torre. "Now it's time to develop new applications for this technology. We have a few of our own, but we believe there are lots of people who may have even better ideas once they get their hands on it."


Researchers at Duke University have already incorporated IntraFace into a research app that will test the reliability of facial expression analysis as a screening tool for autism.

Facial analysis is not new, of course. What distinguishes the CMU software is that it's both accurate and fast. It occupies less memory than previous methods and requires less power to run, making it suitable for a wide range of platforms, including smartphones and embedded systems such as robots. The early focus on reading emotion is also telling: the inability to do so reliably in real time has been a notable capability gap slowing development of facial image analysis applications until now.
