With 'people-aware' machines, human behavior benefits

Tanzeem Choudhury wants machines to better understand humans. With a clearer picture of how we interact with the world, our cell phones could actually help us recognize ways to improve our lives (more exercise!).
Written by Christina Hernandez Sherwood, Contributing Writer

Below are excerpts from my recent interview with Choudhury, an assistant professor of computer science at Dartmouth College.

Your work focuses on building computer systems that are aware of humans. What does that mean and why is it important?

It focuses on our behavior and how we engage with the rest of the world. It starts with very basic things: where we spend our time, how physically active we are, how much conversation we have. Research has looked at location-aware computing. Beyond that, [we're interested in knowing] what people are doing and where, how they engage with the people around them, how they fit into their environment. It's the whole spectrum of how we as people live our lives. One of our recent areas of focus has been the health domain. There is evidence that lifestyle affects health outcomes, both physical and emotional well-being. It's hard for [health providers] to access large-scale information about people's lifestyles and connect it with health. That's one example of why it would be useful to get data about people and their behavior.

We're interested in delving into education. We've started to look at new types of sensors to try to capture some simple brain activation in a mobile setting. We're hoping to find out whether, in a classroom setting, we can tell from students' behavior and brain signals whether they are understanding the material being delivered. We're trying to open this up so we can observe people in their own environment for a long period of time. Ultimately, the goal is not only to make this accessible to health care providers or educators, but also to create tools that give people control. They would allow people to understand how their behavior affects different parts of their life and help them make the changes they'd like to make.

Talk more about the technology that would give people information on their behaviors.

That's something we're actively pushing. In health, other researchers and I are looking at how giving people continuous information about their activities [leads them to] actually exercise more. We're trying to extend that. Your phone is with you all the time; it can collect information about you and give you feedback instantly. We're trying to deliver that feedback in multiple stages. If you know you're gradually engaging more with the people around you, or becoming more isolated, just the awareness might help you make the right changes. It also lets you see how you compare against your peer group. Are you doing as well as you should be?

It might also give you early indicators of things that could go wrong. There are behavioral cues when people get sick. If you can show those to users, they might be able to take action sooner. They could be proactive about living a healthier life, as opposed to getting treatment once they're sick. It's giving you that link. We do a lot of things naturally, and it can be hard to dissect them, to say: because of my work, I've cut down on how much exercise I do. Those things are hard to notice. But if you pull them apart and present them in a form people can understand and act on, it can be very useful.

It's almost like Mint, but instead of keeping tabs on your personal finances, it keeps track of your daily activities.

It's a good analogy. It's encouraging, because if you give people the information, they'll do the right thing. A former colleague of mine, who used some of the systems I worked on to do a study on exercise, saw that people stuck to their goals more if they had feedback. We did some studies looking at both physical and social well-being. If you give people feedback, not only are they better in terms of their wellness scores, they're also better at knowing how their behavior is affecting things. They recall and connect life events. [A participant] was busy and stopped going to the gym, and became less socially engaged. The feedback helps people uncover things we might not directly detect. It might allow them to connect to other events in their life that we're not measuring.

How does this relate to your work in reality mining?

They're connected. Reality mining has more of a focus on networks of people: you're mining interactions among people. In organizational settings, how likely are people to buy from or engage with your company? Who is the dominant person in an organization, the one with the most influence? Some of my early work in that area was basically the first system that used sensors to build social networks automatically. We were able to show that you can link low-level observed behavior to network structure. We looked at how people change the way they talk when they come together, and we established a connection between those speaking patterns and people's positions in the network.

People-Aware Computing, my group now, is not limited to social networks. It broadens what you can do. We look at individuals, groups, and social networks, and at how one can benefit from the other. There is a connection in both directions.

What are you working on now with People-Aware Computing?

We wanted to continuously study people's behavior using systems that could scale. For that, we looked at mobile systems, and cell phones in particular. Cell phones have location, sound, images, and motion. It's the computer understanding the world the way we do.
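To make the idea concrete, here is a rough sketch of the kind of inference a phone's motion sensor makes possible: summarizing the variability of the accelerometer signal over a short window and thresholding it to guess whether the carrier is still, walking, or running. This is a hypothetical illustration, not the group's actual pipeline, and the threshold values are made up for the example.

```python
import math
from statistics import stdev

def activity_from_accel(samples, still_thresh=0.3, walk_thresh=2.0):
    """Guess an activity label from raw 3-axis accelerometer samples.

    samples: list of (x, y, z) readings in m/s^2 over a short window.
    The thresholds here are illustrative, not calibrated values.
    """
    # Magnitude of each reading, with gravity (~9.8 m/s^2) removed.
    magnitudes = [math.sqrt(x * x + y * y + z * z) - 9.8 for x, y, z in samples]
    # The variability of the signal distinguishes rest from movement:
    # a phone lying still produces a nearly constant magnitude.
    spread = stdev(magnitudes)
    if spread < still_thresh:
        return "still"
    elif spread < walk_thresh:
        return "walking"
    return "running"
```

A real system would use richer features (frequency content, step periodicity) and a trained classifier, but even this crude summary separates rest from motion.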

We've started to explore what else we could do. One area is portable sensors. Companies are building EEG headsets that measure brain activation. We have an early prototype where we used such a headset for a phone-dialing application. It would flash pictures; if you saw the picture of the person you wanted to call, it would call automatically. It would get the signal from your EEG and dial. Rajeev Raizada, a cognitive neuroscientist at Dartmouth, was one of the key collaborators on the project. [Here's a video of the brain-to-mobile-phone interface.]

In neuroscience, there have been fMRI findings showing that individuals in a storytelling situation exhibit synchrony between their brain signals, and the level of synchrony indicates how well they understood the story. We'd like to apply this to find out how much students in a classroom setting understand the material being conveyed.
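One simple way to quantify the synchrony described here, assuming each listener's brain activity has been reduced to a single time series, is the Pearson correlation between two listeners' signals: values near 1.0 mean the signals rise and fall together. This is a generic sketch of the measure, not the specific analysis used in the fMRI studies.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length time series."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    da = [x - mean_a for x in a]
    db = [y - mean_b for y in b]
    cov = sum(x * y for x, y in zip(da, db))
    return cov / math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))

def synchrony(signal_a, signal_b):
    """Score near 1.0: the two listeners' signals track each other closely.
    Score near 0: no linear relationship between the signals."""
    return pearson(signal_a, signal_b)
```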

Another project was in the health space. Our study population was around 80 years old. Using these mobile sensors, we looked at how much they walk, whether they take the stairs, how much they talk to each other. We found that doing strenuous activities and walking on a regular basis correlated with how physically fit people were according to their doctor. None of these individuals had been diagnosed with any mental health problems, but our sensor measurements flagged an outlier who was clinically depressed. One of the problems in the mental health space is that early detection is hard to do. If there are ways of detecting that early, it might help in the management of mental health [issues].

We're looking for the right way to convey [feedback] to individuals, so they can make changes that are good for them. We want to give people subtle nudges that lead to behavioral changes and a better quality of life. At the doctor's office, people are given recommendations; this would deliver them at the moment people really need them.

[My two main collaborators are Dr. Ethan Berke at the Dartmouth Medical School and Andrew Campbell, a professor in the college's computer science department.]

What's the most challenging aspect of your work?

From the technology side, we know how to detect simple behaviors across lots of people. But people are variable, and the challenge is dealing with that variability. When we were detecting walking in elderly individuals, their walking was significantly different from how 20- and 30-year-olds walk. To deal with this, we look for patterns in groups of people. Although people are different, there are similarities we can take advantage of to make our systems more accurate and robust. How can computers take all this data and automatically figure out how to build systems that work across groups? We want to do it invisibly.
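One common way to handle this kind of variability, offered here as a generic sketch rather than the group's actual method, is to normalize each person's measurements against their own baseline before applying a shared model, so that "active for this person" becomes comparable across an 80-year-old and a 20-year-old:

```python
from statistics import mean, stdev

def normalize_per_person(readings_by_person):
    """Convert each person's raw readings into z-scores relative to
    their own baseline, so values are comparable across people.

    readings_by_person: dict mapping a person id to a list of raw
    measurements (e.g. daily step counts). Requires >= 2 readings each.
    """
    normalized = {}
    for person, readings in readings_by_person.items():
        mu, sigma = mean(readings), stdev(readings)
        # Each value becomes "how unusual is this for this person?"
        normalized[person] = [(r - mu) / sigma for r in readings]
    return normalized
```

After this step, a shared model sees the same scale for everyone: a quiet day looks like a quiet day whether the person's raw activity level is high or low.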

You want to feed data back to users, but just feeding back data is not useful. It might be intriguing for a while, but you want to do it in a way that motivates and persuades people to make behavioral changes. That's a challenge. In psychology, if you're trying to persuade someone, telling them something positive works better than something negative. It's about finding the key changes that will be beneficial for the individual.

Photo: Tanzeem Choudhury

This post was originally published on Smartplanet.com
