
DARPA's 'Social Puppet'

Game designers from the University of Southern California (USC) have developed 'Social Puppet,' a computer engine to "help soldiers learn unfamiliar languages by interacting with animated characters." For this project, financed by DARPA, the researchers have given on-screen characters human non-verbal communication behaviors.
Written by Roland Piquepaille

Videogame creators make heavy use of software that animates objects or characters without reprogramming them between scenes. Now, game designers from the University of Southern California (USC) have developed 'Social Puppet,' a computer engine to "help soldiers learn unfamiliar languages by interacting with animated characters." For this project, financed by DARPA, the researchers drew on their expertise from previous videogames used by the armed forces, such as "Tactical Iraqi." But those earlier games focused on teaching language and customs, while 'Social Puppet' gives on-screen characters human non-verbal communication behaviors.

The 'Social Puppet' project is the brainchild of Hannes Högni Vilhjálmsson of the USC Information Sciences Institute, who designed the game.

Below are two images from the 'Tactical Iraqi' game. In this first one, the character shown doesn't look very friendly (Credit: USC Information Sciences Institute).

Social Puppet: a hostile individual

But in this second one, it looks as though he's welcoming you. And this change of attitude is obtained simply by pushing a button on the game designer's screen (Credit: USC Information Sciences Institute).

Social Puppet: a friendly individual

So how does this work?

On the screen, computer-generated game characters shrug, wink, nod, wave, or cross their arms with skeptical hostility as they follow your every move with attentive gaze. A University of Southern California-developed system module called "Social Puppet" is pulling the strings.
Once a given character is designed, a simple set of standard commands orchestrates a whole range of non-verbal expressions of mood and attention. The same commands work for any other character in the game. "Human communication is only partly verbal," says Vilhjálmsson.
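
The article doesn't publish the actual command set, but the idea of a character-independent command layer can be sketched as follows. This is a hypothetical illustration only, assuming each character registers its own animation clips for standard commands such as greeting, nodding, and shrugging, with an attitude switch playing the role of the designer's button shown in the screenshots above.

# Hypothetical sketch (not the actual Social Puppet API): a character-agnostic
# command layer. Each high-level command names a non-verbal behavior, and each
# character maps it onto its own animation clips, so the same commands work
# for any character in the game.

ATTITUDES = {"friendly", "hostile"}

class Character:
    def __init__(self, name, clips):
        self.name = name
        self.clips = clips          # "command" or "command.attitude" -> clip id
        self.attitude = "friendly"  # toggled with one call, like the designer's button

    def set_attitude(self, attitude):
        assert attitude in ATTITUDES
        self.attitude = attitude

    def perform(self, command):
        # Prefer a clip tuned to the current attitude, fall back to a neutral one.
        clip = self.clips.get(f"{command}.{self.attitude}") or self.clips.get(command)
        if clip is None:
            raise KeyError(f"{self.name} has no animation for '{command}'")
        print(f"{self.name}: playing clip '{clip}'")

# The same command sequence drives any character that registers its clips.
elder = Character("elder", {
    "greet.friendly": "wave_warm",
    "greet.hostile": "arms_crossed",
    "nod": "nod_slow",
    "shrug": "shrug_quick",
})

elder.set_attitude("hostile")
elder.perform("greet")   # arms crossed, skeptical
elder.set_attitude("friendly")
elder.perform("greet")   # welcoming wave
elder.perform("nod")

The point of such a layer is that a change like the hostile-to-friendly switch in the images above becomes a one-line state change rather than a re-animation job.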

For more information, you should read this longer news release, which provides additional details.

"The sciences that study human social behavior have discovered a lot of regularities that we can program into virtual characters as baseline behavior. It has always been a dream of mine to take that literature and make it really accessible and useful to the game programmer," says Vilhjálmsson.
He gives examples of the kind of standard gestures he means: "Animating gaze as a function of turn-taking, or animating a reaction to another's attempt to make eye contact based on attitude and availability.
"These are things we take for granted in the real world, but have to be programmed into virtual characters," he continues. "So why do that from scratch every time when we can exploit and implement models that exist in the literature? As computer games become more socially interactive," he concludes, "there will be an increasing demand for 'plug-in' AI engines that orchestrate believable social performances."

And for even more information about this subject, you should also read The Tactical Language Training System (PDF format, 8 pages), which was presented at the First Conference on Artificial Intelligence and Interactive Digital Entertainment, held June 1-3, 2005, in Marina del Rey, California.

Sources: Eric Mankin, University of Southern California news release, February 15, 2006; and various web sites

