
Eve, the virtual math teacher

New Zealand computer scientists have developed Eve, an affective tutoring system (ATS) which can adapt its response to the emotional state of children by interaction through a computer system. The researchers say their teaching system, dubbed 'Easy with Eve,' is the first of its type and add they 'wanted to create a virtual teacher capable of reading and understanding body language and facial expressions to ensure that it has the attention of students.' Today, Eve teaches mathematics, but this 'robotic' intelligent system could be adapted to other situations and become an important development in the $25 billion e-learning market.
Written by Roland Piquepaille

You can see above three pictures of Eve, an affective tutoring system (ATS), in action. From left to right, she shows no expression, smiles, and speaks while pointing at a board. (Credit: Massey University) If you prefer animation, here is a link to a short video of Virtual Eve introducing herself (QuickTime format, 59 seconds, 2.20 MB).

This project has been led by Dr Hossein Sarrafzadeh, Senior Lecturer in Computer Science at the Auckland-based Institute of Information and Mathematical Sciences (IIMS). He is also the leader of the Next Generation Intelligent Tutoring Systems project (NGITS). Some of his collaborators include Chao Fan and Sam Alexander (the above picture comes from his NGITS page).

What inspired the development of this system? "Because one-to-one teaching is known to be the most effective teaching method, Dr Sarrafzadeh says the researchers wanted to create a virtual teacher that could pick up body language and facial expressions -- like a real teacher -- to interact and to ensure they are holding the attention of students. He says the realisation that software systems would significantly improve performance if they could adapt to the emotions of the user has spawned research and development in the field of affective or intelligent tutoring systems."

And how did the researchers build their virtual tutor? "Linked to a child via computer, the animated character or virtual tutor can tell if the child is frustrated, angry or confused by the on-screen teaching session and can adapt the tutoring session appropriately. The animated Eve (with a human-sounding voice) can ask questions, give feedback, discuss questions and solutions and show emotion. To develop the software for this system the Massey team observed children and their interactions with teachers and captured them on thousands of images."
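At its simplest, the detect-and-adapt loop described above maps a recognised emotional state to a tutoring response. The sketch below illustrates that idea only; the state names and responses are hypothetical placeholders, not the actual mappings the Massey team derived from their observations of human tutors.

```python
# Hypothetical emotional states the article says Eve can recognise.
STATES = ("neutral", "frustrated", "angry", "confused")

# Hypothetical mapping from a detected state to a tutoring adaptation;
# the real system's responses come from observed human-tutor behaviour.
RESPONSES = {
    "neutral": "continue with the current exercise",
    "frustrated": "offer a hint and encouragement",
    "angry": "pause and switch to an easier problem",
    "confused": "re-explain the step with a worked example",
}

def adapt(detected_state: str) -> str:
    """Pick a tutoring action for the detected emotional state."""
    # Fall back to the neutral response for anything unrecognised.
    return RESPONSES.get(detected_state, RESPONSES["neutral"])

print(adapt("confused"))  # -> re-explain the step with a worked example
```

In practice the mapping would also depend on the tutoring context (what problem the child is working on, how many attempts they have made), not on the emotional state alone.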

The latest research work about this project has been published in Computers in Human Behavior, an Elsevier journal, under the name "'How do you know that I don't understand?' A look at the future of intelligent tutoring systems" (Available online since September 14, 2007). Here is a quote from the abstract. "This paper presents research leading to the development of Easy with Eve, an ATS for primary school mathematics. The system utilises a network of computer systems, mainly embedded devices to detect student emotion and other significant bio-signals. It will then adapt to students and displays emotion via a lifelike agent called Eve. Eve’s tutoring adaptations are guided by a case-based method for adapting to student states; this method uses data that was generated by an observational study of human tutors. This paper presents the observational study, the case-based method, the ATS itself and its implementation on a distributed computer systems for real-time performance, and finally the implications of the findings for Human Computer Interaction in general and e-learning in particular."
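The "case-based method" mentioned in the abstract is a form of case-based reasoning: given the current student situation, retrieve the most similar recorded situation from the observational study and reuse the human tutor's action. A minimal sketch of that retrieval step follows; the case features, the distance measure, and the tiny hand-made case base are all assumptions for illustration, standing in for the data the Massey team collected.

```python
from dataclasses import dataclass

@dataclass
class Case:
    # Features of a tutoring situation (feature names are hypothetical):
    emotion: str                # detected emotional state
    last_answer_correct: bool   # was the student's last answer right?
    hints_given: int            # hints already given on this problem
    action: str                 # what the human tutor did in this situation

# A tiny hand-made case base standing in for the observational-study data.
CASE_BASE = [
    Case("confused", False, 0, "re-explain and give a hint"),
    Case("confused", False, 2, "show the full worked solution"),
    Case("frustrated", False, 1, "encourage and simplify the problem"),
    Case("neutral", True, 0, "praise and move to the next problem"),
]

def distance(c: Case, emotion: str, correct: bool, hints: int) -> int:
    """Simple mismatch count between a stored case and the current situation."""
    return ((c.emotion != emotion)
            + (c.last_answer_correct != correct)
            + abs(c.hints_given - hints))

def retrieve_action(emotion: str, correct: bool, hints: int) -> str:
    """Return the action recorded in the nearest stored case."""
    best = min(CASE_BASE, key=lambda c: distance(c, emotion, correct, hints))
    return best.action

print(retrieve_action("confused", False, 0))  # -> re-explain and give a hint
```

A real case base would hold many more cases with richer features, and the retrieved action would then be adapted to the current problem rather than replayed verbatim.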

If you want to read the full article, be prepared to pay $30. But here is Sarrafzadeh's conclusion -- for free. "When we interact with people we expect them to take note of our feelings and reactions. Soon we will be able to expect the same from a computer."

Sources: Massey University News, New Zealand, November 16, 2007; and various websites
