
S'pore tops multimedia search contest

Intelligent, "self-learning" multimedia search and retrieval system helps team from National University of Singapore emerge winner of US$100,000 cash prize.
Written by Vivian Yeo, Contributor

SINGAPORE--A team from the National University of Singapore (NUS) has beaten teams from 17 countries in a global contest showcasing multimedia search technologies.

For winning the Star Challenge 2008 on Thursday, the team, comprising three researchers from the NUS Lab for Media Search, walked away with a US$100,000 cash prize and the honor of having built the most efficient multimedia retrieval technology.

Launched in January this year, the Star Challenge attracted 56 teams, which were whittled down to five after three elimination rounds held from June. Apart from NUS, teams from China's Peking University, France's Laboratoire d'Informatique de Grenoble, Japan's National Institute of Informatics and the United States' University of Illinois at Urbana-Champaign made it to the final round. The Star Challenge was organized by Singapore's A*Star (Agency for Science, Technology and Research).

During the final, the five finalists had to navigate through an island in virtual world Second Life, solving puzzles to obtain clues to perform five main search tasks--two audio-related, two video-related and one with elements of both--within a time limit of two hours.

Speaking to ZDNet Asia before the commencement of the final, Victor Goh, leader of the NUS team, said his team focused on interactivity and "self-learning" for their multimedia search engine--allowing users to provide feedback to the system, which then automatically incorporates the feedback and makes the necessary adjustments.

"We're trying to incorporate artificial intelligence--let the system self-learn [by capturing] user intention and context," explained Goh. "When the user goes through two rounds of 'feedback', we'll roughly know what kinds of features, concepts, ideas or things the user is looking for, and the system will learn by itself to go toward that direction."

According to Goh, the trio had put in extra hours in the lab after work in the 10 months since the global competition was launched. They studied search engines from Google, Yahoo and YouTube to determine how searches could be made more accurate and the technology more user-friendly.

"The flaw or drawback of these systems is they do not provide an interactive search for the users to narrow down to what they want. It's not user-friendly enough; you still need to browse through a lot of links to get what you want," Goh said. Google, for instance, allows users to "find similar pages" but that capability is text-based, he pointed out, adding that such engines lack understanding of video content.

To make their technology more relevant, Goh's team also monitored the types of content people uploaded to YouTube. This helped them gauge where users' interests lay, for inclusion in the system.

According to A*Star, current search engines are text-based and can locate only multimedia material that has been tagged in text. Tagging is laborious and therefore not routinely done; even when content has been tagged, there may be inaccuracies or inconsistencies.

The Challenge provided a platform for international collaboration in the area of multimedia search, Vivek Singh, chairman of the organizing committee for Star Challenge, told ZDNet Asia in an interview. It is also a step forward in the direction of making such technologies mainstream, Singh noted.

However, he pointed out that efforts were "still very much research at this point", and it will likely take another five years before basic-level multimedia search is available commercially. "[Multimedia search] is many times more complex than [text-based search], and text itself took over 10 years to mature," he said.

Goh noted that the NUS team currently has plans to hold on to the intellectual property associated with the project, but is open to the possibility of working with companies such as Google and Microsoft.
