News Release

I2R Known-item-Search technology takes top honors at US standards annual TRECVID conference

Team from Institute for Infocomm Research emerges first in Known-item-Search task at National Institute of Standards and Technology TRECVID Conference 2010

Grant and Award Announcement

Agency for Science, Technology and Research (A*STAR), Singapore

Ever wished you could jump straight to a particular scene while searching through numerous video clips? Or quickly find that favourite part of a clip you were hunting high and low for? Now scale that search up to 8,000 videos and you can see how difficult it becomes. Not so, if you are using the Known-item-Search technology from I²R.

A team from A*STAR's Institute for Infocomm Research (I²R) obtained the top scores, beating 49 participating global teams and emerging best among the final 15. The final 15 included teams from the National University of Singapore (NUS), Carnegie Mellon University, City University of Hong Kong and Dublin City University.

The main goal of TREC Video Retrieval Evaluation (TRECVID) is to promote progress in content-based analysis of and retrieval from digital video via open, metrics-based evaluation. TRECVID is a laboratory-style evaluation that attempts to model real world situations or significant component tasks involved in such situations.

The I²R team, assembled in less than four months, had to search through 8,000 videos and handle the 300 text queries used for evaluation. For each query, there is only one correct answer video. The search covered two categories, automatic and interactive. The queries were complex, often long, and could contain errors, as in real-world scenarios.

Professor Lye Kin Mun, Executive Director, I²R said, "I2R has always been looking out for opportunities to benchmark our technologies through participation in global competitive events. It is through taking up such challenges that we can discover our strengths and learn from our weaknesses, eventually delivering technologies that are globally marketable and internationally acclaimed."

In the automatic search category, I²R returned results with a mean inverted rank (see ANNEX A) of 0.454 in only 0.001 minutes per query, while in the interactive search category, the team achieved a mean inverted rank of 0.727, taking only 1.442 minutes per query.
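Mean inverted rank averages the reciprocal of the position at which the single correct video appears in each query's result list, so a perfect system scores 1.0. The sketch below is a minimal illustration of how such a score could be computed; the function name and the example ranks are hypothetical, not the team's actual evaluation data or code.

```python
def mean_inverted_rank(ranks):
    """Average of 1/rank of the correct video per query.

    ranks: list with one entry per query; the entry is the position
    (1-based) of the correct video in the results, or None if the
    correct video was not returned (contributing 0 to the average).
    Hypothetical helper for illustration only.
    """
    return sum(1.0 / r if r else 0.0 for r in ranks) / len(ranks)

# Hypothetical example: the correct video was found at ranks 1, 2 and 4
# across three queries, and missed entirely on a fourth.
score = mean_inverted_rank([1, 2, 4, None])
print(round(score, 4))  # (1 + 0.5 + 0.25 + 0) / 4 = 0.4375
```

Higher ranks for the correct video drive the score toward 1.0, which is why the interactive category (where a human can refine the search) scored 0.727 against 0.454 for fully automatic search.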

The I²R team, led by Dr Lekha Chaisorn, comprised researchers from its Signal Processing (SP) and Computer Vision and Image Understanding (CVIU) departments. Its seven members were Mr Wan Kong Wah, Dr Zheng Yan-Tao, Dr Zhu Yongwei, Mr Kok Tian Shiang, Ms Tan Hui Li, Mr Fu Zixiang, and Ms Susanna Bolling.

###
