
Culture Mining: Describing and Searching Audiovisual Content

A collaborative project to research how audio and video content online can be categorised, searched for and retrieved by users.

Composite image, Research and Development 2005–7: Julia Kristeva © Tate Photography; Olafur Eliasson, The Weather Project © Olafur Eliasson, photo © Tate, London, 2003; Grayson Perry, photo © John Napier, courtesy Victoria Miro Gallery, London; Yinka Shonibare, still from Un Ballo in Maschera (A Masked Ball), 2004, commissioned for Moderna Museet, Stockholm, produced by Moderna Museet and Sveriges Television, courtesy Stephen Friedman Gallery, London.

A large volume of audio-visual content online

Artists, museums and the heritage sector are creating ever-increasing amounts of audio-visual content. One of the biggest challenges facing the museum and heritage sector over the coming years will be how to manage that content and distribute it to the public and to researchers.

Tate is working in collaboration with the Department of Computing at Goldsmiths College, University of London, and the Goldsmiths Leverhulme Media Research Centre, as part of the Metadata Project, to produce an open-source application for tagging, searching and retrieving audio-visual content online.

Intuitive and collaborative tagging and searching

The application aims to demonstrate the potential for an intuitive search engine that allows new forms of tagging and searching audio-visual content: time-based, intensity-based and collaborative. Users will be able not only to tag a video as a whole, but to tag any moment within a video or audio file. They will also be able to set the intensity of a tag: a moment in a video or audio file can therefore be described as, for example, ‘very much about Jean-Luc Godard’ or ‘a little bit about Jean-Paul Belmondo’. Users will tag, and set the intensity of their tags, in collaboration with other users, along the lines of a wiki for tags.
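
As a rough illustration only, and not the project's actual design, a moment-level, intensity-weighted, multi-user tag could be modelled along the following lines in Python. Every name in this sketch (Tag, TagWiki, the 0–1 intensity scale, the five-second matching window, the file name) is an assumption made for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class Tag:
        """One user's tag on a single moment of an audio or video file."""
        media_id: str      # which video or audio file the tag belongs to
        label: str         # the tag itself, e.g. "Jean-Luc Godard"
        time_s: float      # the tagged moment, in seconds from the start
        intensity: float   # 0.0 ("a little bit about") to 1.0 ("very much about")
        user: str          # the user who applied the tag

    @dataclass
    class TagWiki:
        """Collects many users' tags so intensities can be negotiated, wiki-style."""
        tags: list[Tag] = field(default_factory=list)

        def add(self, tag: Tag) -> None:
            self.tags.append(tag)

        def consensus_intensity(self, media_id: str, label: str,
                                time_s: float, window_s: float = 5.0) -> float:
            """Average intensity that all users assign to `label` near `time_s`."""
            near = [t.intensity for t in self.tags
                    if t.media_id == media_id and t.label == label
                    and abs(t.time_s - time_s) <= window_s]
            return sum(near) / len(near) if near else 0.0

    # Two users tag nearby moments of the same (hypothetical) file with
    # different convictions; the wiki averages them into a consensus.
    wiki = TagWiki()
    wiki.add(Tag("interview.mp4", "Jean-Luc Godard", time_s=754.0, intensity=0.9, user="a"))
    wiki.add(Tag("interview.mp4", "Jean-Luc Godard", time_s=756.0, intensity=0.5, user="b"))
    print(wiki.consensus_intensity("interview.mp4", "Jean-Luc Godard", 755.0))  # 0.7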

Users will also be able to search content in fine-grained ways, jumping directly to relevant moments and seeing which moments other users felt passionate about. Topic-specific heat maps and time-dynamic tag clouds will provide a new experience of using audio-visual content. Tagging and retrieval will be unified in one intuitive user experience.
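
Continuing the same hypothetical sketch, and building on the Tag class above, a topic-specific heat map could be derived by summing tag intensities into fixed time bins and ranking the bins; the ten-second bin width and the function names are again assumptions for illustration.

    from collections import defaultdict

    def heat_map(tags: list[Tag], media_id: str, label: str,
                 bin_s: float = 10.0) -> dict[int, float]:
        """Total tag intensity for `label` in each `bin_s`-second bin of the file."""
        bins: dict[int, float] = defaultdict(float)
        for t in tags:
            if t.media_id == media_id and t.label == label:
                bins[int(t.time_s // bin_s)] += t.intensity
        return dict(bins)

    def hottest_moments(tags: list[Tag], media_id: str, label: str,
                        top_n: int = 3, bin_s: float = 10.0) -> list[tuple[float, float]]:
        """(start time in seconds, total intensity) of the bins users felt most strongly about."""
        ranked = sorted(heat_map(tags, media_id, label, bin_s).items(),
                        key=lambda kv: kv[1], reverse=True)
        return [(idx * bin_s, heat) for idx, heat in ranked[:top_n]]

    # e.g. hottest_moments(wiki.tags, "interview.mp4", "Jean-Luc Godard")
    # -> [(750.0, 1.4)]: both tags above fall in the 750–760 second bin.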

A user-centred tool

Ultimately, the project aims to develop a user-centred tool that will allow audiences and academics to quickly tag, search, retrieve and play results drawn from large volumes of long-play content, as well as collectively negotiate their meaning. 
