Culture Mining: Describing and Searching Audiovisual Content

November 2005 – June 2010

A collaborative project to research how audio and video content online can be categorised, searched for and retrieved by users.

Research and Development Composite

Research and Development 2005–7: Julia Kristeva © Tate Photography.
Olafur Eliasson, The Weather Project © Olafur Eliasson. Photo © 2003 Tate, London.
Grayson Perry, Photo © John Napier, courtesy Victoria Miro Gallery, London.
Yinka Shonibare, Still from Un Ballo in Maschera (A Masked Ball) © 2004. Commissioned for the Moderna Museet, Stockholm. Produced by Moderna Museet and Sveriges Television. Courtesy Stephen Friedman Gallery, London.

A large volume of audio-visual content online

Artists, museums and the heritage sector are creating ever-increasing amounts of audio-visual content. One of the biggest issues facing the museum and heritage sector over the coming years will be how to manage that content and make it available both to the public and to researchers.

Tate is working in collaboration with Goldsmiths College, University of London, Department of Computing and the Goldsmiths Leverhulme Media Research Centre, as part of the Metadata Project, to produce an open-source application for tagging, searching and retrieving audio/video content online.

Intuitive and collaborative tagging and searching

The application aims to demonstrate the potential for an intuitive search engine that allows new forms of tagging and searching audio-visual content: time-based, intensity-based and collaborative. Users will not only be able to tag videos as full entities, but tag any moment in a video or audio file. They will also be able to set the intensity of a tag: a moment in a video or audio file can therefore be described as, for example, ‘very much about Jean-Luc Godard’ or ‘a little bit about Jean-Paul Belmondo’. Users will tag and set the intensity of the tags in collaboration with other users, along the lines of a wiki for tags. 
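The tagging model described above can be sketched as a simple data structure: each tag attaches a term, with an intensity, to a particular moment of a media file, and the collaborative, wiki-like value of a tag is negotiated across all users' contributions. This is an illustrative sketch only, not the project's actual implementation; all names (`Tag`, `TagStore`, the averaging rule) are hypothetical, and here collaboration is modelled simply as the mean of user intensities.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class Tag:
    """One user's tag on one moment of a media file.

    intensity runs from 0.0 ('a little bit about') to 1.0
    ('very much about'), as described in the text.
    """
    media_id: str
    time_secs: float   # the tagged moment within the video/audio file
    term: str          # e.g. 'Jean-Luc Godard'
    intensity: float   # 0.0 .. 1.0
    user: str


class TagStore:
    """Collects tags from many users.

    The collaborative intensity of a (media, moment, term) triple is
    taken to be the mean of all users' contributions -- a stand-in for
    the wiki-like negotiation the project describes.
    """

    def __init__(self):
        # (media_id, time_secs, term) -> list of user intensities
        self._tags = defaultdict(list)

    def add(self, tag: Tag):
        key = (tag.media_id, tag.time_secs, tag.term)
        self._tags[key].append(tag.intensity)

    def intensity(self, media_id, time_secs, term):
        values = self._tags.get((media_id, time_secs, term), [])
        return sum(values) / len(values) if values else 0.0
```

For example, if one user tags minute three of a film as 0.9 'about Jean-Luc Godard' and another as 0.7, the collaborative intensity of that moment is 0.8.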

Users will also be able to search content in very detailed ways. They will be able to jump directly to relevant moments and see which moments other users have been most passionate about. Topic-specific heat maps and time-dynamic tag clouds will provide a new experience of using audio-visual content. Tagging and retrieval will be unified in one intuitive user experience.
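Retrieval in this model amounts to scanning a topic's collaborative intensity over time (a heat map) and picking out its peaks. A minimal sketch under assumed names and a simple threshold rule, not the project's actual search engine:

```python
def heat_map(term_tags, duration_secs):
    """Per-second collaborative intensity for one topic across a file.

    `term_tags` maps a second offset to the list of user intensities
    recorded for the topic at that moment. Returns a list of floats of
    length `duration_secs` -- a topic-specific heat map that could
    drive a timeline display.
    """
    hm = []
    for sec in range(duration_secs):
        values = term_tags.get(sec, [])
        hm.append(sum(values) / len(values) if values else 0.0)
    return hm


def peak_moments(hm, threshold=0.7):
    """Seconds where topic intensity exceeds the threshold -- the
    moments users have been most passionate about."""
    return [sec for sec, value in enumerate(hm) if value >= threshold]
```

A search for 'Jean-Luc Godard' would then return the peak seconds of that topic's heat map, letting a user play those moments directly rather than the whole file.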

A user-centred tool 

Ultimately, the project aims to develop a user-centred tool that will allow audiences and academics to quickly tag, search, retrieve and play results drawn from large volumes of long-play content, as well as collectively negotiate their meaning. 

In collaboration with Goldsmiths College, University of London, Department of Computing and the Goldsmiths Leverhulme Media Research Centre
Supported by the Arts & Humanities Research Council ICT Programme and by the Leverhulme Foundation

Project Information

Project type
Research project
Digital project
Lead department
Tate Research
Support department
Tate Research
Project leaders
Kelli Dipple, Tate Curator of Intermedia Art (project leader for Tate)
Professor Robert Zimmer, Head of Department of Computing (project leader for Goldsmiths)
Project team
Marian Ursu, Lecturer, Goldsmiths (initial model)
Adrian Passow, PhD student, Goldsmiths (preliminary research)
Nicolette Cavaleros, MA Digital Art History intern, Birkbeck College (preliminary research)
Yuk Hui, PhD student, Goldsmiths (final concept, technical infrastructure)
Goetz Bachmann, Research Associate, Goldsmiths (final concept, ethnographic research)
Andrea Rota, PhD student, LSE (database architecture)
Brigitte Kaltenbacher, designer (interface)
Darren Williams, designer (interface)

See also

  • Project

    Intermedia Art

    A programme of new media, sound and performance commissions, broadcasts, events and articles for the galleries and online.