

Talks and Poster Presentations (with Proceedings-Entry):

R. Neumayer, A. Rauber:
"Multi-modal music information retrieval - visualisation and evaluation of clusterings by both audio and lyrics";
Talk: RIAO 2007, Pittsburgh, PA, USA; 2007-05-29 - 2007-06-01; in: "Proceedings of the 8th Conference Recherche d'Information Assistée par Ordinateur (RIAO'07)", (2007), 20 pages.



English abstract:
Navigation in and access to the contents of digital audio archives have become increasingly important topics in Information Retrieval. Private and commercial music collections alike are growing, both in size and in acceptance within the user community. Content-based approaches relying on signal processing techniques have long been used in Music Information Retrieval to represent the acoustic characteristics of pieces of music, which may then be used for collection organisation or retrieval tasks. However, music is not defined by its acoustic characteristics alone, but also, sometimes even to a large degree, by its lyrics. A song's lyrics may provide more information to search for, or may be more representative of specific musical genres than the acoustic content, e.g. 'love songs' or 'Christmas carols'. We therefore suggest an improved indexing of audio files by two modalities. Combinations of audio features and song lyrics can be used to organise audio collections and to display them via map-based interfaces. Specifically, we use Self-Organising Maps as a visualisation and interface metaphor. Separate maps are created and linked to provide a multi-modal view of an audio collection. Moreover, we introduce quality measures for the quantitative validation of cluster spreads across the resulting multiple topographic mappings provided by the Self-Organising Maps.
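
To make the linked-maps idea concrete, the following is a minimal sketch, assuming synthetic feature matrices and genre labels: it trains one Self-Organising Map on audio features and one on lyrics features, then computes a simple spread measure, the number of distinct map units a genre's tracks occupy on each map. The use of the third-party MiniSom library and all variable names are assumptions for illustration; this is only an illustrative proxy, not the quality measures introduced in the paper.

    # Sketch: two SOMs over synthetic stand-ins for audio and lyrics features.
    # MiniSom is an assumed implementation choice; the paper does not prescribe one.
    import numpy as np
    from minisom import MiniSom

    rng = np.random.default_rng(0)
    n_tracks = 200
    audio = rng.normal(size=(n_tracks, 30))    # stand-in for audio feature vectors
    lyrics = rng.normal(size=(n_tracks, 100))  # stand-in for lyrics term vectors
    genres = rng.integers(0, 4, size=n_tracks) # hypothetical genre labels

    def train_som(data, size=8):
        """Train a size x size rectangular SOM on the given feature matrix."""
        som = MiniSom(size, size, data.shape[1],
                      sigma=1.0, learning_rate=0.5, random_seed=0)
        som.train_random(data, 1000)
        return som

    audio_som = train_som(audio)
    lyrics_som = train_som(lyrics)

    def genre_spread(som, data, labels):
        """Distinct winning units per genre: fewer units = more compact cluster."""
        return {int(g): len({som.winner(x) for x in data[labels == g]})
                for g in np.unique(labels)}

    print("audio map spread :", genre_spread(audio_som, audio, genres))
    print("lyrics map spread:", genre_spread(lyrics_som, lyrics, genres))

Comparing a genre's spread on the two maps hints at whether it clusters more compactly in the audio or the lyrics modality, mirroring the multi-modal comparison the abstract describes.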


Keywords:
music information retrieval, multi-modality, clustering, self-organising map, text analysis, visualisation


Electronic version of the publication:
http://publik.tuwien.ac.at/files/pub-inf_4728.pdf

