An Automatic Metadata Generation Method for Music Retrieval-by-impression Dealing with Impression-transition

A. Ijichi and Y. Kiyoki (Japan)

Keywords

music database, metadata, impression, story

Abstract

In the design of multimedia database systems, one of the most important issues is how to deal with "Kansei" and "impression" in human beings. The concepts of "Kansei" and "impression" cover several aspects of sensitive recognition, such as human senses, feelings, sensitivity, and psychological reactions. In this paper, we propose an automatic metadata-generation method for extracting impressions of music, such as "agitated," "joyous," "lyrical," "melancholy," and "sentimental," for semantically retrieving music data according to human impressions. We also present an impression-metadata-generation mechanism for reflecting the impression transition that occurs as time passes, that is, the temporal transition of a story in music (a music-story). This mechanism computes the impression-strength reflecting the impression transition, that is, the "impression-stream" as a temporal transition of a music-story. Our automatic metadata generation for a music-story consists of the following processes: (1) division of a music-story into sections; (2) impression-metadata extraction for each section; (3) computation of the impression-strength of the impression-metadata; (4) weighting of the impression-metadata according to the impression-strength; and (5) combination of the impression-metadata to adjust it to a query structure. Music data with a story consists of several sections, and each section gives an individual impression; the combination of sections gives the global impression of the music data. Our metadata-generation method computes correlations between music data and impression words by reflecting the degree of change of impressions among continuous sections. This paper presents several experimental results of metadata generation to clarify the feasibility and effectiveness of our method.
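To make the five-step pipeline in the abstract concrete, the following is a minimal sketch, not the authors' implementation: the fixed impression vocabulary, the per-section feature statistic, and the use of an L1 change measure between continuous sections as the impression-strength are all illustrative assumptions, since the paper's actual feature extraction and strength computation are not given here.

```python
# Hypothetical sketch of the five-step impression-metadata pipeline.
# All function names and scoring rules are assumptions for illustration.
from typing import Dict, List

IMPRESSION_WORDS = ["agitated", "joyous", "lyrical", "melancholy", "sentimental"]

def divide_into_sections(music_story: List[float], section_len: int) -> List[List[float]]:
    """(1) Divide a music-story (here, a feature sequence) into sections."""
    return [music_story[i:i + section_len]
            for i in range(0, len(music_story), section_len)]

def extract_impression_metadata(section: List[float]) -> Dict[str, float]:
    """(2) Extract per-section impression metadata.

    Placeholder scoring: rates each impression word by a toy statistic of
    the section's features; a real system would use acoustic models.
    """
    mean = sum(section) / len(section)
    return {word: abs(mean - i / len(IMPRESSION_WORDS))
            for i, word in enumerate(IMPRESSION_WORDS)}

def impression_strength(prev: Dict[str, float], cur: Dict[str, float]) -> float:
    """(3) Compute impression-strength as the degree of change between
    continuous sections (here, an L1 distance over impression scores)."""
    return sum(abs(cur[w] - prev[w]) for w in IMPRESSION_WORDS)

def generate_metadata(music_story: List[float], section_len: int = 4) -> Dict[str, float]:
    sections = divide_into_sections(music_story, section_len)
    metadata = [extract_impression_metadata(s) for s in sections]
    # (4) Weight each section's metadata by its impression-strength.
    weighted: Dict[str, float] = {w: 0.0 for w in IMPRESSION_WORDS}
    for i, m in enumerate(metadata):
        strength = impression_strength(metadata[i - 1], m) if i > 0 else 1.0
        for w in IMPRESSION_WORDS:
            weighted[w] += strength * m[w]
    # (5) Combine weighted metadata into one global impression vector
    # that can be correlated with impression words in a query.
    total = sum(weighted.values()) or 1.0
    return {w: v / total for w, v in weighted.items()}

if __name__ == "__main__":
    toy_story = [0.1, 0.4, 0.3, 0.9, 0.8, 0.2, 0.5, 0.7]
    print(generate_metadata(toy_story, section_len=4))
```

Under these assumptions, sections whose impression scores change sharply from the preceding section contribute more to the global impression vector, which mirrors the abstract's idea of weighting metadata by the degree of change among continuous sections.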
