Aya Ichinose, Shuichi Kurabayashi, and Yasushi Kiyoki
Keywords: Music course-ware, multimedia information systems, technology for education, e-learning
This paper presents a context-based emotion analyzer for supporting the learning of tonality in music courses. The emotion analyzer realizes a new music retrieval environment that finds and visualizes music items while taking genre-dependent perceptual preferences into account. The system generates emotive annotations for music by analyzing tonality along a timeline, helping users identify musical tonality from the viewpoint of emotions. Tonality is a musical system constructed from sound elements such as harmonies and melodies, and a change of tonality causes a change of impression. The system realizes an automatic time-duration selector that detects repetitions and bridges by analyzing physical and structural music features, such as pitch and tonality. It enables users to submit emotive keywords as a query and retrieve music according to the changes of impression in each musical piece. The paper describes a prototype system that searches MIDI music files by analyzing them automatically, and presents several experimental results that clarify the feasibility of the system.
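The abstract does not specify how tonality is estimated from MIDI data. As a hedged illustration of one standard approach that such a system could use, the sketch below estimates the key of a span of MIDI notes by correlating its pitch-class histogram with the Krumhansl-Kessler probe-tone profiles; all function names here are hypothetical, not taken from the paper.

```python
# Illustrative sketch only: key (tonality) estimation for a list of MIDI
# note numbers via the Krumhansl-Schmuckler correlation method. This is
# NOT the paper's algorithm, which is unspecified in the abstract.

from math import sqrt

# Krumhansl-Kessler probe-tone profiles, indexed by pitch class relative
# to the tonic (0 = tonic, 1 = minor second, ...).
MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def correlation(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0

def estimate_key(midi_notes):
    """Return a key label such as 'C major' for a list of MIDI note numbers."""
    # Build a 12-bin pitch-class histogram.
    hist = [0.0] * 12
    for note in midi_notes:
        hist[note % 12] += 1
    # Try all 24 keys (12 tonics x major/minor); keep the best correlation.
    best = None
    for tonic in range(12):
        for profile, mode in ((MAJOR, "major"), (MINOR, "minor")):
            # Rotate the profile so that index p holds the salience of
            # pitch class p under the candidate tonic.
            rotated = profile[-tonic:] + profile[:-tonic] if tonic else profile
            r = correlation(hist, rotated)
            if best is None or r > best[0]:
                best = (r, f"{NOTE_NAMES[tonic]} {mode}")
    return best[1]

# A C-major scale is classified as C major.
print(estimate_key([60, 62, 64, 65, 67, 69, 71, 72]))  # → C major
```

Applying such an estimator over a sliding window yields the kind of timeline of tonality changes that the described system annotates with emotive labels.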