|AM07||START Conference Manager|
Emotion is recognized by several researchers as one of the core factors in Music Information Retrieval (MIR). However, few attempts have been made to apply this factor in MIR systems in the Web 2.0 environment. To address this gap, our study investigates how to build a Web-based MIR system that reflects unique, user-assigned, emotion-based music metadata, as distinct from traditional bibliographic information. Specifically, we examine how the sliding-bar interface of the Glass Engine can be modified so that users of music information retrieval systems can search more effectively. We will determine whether patterns emerge from the participants’ collective responses: if the emotional reactions in our sample are correlated, we may be able to deduce whether music evokes similar responses across individuals. We will also investigate whether the collected bar patterns match users’ music-searching needs, and whether these emotional patterns are easy to use and meet users’ emotional information needs in the Web 2.0 environment. This study may contribute to a social computing approach to MIR and, specifically, to emotion-based faceted access to music. Using such a system, a user could locate a music piece such as a “Cheerful but calm work by Franz Peter Schubert”.
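As an illustrative sketch only (not the system described in this abstract), emotion-based faceted access could be modeled by attaching user-assigned slider values to each track and filtering on a composer facet plus emotion ranges. All names here (`Track`, `faceted_search`, the 0–10 slider scale, and the emotion scores) are hypothetical assumptions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    title: str
    composer: str
    # Hypothetical user-assigned emotion ratings on a 0-10 sliding-bar scale
    emotions: dict = field(default_factory=dict)

def faceted_search(tracks, composer=None, **emotion_ranges):
    """Return tracks matching a composer facet and emotion slider ranges.

    emotion_ranges maps an emotion name to an inclusive (low, high) pair,
    mirroring a Glass Engine-style sliding bar position.
    """
    results = []
    for t in tracks:
        if composer and t.composer != composer:
            continue  # composer facet does not match
        # every requested emotion must fall inside its slider range
        if all(lo <= t.emotions.get(name, -1) <= hi
               for name, (lo, hi) in emotion_ranges.items()):
            results.append(t)
    return results

# Toy catalog with invented emotion scores, for illustration only
catalog = [
    Track("Impromptu Op. 90 No. 3", "Franz Peter Schubert",
          {"cheerful": 7, "calm": 8}),
    Track("Erlkonig", "Franz Peter Schubert",
          {"cheerful": 1, "calm": 2}),
]

# Query: "Cheerful but calm work by Franz Peter Schubert"
hits = faceted_search(catalog, composer="Franz Peter Schubert",
                      cheerful=(6, 10), calm=(6, 10))
```

Under these assumptions the query returns only the first track, since the second fails the "cheerful" range; aggregating many users' slider positions per track would then give the collective emotional patterns the study sets out to examine.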