Popular music representation: chorus detection & emotion recognition

Chia Hung Yeh, Wen Yu Tseng, Chia Yen Chen*, Yu Dun Lin, Yi Ren Tsai, Hsuan I. Bi, Yu Ching Lin, Ho Yi Lin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This paper proposes a popular music representation strategy based on a song's emotion. First, a piece of popular music is decomposed into chorus and verse segments by the proposed chorus detection algorithm. Three descriptive features (intensity, frequency band, and rhythm regularity) are then extracted from the structured segments for emotion detection. A hierarchical AdaBoost classifier is employed to recognize the emotion of a piece of popular music, which is classified according to Thayer's model into four emotions: happy, angry, depressed, and relaxed. Experiments conducted on a database of 350 popular songs show that the average recall and precision of the proposed chorus detection are approximately 95% and 84%, respectively, and the average precision of emotion detection is 92%. Additional tests on cover versions of songs with different lyrics and languages yield a precision of 90%. The proposed approaches have been tested and validated by the professional online music company KKBOX Inc., and show promising performance in effectively and efficiently identifying the emotions of a variety of popular music.
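The hierarchical classification over Thayer's model can be illustrated with a minimal two-stage sketch: a first AdaBoost classifier separates high from low arousal, and a second, per-branch classifier resolves valence, mapping each (arousal, valence) quadrant to one of the four emotions. The synthetic feature generator, the stage ordering, and all parameter values below are illustrative assumptions, not the authors' actual design or data.

```python
# Hypothetical two-stage (hierarchical) AdaBoost emotion classifier over
# Thayer's model quadrants. Features stand in for the paper's intensity,
# frequency-band, and rhythm-regularity descriptors; the data is synthetic.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

# Thayer's model: (arousal, valence) quadrant -> emotion label.
QUADRANTS = {(1, 1): "happy", (1, 0): "angry",
             (0, 1): "relaxed", (0, 0): "depressed"}

def synth_features(arousal, valence, n):
    """Synthetic stand-ins: intensity and rhythm regularity track arousal,
    frequency band tracks valence (an assumed, simplified relationship)."""
    intensity = rng.normal(float(arousal), 0.2, n)
    freq_band = rng.normal(float(valence), 0.2, n)
    rhythm = rng.normal(float(arousal), 0.2, n)
    return np.column_stack([intensity, freq_band, rhythm])

# Build a toy training set with 50 samples per quadrant.
X_parts, y_arousal, y_valence = [], [], []
for (ar, va) in QUADRANTS:
    X_parts.append(synth_features(ar, va, 50))
    y_arousal += [ar] * 50
    y_valence += [va] * 50
X = np.vstack(X_parts)
y_arousal = np.array(y_arousal)
y_valence = np.array(y_valence)

# Stage 1: arousal classifier trained on all samples.
stage1 = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y_arousal)

# Stage 2: one valence classifier per arousal branch.
stage2 = {
    a: AdaBoostClassifier(n_estimators=50, random_state=0).fit(
        X[y_arousal == a], y_valence[y_arousal == a])
    for a in (0, 1)
}

def predict_emotion(features):
    """Route a feature vector down the hierarchy to an emotion label."""
    x = np.asarray(features, dtype=float).reshape(1, -1)
    a = int(stage1.predict(x)[0])
    v = int(stage2[a].predict(x)[0])
    return QUADRANTS[(a, v)]
```

A song segment with high intensity, high frequency-band energy, and regular rhythm would be routed to the high-arousal, positive-valence quadrant ("happy"), while uniformly low feature values land in "depressed".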

Original language: English
Pages (from-to): 2103-2128
Number of pages: 26
Journal: Multimedia Tools and Applications
Volume: 73
Issue number: 3
DOIs
Publication status: Published - 2014 Oct 29
Externally published: Yes

Keywords

  • Adaboost
  • Chorus
  • Emotion
  • MFCCs
  • Popular music
  • Rhythm
  • Verse

ASJC Scopus subject areas

  • Software
  • Media Technology
  • Hardware and Architecture
  • Computer Networks and Communications
