Popular music analysis: Chorus and emotion detection

Chia Hung Yeh*, Yu Dun Lin, Ming Sui Lee, Wen Yu Tseng

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

4 Citations (Scopus)

Abstract

In this paper, chorus detection and emotion detection algorithms for popular music are proposed. First, a popular song is decomposed into chorus and verse segments based on its color representation and MFCCs (Mel-frequency cepstral coefficients). Four features, including intensity, tempo, and rhythm regularity, are then extracted from these structured segments for emotion detection. The emotion of a song is classified into one of four classes, happy, angry, depressed, and relaxed, via a back-propagation neural network classifier. Experimental results show that the average recall and precision of the proposed chorus detection are approximately 95% and 84%, respectively, and that the average precision of emotion detection is 88.3% on a test database of 210 popular songs.
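
The abstract describes a pipeline: MFCC-based structure segmentation, extraction of intensity, tempo, and rhythm-regularity features, and a back-propagation neural network classifier over four emotion classes. Below is a minimal sketch of such a pipeline, assuming librosa for feature extraction and scikit-learn's MLPClassifier as the back-propagation network; the file name "song.wav", the MFCC self-similarity summary, and all parameters are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch only; not the authors' implementation.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["happy", "angry", "depressed", "relaxed"]

def extract_features(path):
    """Return a small feature vector (intensity, tempo, rhythm regularity,
    repetition strength) for one song."""
    y, sr = librosa.load(path, sr=22050, mono=True)

    # MFCCs: the paper segments songs into chorus/verse using MFCCs.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

    # A cosine self-similarity matrix over MFCC frames is one common way to
    # surface repeated sections such as choruses (a stand-in for the paper's
    # segmentation step); its mean serves here as a crude repetition score.
    norms = np.linalg.norm(mfcc, axis=0)
    sim = (mfcc.T @ mfcc) / (norms[:, None] * norms[None, :] + 1e-9)

    # Intensity: mean RMS energy.
    intensity = float(librosa.feature.rms(y=y).mean())

    # Tempo and rhythm regularity: beat tracking, then the spread of
    # inter-beat intervals (small spread = regular rhythm).
    tempo, beats = librosa.beat.beat_track(y=y, sr=sr)
    tempo = float(np.atleast_1d(tempo)[0])
    ibis = np.diff(librosa.frames_to_time(beats, sr=sr))
    regularity = float(np.std(ibis)) if ibis.size > 1 else 0.0

    return np.array([intensity, tempo, regularity, float(sim.mean())])

# Back-propagation neural network over the four emotion classes.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
# X_train / y_train would come from a labeled collection such as the
# 210-song database mentioned in the abstract:
# clf.fit(X_train, y_train)
# print(EMOTIONS[clf.predict([extract_features("song.wav")])[0]])
```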

Original language: English
Pages: 907-910
Number of pages: 4
Publication status: Published - 2010
Externally published: Yes
Event: 2nd Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2010 - Biopolis, Singapore
Duration: 2010 Dec 14 – 2010 Dec 17

Other

Other: 2nd Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2010
Country/Territory: Singapore
City: Biopolis
Period: 2010/12/14 – 2010/12/17

Keywords

  • Chorus
  • MFCC
  • Music emotion
  • Neural network

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
