Virtual piano design via single-view video based on multifinger actions recognition

Chia Hung Yeh*, Wen Yu Tseng, Jia Chi Bai, Ruey Nan Yeh, Sun Chen Wang, Po Yi Sung

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceedingConference contribution

Abstract

As cameras become cheaper and more widespread, vision-based input devices are highly desirable and are now a feasible solution. In this paper, a vision-based virtual piano mechanism is proposed. By tracking and analyzing the motion of the fingertips, the system detects keystrokes and plays the corresponding note on a seven-key virtual piano. Experimental results demonstrate the precision and efficiency of the proposed scheme.
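The abstract does not publish the detection algorithm itself, but the idea of tracking fingertip motion and registering a keystroke can be sketched minimally. In the illustrative Python below, every name and threshold (key width, press line, downward-speed cutoff) is an assumption, not the authors' method: a keystroke is registered when a tracked fingertip moves downward quickly and crosses an assumed key-surface line, and its x-coordinate selects one of the seven keys.

```python
# Illustrative sketch only: thresholds, key layout, and function names
# below are assumptions, not the algorithm from the paper.

NOTES = ["C", "D", "E", "F", "G", "A", "B"]  # seven-key virtual piano
KEY_WIDTH = 60          # assumed key width in pixels
PRESS_LINE_Y = 400      # assumed image row marking the key surface
MIN_DOWNWARD_SPEED = 8  # assumed pixels/frame needed to count as a strike

def key_at(x):
    """Map a fingertip x-coordinate to one of the seven keys (or None)."""
    idx = int(x // KEY_WIDTH)
    return NOTES[idx] if 0 <= idx < len(NOTES) else None

def detect_keystrokes(track):
    """Scan one fingertip's per-frame (x, y) track for keystrokes.

    A keystroke is registered when the fingertip moves downward fast
    enough and crosses the press line between consecutive frames.
    """
    strokes = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        moving_down_fast = (y1 - y0) >= MIN_DOWNWARD_SPEED
        crossed_press_line = y0 < PRESS_LINE_Y <= y1
        if moving_down_fast and crossed_press_line:
            note = key_at(x1)
            if note is not None:
                strokes.append(note)
    return strokes
```

For example, a fingertip track `[(130, 380), (131, 392), (132, 404)]` crosses the assumed press line with sufficient speed over key index 2 and would register the note "E".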

Original language: English
Title of host publication: 2010 3rd International Conference on Human-Centric Computing, HumanCom 2010
Publication status: Published - 2010
Externally published: Yes
Event: 2010 3rd International Conference on Human-Centric Computing, HumanCom 2010 - Cebu, Philippines
Duration: 2010 Aug 11 - 2010 Aug 13

Publication series

Name: 2010 3rd International Conference on Human-Centric Computing, HumanCom 2010

Conference

Conference: 2010 3rd International Conference on Human-Centric Computing, HumanCom 2010
Country/Territory: Philippines
City: Cebu
Period: 2010/08/11 - 2010/08/13

Keywords

  • Computer vision
  • Fingertip detection
  • Keystroke detection
  • Virtual piano

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
