Human-Computer Interface Based on Visual Lip Movement and Gesture Recognition - Publication - Bridge of Knowledge

Abstract

LipMouse, a multimodal human-computer interface (HCI), is presented; it allows a user to operate a computer using movements and gestures made with the mouth only. Algorithms for lip movement tracking and lip gesture recognition are presented in detail. User face images are captured with a standard webcam. Face detection is based on a cascade of boosted classifiers using Haar-like features. A mouth region is located in the lower part of the face region, and its position is used to track lip movements, allowing the user to control a screen cursor. Three lip gestures are recognized: opening the mouth, sticking out the tongue and forming puckered lips. Lip gesture recognition is performed by an artificial neural network and utilizes various image features of the lip region. An accurate lip shape is obtained by means of lip image segmentation using fuzzy clustering.
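The segmentation step described above relies on fuzzy clustering of lip-region pixels. A minimal fuzzy c-means sketch in Python/NumPy (illustrative only, not the authors' implementation; the function name and parameters are assumptions) could look like:

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means clustering.

    X: (n_samples, n_features) array, e.g. pixel colours of the lip region.
    c: number of clusters (e.g. lip vs. skin), m: fuzziness exponent (> 1).
    Returns (centers, U) where U[i, j] is the membership of sample i in cluster j.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial memberships, normalised so each row sums to 1
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # Cluster centres: membership-weighted means of the samples
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Euclidean distances from each sample to each centre
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)  # avoid division by zero
        # Standard FCM membership update
        inv = d ** (-2.0 / (m - 1))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U
```

Thresholding the membership map of the "lip" cluster then yields a binary lip mask from which shape features for the gesture classifier can be computed.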

Full text

Full text is not available in the portal.

Details

Category:
Articles
Type:
Articles in peer-reviewed journals and other serial publications
Published in:
International Journal of Computing Science and Mathematics, vol. 7, iss. 3, pages 124-139,
ISSN: 1752-5055
Language:
English
Publication year:
2010
Bibliographic description:
Dalka P., Czyżewski A.: Human-Computer Interface Based on Visual Lip Movement and Gesture Recognition // International Journal of Computing Science and Mathematics. Vol. 7, iss. 3 (2010), pp. 124-139
Verified by:
Gdańsk University of Technology
