Human-Computer Interface Based on Visual Lip Movement and Gesture Recognition - Publication - MOST Wiedzy



Abstract

The multimodal human-computer interface (HCI) called LipMouse is presented; it allows a user to operate a computer using only movements and gestures made with the mouth. Algorithms for lip movement tracking and lip gesture recognition are presented in detail. User face images are captured with a standard webcam. Face detection is based on a cascade of boosted classifiers using Haar-like features. The mouth region is located in the lower part of the detected face region, and its position is used to track lip movements, allowing the user to control the screen cursor. Three lip gestures are recognized: opening the mouth, sticking out the tongue, and forming puckered lips. Lip gesture recognition is performed by an artificial neural network that utilizes various image features of the lip region. An accurate lip shape is obtained by means of lip image segmentation using fuzzy clustering.
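The fuzzy-clustering segmentation step mentioned above can be illustrated with a minimal fuzzy c-means sketch. This is a generic, assumed implementation run on a toy 1-D sample of pixel intensities (dark lip pixels vs. brighter skin pixels), not the authors' code; the function name `fuzzy_cmeans` and all parameters are hypothetical:

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy c-means on 1-D samples x; returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                      # memberships sum to 1 per sample
    for _ in range(iters):
        um = U ** m                         # fuzzified memberships
        centers = um @ x / um.sum(axis=1)   # weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9  # sample-to-center distances
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)           # membership update rule
    return centers, U

# Toy "lip region" intensities: dark lip pixels (~0.2) vs. brighter skin (~0.8).
pixels = np.concatenate([
    0.2 + 0.02 * np.random.default_rng(1).standard_normal(50),
    0.8 + 0.02 * np.random.default_rng(2).standard_normal(50),
])
centers, U = fuzzy_cmeans(pixels, c=2)
# Pixels whose membership in the darker cluster exceeds 0.5 form the lip mask.
lip_mask = U[np.argmin(centers)] > 0.5
```

In the paper's setting the clustering would run on 2-D color features of the mouth region rather than a 1-D toy sample, but the membership-update mechanics are the same.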

Details

Category:
Journal publication
Type:
articles in peer-reviewed journals and other serial publications
Published in:
RED. ZAGR. ANGIELSKI no. 7, pages 124-139,
ISSN:
Language:
English
Year of publication:
2010
Bibliographic description:
Dalka P., Czyżewski A.: Human-Computer Interface Based on Visual Lip Movement and Gesture Recognition // RED. ZAGR. ANGIELSKI. Vol. 7, no. 3 (2010), pp. 124-139
Verification:
Politechnika Gdańska
