Abstract
Results of experiments on lip gesture recognition with an artificial neural network (ANN) are discussed. The neural network module forms the core element of a multimodal human-computer interface called LipMouse, a solution that allows a user to operate a computer using lip movements and gestures. The user's face is detected in a video stream from a standard web camera using a cascade of boosted classifiers working with Haar-like features. Lip region extraction is based on a lip shape approximation computed by means of lip image segmentation using fuzzy clustering. The ANN is fed with a feature vector describing the appearance of the lip region; the descriptors include a luminance histogram, statistical moments and statistical parameters of co-occurrence matrices. The ANN recognizes three lip gestures with good accuracy: mouth opening, sticking out the tongue and forming puckered lips.
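The pipeline outlined above (Haar-cascade face detection, lip region extraction, appearance descriptors, ANN classification) can be illustrated with a short sketch. The snippet below is a hypothetical approximation, not the authors' implementation: it uses OpenCV's stock frontal-face Haar cascade, crudely takes the lower third of the face box as the lip region instead of the fuzzy-clustering shape fit described in the abstract, and substitutes a scikit-learn MLP for the ANN; the descriptor set (luminance histogram, moments, co-occurrence parameters) follows the abstract, but all sizes, thresholds and layer counts are assumptions.

```python
# Hypothetical sketch of a LipMouse-style pipeline; not the authors' code.
import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

# Stock OpenCV frontal-face Haar cascade (cascade of boosted classifiers).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def lip_roi(gray_frame):
    """Detect the largest face and return its lower third as an approximate lip region
    (the paper uses fuzzy-clustering segmentation instead of this crude crop)."""
    faces = face_cascade.detectMultiScale(gray_frame, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return gray_frame[y + 2 * h // 3: y + h, x: x + w]

def describe(roi):
    """Feature vector: luminance histogram, simple statistical moments and
    co-occurrence matrix parameters (contrast, correlation, energy, homogeneity)."""
    roi = cv2.resize(roi, (64, 32))
    hist = cv2.calcHist([roi], [0], None, [16], [0, 256]).ravel()
    hist /= hist.sum() + 1e-9
    moments = [roi.mean(), roi.std(), ((roi - roi.mean()) ** 3).mean()]
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    texture = [graycoprops(glcm, p).mean()
               for p in ("contrast", "correlation", "energy", "homogeneity")]
    return np.concatenate([hist, moments, texture])

# Stand-in for the ANN; the three classes would be mouth open, tongue out,
# puckered lips, trained on labelled webcam frames (X_train, y_train).
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
# clf.fit(X_train, y_train)
# gesture = clf.predict([describe(lip_roi(gray_frame))])
```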
Authors (2)
Piotr Dalka, Andrzej Czyżewski
Full text
Full text is not available in the portal.
Details
- Category: Articles
- Type: articles in peer-reviewed journals and other serial publications
- Language: English
- Publication year: 2010
- Bibliographic description: Dalka P., Czyżewski A.: Controlling computer by lip gestures employing neural network // Lecture Notes in Artificial Intelligence, No. 6086 (2010), pp. 80-89
- Verified by: Gdańsk University of Technology