Search results for: FACIAL EXPRESSIONS RECOGNITION
-
Emotion Recognition Based on Facial Expressions of Gamers
PublicationThis article presents an approach to emotion recognition based on the facial expressions of gamers. Crucial features of the analysed face, such as eyebrow shape and the width and height of the eyes and mouth, were extracted. A group of artificial intelligence methods was then applied to classify a given feature set as one of the following emotions: happiness, sadness, anger or fear. The approach presented in this...
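A minimal sketch of the pipeline this abstract outlines (geometric face features fed to a classifier) is given below. The 68-point landmark layout, the feature choice and the scikit-learn classifier are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: geometric facial features -> emotion classifier.
# Assumes 68-point landmarks (e.g. dlib-style) are already available.
import numpy as np
from sklearn.svm import SVC

EMOTIONS = ["happiness", "sadness", "anger", "fear"]

def geometric_features(pts: np.ndarray) -> np.ndarray:
    """pts: (68, 2) facial landmarks in image coordinates."""
    face_w = np.linalg.norm(pts[16] - pts[0]) + 1e-6    # normalisation factor
    eye_w = np.linalg.norm(pts[39] - pts[36]) / face_w   # eye width
    eye_h = np.linalg.norm(pts[41] - pts[37]) / face_w   # eye height
    mouth_w = np.linalg.norm(pts[54] - pts[48]) / face_w
    mouth_h = np.linalg.norm(pts[57] - pts[51]) / face_w
    brow_slope = (pts[21, 1] - pts[17, 1]) / face_w      # crude eyebrow shape
    return np.array([eye_w, eye_h, mouth_w, mouth_h, brow_slope])

# Hypothetical usage with a labelled training set (X_train, y_train):
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# print(EMOTIONS[clf.predict(geometric_features(landmarks)[None, :])[0]])
```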
-
Limitations of Emotion Recognition from Facial Expressions in e-Learning Context
PublicationThe paper concerns automatic emotion recognition technology applied in an e-learning environment. During a study of the e-learning process the authors observed facial expressions via multiple video cameras. Preliminary analysis of the facial expressions using automatic emotion recognition tools revealed several unexpected results, including unavailability of recognition due to face coverage and significant inconsistency...
-
Acquisition and indexing of RGB-D recordings for facial expressions and emotion recognition
PublicationIn this paper the comprehensive KinectRecorder tool is described, which provides convenient and fast acquisition, indexing and storing of RGB-D video streams from the Microsoft Kinect sensor. The application is especially useful as a supporting tool for the creation of fully indexed databases of facial expressions and emotions that can be further used for training and testing of emotion recognition algorithms for affect-aware applications....
-
Comparison of selected off-the-shelf solutions for emotion recognition based on facial expressions
PublicationThe paper concerns the accuracy of emotion recognition from facial expressions. As there are several off-the-shelf solutions available on the market today, this study aims at a practical evaluation of selected solutions in order to provide some insight into what potential buyers might expect. Two solutions were compared: FaceReader by Noldus and Xpress Engine by QuantumLab. The performed evaluation revealed that the recognition...
-
FEEDB: A multimodal database of facial expressions and emotions
PublicationIn this paper the first version of the multimodal FEEDB database of facial expressions and emotions is presented. The database contains labeled RGB-D recordings of people expressing a specific set of expressions, recorded using a Microsoft Kinect sensor. Such a database can be used for classifier training and testing in face recognition as well as in recognition of facial expressions and human emotions. Also initial experiences...
-
On Facial Expressions and Emotions RGB-D Database
PublicationThe goal of this paper is to present the idea of creating a reference database of RGB-D video recordings for the recognition of facial expressions and emotions. Two different formats of the recordings used for the creation of two versions of the database are described and compared using different criteria. Examples of the first applications using the databases are also presented to evaluate their usefulness.
-
An unorthodox view on the problem of tracking facial expressions
PublicationRecent developments in imaging cameras have opened a new way of analyzing facial expressions. We would like to take advantage of this new technology and present a method of imaging and processing images of the human face as a response to particular stimuli. The response in this case is represented by facial expressions, and the stimuli are still images representing the six basic emotions according to Ekman. Working hypothesis...
An extension to the FEEDB Multimodal Database of Facial Expressions and Emotions
PublicationFEEDB is a multimodal database that contains recordings of people expressing different emotions, captured by using a Microsoft Kinect sensor. Data were originally provided in the device’s proprietary format (XED), requiring both the Microsoft Kinect Studio application and a Kinect sensor attached to the system to use the files. In this paper, we present an extension of the database. For a selection of recordings, we also provide...
-
Robot Eye Perspective in Perceiving Facial Expressions in Interaction with Children with Autism
PublicationThe paper concerns automatic facial expression analysis applied in a study of natural “in the wild” interaction between children with autism and a social robot. The paper reports a study that analyzed the recordings captured via a camera located in the eye of a robot. Children with autism exhibit a diverse level of deficits, including ones in social interaction and emotional expression. The aim of the study was to explore the possibility...
-
Facial emotion recognition using depth data
PublicationIn this paper an original approach is presented for facial expression and emotion recognition based only on the depth channel from the Microsoft Kinect sensor. The emotional user model contains nine emotions, including the neutral one. The proposed recognition algorithm uses local movement detection within the face area in order to recognize the actual facial expression. This approach has been validated on the Facial Expressions and Emotions Database...
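A minimal sketch of depth-only local movement detection within a face region follows; the frame-differencing scheme, grid size and threshold are assumptions made for illustration, not the algorithm validated in the paper.

```python
# Illustrative sketch only: per-cell depth change inside a face ROI, using
# nothing but consecutive depth frames (e.g. 16-bit Kinect depth in mm).
import numpy as np

def local_movement_map(prev_depth, curr_depth, face_box, grid=(8, 8), thr_mm=4.0):
    """Return a boolean grid marking face cells whose depth changed noticeably."""
    x, y, w, h = face_box
    prev = prev_depth[y:y + h, x:x + w].astype(np.float32)
    curr = curr_depth[y:y + h, x:x + w].astype(np.float32)
    diff = np.abs(curr - prev)
    gh, gw = grid
    ch, cw = h // gh, w // gw                      # cell size in pixels
    cells = diff[:gh * ch, :gw * cw].reshape(gh, ch, gw, cw).mean(axis=(1, 3))
    return cells > thr_mm                          # "moving" face regions
```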
-
Image-based Analysis of Emotional Facial Expressions in Full Face Transplants
Publication -
ID 382 – Electrophysiological evaluation of emotional expressions in the facial transplantation patients
Publication -
Evaluation of Knowledge-Based Recognition of Spatial Expressions for Polish
Publication -
Database of speech and facial expressions recorded with optimized face motion capture settings
PublicationThe broad objective of the present research is the analysis of spoken English employing a multiplicity of modalities. An important stage of this process, discussed in the paper, is creating a database of speech accompanied by facial expressions. Recordings of speakers were made using an advanced system for capturing facial muscle motion. A brief historical outline, current applications, limitations and the ways of capturing face...
-
Identification of Emotions Based on Human Facial Expressions Using a Color-Space Approach
PublicationHCI technology improves human-computer interaction. Such communication can make use of emotions, which are visible on the human face from birth. In this paper the Emotion system for detecting and recognizing facial expressions, developed as part of an MSc project, is presented. The system recognizes emotions from webcam video in real time. It is based on color segmentation and morphological operations. The system uses a...
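A minimal sketch of the kind of color-space segmentation plus morphological clean-up the abstract mentions; the YCrCb skin bounds and OpenCV calls are generic rule-of-thumb assumptions, not the Emotion system's actual parameters.

```python
# Illustrative sketch only: skin-region mask via color thresholding followed
# by morphological opening/closing.
import cv2
import numpy as np

def skin_mask(bgr_frame: np.ndarray) -> np.ndarray:
    ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)      # assumed skin bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckles
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes
    return mask

# Hypothetical usage: mask = skin_mask(frame_from_webcam)
```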
-
Bimodal Emotion Recognition Based on Vocal and Facial Features
PublicationEmotion recognition is a crucial aspect of human communication, with applications in fields such as psychology, education, and healthcare. Identifying emotions accurately is challenging, as people use a variety of signals to express and perceive emotions. In this study, we address the problem of multimodal emotion recognition using both audio and video signals, to develop a robust and reliable system that can recognize emotions...
-
Preliminary Study on Automatic Recognition of Spatial Expressions in Polish Texts
Publication -
Recovery of facial expressions using functional electrical stimulation after full-face transplantation
Publication -
Brain-Inspired Deep Networks for Facial Expression Recognition. Frontiers in Biomedical Technologies
Publication -
Enantiomeric self-recognition of a facial amphiphile triggered by [{Pd(ONO2)(en)}2]
PublicationThe synthesis of glycoluril derivatives forming dimeric structures in aqueous solution and leading to enantiomeric self-recognition is described. These structures were obtained through coordination to [{Pd(ONO2)(en)}2].
-
Emotion Recognition for Affect Aware Video Games
PublicationIn this paper the idea of affect-aware video games is presented. A brief review of automatic multimodal affect recognition based on facial expressions and emotions is given. The first results of emotion recognition using depth data, as well as a prototype affect-aware video game, are presented.
-
DevEmo—Software Developers’ Facial Expression Dataset
PublicationThe COVID-19 pandemic has increased the relevance of remote activities and digital tools for education, work, and other aspects of daily life. This reality has highlighted the need for emotion recognition technology to better understand the emotions of computer users and provide support in remote environments. Emotion recognition can play a critical role in improving the remote experience and ensuring that individuals are able...
-
Automatic Emotion Recognition in Children with Autism: A Systematic Literature Review
PublicationThe automatic emotion recognition domain brings new methods and technologies that might be used to enhance therapy of children with autism. The paper aims at the exploration of methods and tools used to recognize emotions in children. It presents a literature review study that was performed using a systematic approach and PRISMA methodology for reporting quantitative and qualitative results. Diverse observation channels and modalities...
-
Influence of Thermal Imagery Resolution on Accuracy of Deep Learning based Face Recognition
PublicationHuman-system interactions frequently require retrieval of key context information about the user and the environment. Image processing techniques have been widely applied in this area, providing details about recognized objects, people and actions. Considering remote diagnostics solutions, e.g. non-contact vital signs estimation and smart home monitoring systems that utilize a person's identity, security is a very important factor....
-
IMAGE CORRELATION AS A TOOL FOR TRACKING FACIAL CHANGES CAUSED BY EXTERNAL STIMULI
PublicationExpressions of the human face carry a lot of information, which is a valuable source in the areas of computer vision, remote sensing and affective computing. For years, by analyzing the movement of the skin and facial muscles, scientists have been trying to create the perfect tool, based on image analysis, for recognizing the emotional states of human beings. To create a reliable algorithm, it is necessary to explore and examine...
-
Detection of Face Position and Orientation Using Depth Data
PublicationIn this paper an original approach is presented for real-time detection of the user's face position and orientation based only on the depth channel from a Microsoft Kinect sensor, which can be used for facial analysis in scenes with poor lighting conditions, where traditional algorithms based on the optical channel may fail. Thus the proposed approach can support, or even replace, algorithms based on the optical channel or based on skeleton...
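One plausible depth-only baseline for what the abstract describes is sketched below (nearest-blob centroid for position, a least-squares plane fit for a rough orientation); it is an illustration under stated assumptions, not the paper's algorithm.

```python
# Illustrative sketch only: rough face position and orientation from a depth
# map, assuming the face is the closest sizeable region to the sensor.
import numpy as np

def face_position_and_normal(depth, near_margin_mm=150.0, min_pixels=500):
    valid = depth > 0
    if valid.sum() < min_pixels:
        return None
    nearest = depth[valid].min()
    face = valid & (depth < nearest + near_margin_mm)   # closest "blob"
    ys, xs = np.nonzero(face)
    zs = depth[face].astype(np.float32)
    centroid = (xs.mean(), ys.mean(), zs.mean())
    # Least-squares plane z = a*x + b*y + c; (a, b, -1) approximates the normal
    A = np.column_stack([xs, ys, np.ones_like(xs)]).astype(np.float32)
    (a, b, _), *_ = np.linalg.lstsq(A, zs, rcond=None)
    normal = np.array([a, b, -1.0])
    return centroid, normal / np.linalg.norm(normal)
```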
-
Emotion monitoring system for drivers
PublicationThis article describes a new approach to the issue of building a driver monitoring system. Existing systems focus, for example, on tracking eyelid and eyebrow movements that result from fatigue. We propose a different approach based on monitoring the state of emotions. Such a system assumes that by using an emotion model based on our own concept, referred to as the reverse Plutchik's paraboloid of emotions, the recognition of emotions...
-
Using Physiological Signals for Emotion Recognition
PublicationRecognizing users' emotions is a promising area of research in the field of human-computer interaction. It is possible to recognize emotions using facial expressions, audio signals, body poses, gestures, etc., but physiological signals are particularly useful in this field because they are spontaneous and not controllable. In this paper the problem of using physiological signals for emotion recognition is presented. The kinds of physiological...
-
WEB-CAM AS A MEANS OF INFORMATION ABOUT EMOTIONAL ATTEMPT OF STUDENTS IN THE PROCESS OF DISTANT LEARNING
PublicationNew methods in education are becoming more popular nowadays. Distance learning is a good example, in which teacher and student meet in a virtual environment. Because interaction in this virtual world might be complicated, it seems necessary to provide as many means as possible of confirming that the student is still engaged in the learning process. We would like to present the assumption that by means of a web-cam we will be able to track facial...
-
Mining inconsistent emotion recognition results with the multidimensional model
PublicationThe paper deals with the challenge of inconsistency in multichannel emotion recognition. The focus of the paper is to explore factors that might influence the inconsistency. The paper reports an experiment that used multi-camera facial expression analysis with multiple recognition systems. The data were analyzed using a multidimensional approach and data mining techniques. The study allowed us to explore camera location, occlusions...
-
An overview of humanoid robots
PublicationThe article presents an overview of the most popular humanoid robots, highlighting their more important features and comparing their basic characteristics, while taking into account the desired cognitive aspects of robotics development. Among the attainable features of the various humanoid designs available on the market, the main distinguishing factors are the number of degrees of freedom, the type of locomotion system used and the ability to express facial expressions,...
-
Evaluation Criteria for Affect-Annotated Databases
PublicationIn this paper a set of comprehensive evaluation criteria for affect-annotated databases is proposed. These criteria can be used for evaluating the quality of a database at the stage of its creation, as well as for the evaluation and comparison of existing databases. The usefulness of these criteria is demonstrated on several databases selected from the affective computing domain. The databases contain different kinds of data: video or still...
-
Biometric access control
PublicationThe algorithm for detecting and identifying a person on the basis of facial nodal points is described in detail. The concepts of biometrics, the biometric measurement process, biometric identification methods and access control are defined. The developed biometric identification system based on artificial neural networks is described. The research results are reported and discussed in depth. Biometrics is the study of automated...
-
Selection of Features for Multimodal Vocalic Segments Classification
PublicationEnglish speech recognition experiments are presented employing both the audio signal and Facial Motion Capture (FMC) recordings. The principal aim of the study was to evaluate the influence of feature vector dimension reduction on the accuracy of vocalic segment classification employing neural networks. Several parameter reduction strategies were adopted, namely Extremely Randomized Trees, Principal Component Analysis and Recursive...
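A minimal sketch of one of the reduction strategies named above (Principal Component Analysis) placed in front of a neural-network classifier; the component count, layer sizes and scikit-learn pipeline are illustrative assumptions, not the study's configuration.

```python
# Illustrative sketch only: dimension reduction (PCA) followed by a neural
# network classifying vocalic segments from combined audio + FMC features.
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

pipeline = make_pipeline(
    StandardScaler(),
    PCA(n_components=20),                            # reduced feature vector
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500),
)

# Hypothetical usage with feature matrices X_train/X_test and labels:
# pipeline.fit(X_train, y_train)
# print(pipeline.score(X_test, y_test))
```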
-
AffecTube — Chrome extension for YouTube video affective annotations
PublicationThe shortage of emotion-annotated video datasets suitable for training and validating machine learning models for facial expression-based emotion recognition stems primarily from the significant effort and cost required for manual annotation. In this paper, we present AffecTube as a comprehensive solution that leverages crowdsourcing to annotate videos directly on the YouTube platform, resulting in ready-to-use emotion-annotated...
-
Fully Automated AI-powered Contactless Cough Detection based on Pixel Value Dynamics Occurring within Facial Regions
PublicationIncreased interest in non-contact evaluation of the health state has led to higher expectations for delivering automated and reliable solutions that can be conveniently used during daily activities. Although some solutions for cough detection exist, they suffer from a series of limitations. Some of them rely on gesture or body pose recognition, which might not be possible in cases of occlusions, closer camera distances or impediments...
-
USING NEURAL NETWORKS FOR THE SYNTHESIS OF SPEECH EXPRESSING EMOTIONS
PublicationThis article presents an analysis of speech-based emotion recognition solutions and the possibilities of using them in the synthesis of speech with emotions, employing neural networks for this purpose. Current solutions for emotion recognition in speech and methods of speech synthesis using neural networks are presented. A significant increase in interest in and use of deep learning is currently observed in applications related...
-
Remote Estimation of Video-Based Vital Signs in Emotion Invocation Studies
PublicationThe goal of this study is to examine the influence of various imitated and video-invoked emotions on the vital signs (respiratory and pulse rates). We also analyze the possibility of extracting these signals from sequences acquired with cost-effective cameras. The preliminary results show that the respiratory rate allows for better separation of some emotions than the pulse rate, yet this relation highly depends...
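The generic principle behind camera-based pulse estimation of this kind is sketched below (mean green-channel signal from a skin region, dominant frequency via FFT); the ROI choice, frame rate and frequency band are assumptions, not the study's pipeline.

```python
# Illustrative sketch only: pulse rate from the per-frame mean green value of
# a skin ROI. Band limits correspond to roughly 42-240 beats per minute.
import numpy as np

def pulse_bpm(green_means: np.ndarray, fps: float = 30.0) -> float:
    """green_means: 1-D array, one mean ROI green value per video frame."""
    signal = green_means - green_means.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0
```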
-
Thermal Images Analysis Methods using Deep Learning Techniques for the Needs of Remote Medical Diagnostics
PublicationRemote medical diagnostic solutions have recently gained more importance due to global demographic shifts and play a key role in the evaluation of health status during an epidemic. Contactless estimation of vital signs with image processing techniques is especially important since it allows health status to be obtained without the use of additional sensors. Thermography enables us to reveal additional details, imperceptible in images acquired...
-
LDNet: A Robust Hybrid Approach for Lie Detection Using Deep Learning Techniques
PublicationDeception detection is regarded as a concern for everyone in their daily lives and affects social interactions. The human face is a rich source of data that offers trustworthy markers of deception. Deception or lie detection systems based on identifying facial expressions are non-intrusive, cost-effective, and mobile. Over the last decade, numerous studies have been conducted on deception detection using several advanced techniques....
-
Normalization of face illumination using basic knowledge and information extracted from a single image
PublicationThis paper presents a method for face image normalization that can be applied to the extraction of illumination invariant facial features or used to remove bad lighting effects and produce high-quality, photorealistic results. Most of the existing approaches concentrate on separating the constant albedo from the variable light intensity; that concept, however, is based on the Lambertian model, which fails in the presence of specularities...
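For contrast with the method described above, here is a minimal baseline from the Lambertian-style normalization family it improves on (log transform, subtraction of a smoothed illumination estimate); this is only an assumed baseline illustration, not the paper's approach.

```python
# Illustrative baseline only: homomorphic-style illumination normalization of
# a grayscale face image. The paper's method targets cases (e.g. specular
# highlights) where this kind of baseline fails.
import cv2
import numpy as np

def normalize_illumination(gray: np.ndarray, sigma: float = 15.0) -> np.ndarray:
    log_img = np.log1p(gray.astype(np.float32))
    illumination = cv2.GaussianBlur(log_img, (0, 0), sigma)   # low-frequency light
    reflectance = log_img - illumination                      # keep facial detail
    out = cv2.normalize(reflectance, None, 0, 255, cv2.NORM_MINMAX)
    return out.astype(np.uint8)
```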
-
Expressions maghrébines
Journals -
Facial Feature extraction from a monochromatic picture
PublicationFace pose determination represents an important area of research in Human Machine Interaction. In this paper, I describe a new method of extracting facial feature locations from a single monochromatic monocular camera for the purpose of estimating and tracking the three-dimensional pose of the human face and the eye-gaze direction.
-
Facial features extraction for color, frontal images
PublicationThe problem of extracting characteristic facial features is discussed. Several methods of feature extraction for color en-face photographs are presented. The methods are based mainly on color features related to specific regions of the human face. The usefulness of the presented methods was tested on a database of 100 en-face photographs.
-
Real-time facial feature tracking in poor quality thermal imagery
PublicationRecently, facial feature tracking systems have become more and more popular because of their many possible use cases. Especially in medical applications, the location of the face and facial features is very useful. Many studies have presented methods to detect and track facial features in visible light. However, facial feature analysis in thermography may also be very advantageous. Some examples of using infrared imagery in medicine include...
-
Driver fatigue detection method based on facial image analysis
PublicationNowadays, ensuring road safety is a crucial issue that demands continuous development and measures to minimize the risk of accidents. This paper presents the development of a driver fatigue detection method based on the analysis of facial images. To monitor the driver's condition in real time, a video camera was used. The detection method is based on analyzing facial features related to the mouth area and eyes, such as...
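A minimal sketch of a widely used eye-closure cue of the kind such methods analyze (the eye aspect ratio); the landmark ordering, threshold and frame count are assumptions and may differ from the measures used in this paper.

```python
# Illustrative sketch only: eye aspect ratio (EAR) and a simple drowsiness
# flag when the eyes stay closed for a run of consecutive frames.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: (6, 2) landmarks ordered corner, top, top, corner, bottom, bottom."""
    v1 = np.linalg.norm(eye[1] - eye[5])
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])
    return (v1 + v2) / (2.0 * h)

def looks_drowsy(ear_values, thr=0.2, min_consecutive=15):
    below = 0
    for ear in ear_values:
        below = below + 1 if ear < thr else 0
        if below >= min_consecutive:
            return True
    return False
```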