Search results for: EMOTION RECOGNITION
-
Automatic recognition of males and females among web browser users based on behavioural patterns of peripherals usage
Publication: Purpose: The purpose of this paper is to answer the question of whether it is possible to recognise the gender of a web browser user on the basis of keystroke dynamics and mouse movements. Design/methodology/approach: An experiment was organised to track mouse and keyboard usage using a special web browser plug-in. After collecting the data, a number of parameters describing the users’ keystrokes, mouse movements and clicks...
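The abstract above describes classifying gender from keystroke and mouse parameters. Purely as a hedged sketch, not the paper's actual method, the general shape of such a classifier (a table of per-user behavioural parameters fed into a standard classifier) could look as follows; the feature names and data are invented for illustration.

```python
# Hypothetical sketch: gender classification from behavioural features.
# Feature names and data are invented for illustration, not the paper's parameters.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_users = 120
features = pd.DataFrame({
    "mean_dwell_ms":  rng.normal(100, 20, n_users),   # average key-hold time
    "mean_flight_ms": rng.normal(150, 40, n_users),   # average time between keys
    "mouse_speed_px": rng.normal(600, 150, n_users),  # average cursor speed
    "click_rate_hz":  rng.normal(0.8, 0.3, n_users),  # clicks per second
})
labels = rng.integers(0, 2, n_users)  # 0 = female, 1 = male (placeholder labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, features, labels, cv=5).mean())  # chance-level on random data
```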
-
MACHINE LEARNING APPLICATIONS IN RECOGNIZING HUMAN EMOTIONS BASED ON THE EEG
Publication: This study examined a machine learning-based approach allowing the recognition of human emotional states with the use of EEG signals. After a short introduction to the fundamentals of electroencephalography and neural oscillations, the two-dimensional valence-arousal Russell model of emotion was described. Next, we present the assumptions of the performed EEG experiment. Detailed aspects of the data sanitization, including preprocessing,...
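As a hedged illustration only, not the authors' actual pipeline, the sketch below shows one common way such a valence/arousal classifier can be built: band-power features per EEG channel fed into a standard scikit-learn classifier. The band definitions, channel count, sampling rate and labels are assumptions.

```python
# Hypothetical sketch: EEG band-power features -> valence class.
# Bands, sampling rate, channel count and labels are assumed, not taken from the paper.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128  # sampling rate [Hz], assumed
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """epoch: (n_channels, n_samples) -> flat vector of per-channel band powers."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, idx].mean(axis=-1))
    return np.concatenate(feats)

# epochs: (n_trials, n_channels, n_samples); labels: 0 = low, 1 = high valence
epochs = np.random.randn(60, 14, FS * 5)   # placeholder data
labels = np.random.randint(0, 2, size=60)  # placeholder labels
X = np.array([band_powers(e) for e in epochs])
print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
```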
-
USING NEURAL NETWORKS FOR THE SYNTHESIS OF SPEECH EXPRESSING EMOTIONS
Publication: This article presents an analysis of speech-based emotion recognition solutions and the possibilities of using them in emotional speech synthesis, employing neural networks for this purpose. Current solutions for emotion recognition in speech and methods of speech synthesis using neural networks are presented. A significant increase in interest in and use of deep learning can currently be observed in applications related to...
-
Analysis of human behavioral patterns
Publication: The widespread use of the Internet and mobile devices has entailed growing security requirements, which in turn have brought about the development of biometric methods. However, a specially designed biometric system may infer more about users than just verifying their identity. Proper analysis of users’ characteristics may also tell much about their skills, preferences and feelings. This chapter presents biometric methods applied in several...
-
Discovering Rule-Based Learning Systems for the Purpose of Music Analysis
Publication: Music analysis and processing aim at understanding information retrieved from music (Music Information Retrieval). For the purpose of music data mining, machine learning (ML) methods or statistical approaches are employed. Their primary tasks are the recognition of musical instrument sounds, music genre or emotion contained in music, audio identification, assessment of audio content, etc. In terms of computational approach, music databases...
-
EMBOA - affective loop in Socially Assistive Robotics as an intervention tool for children with autism
e-Learning Courses: The aim of the training course "Intensive programmes for higher education learner" within the EMBOA project is to familiarise participants with the use of social robots as an intervention tool for children with autism, with emotion recognition, and with the combination of both methods. Students will be informed about the guidelines and results of the project.
-
Music Mood Visualization Using Self-Organizing Maps
Publication: Due to the increasing amount of music being made available in digital form on the Internet, automatic organization of music is sought. The paper presents an approach to the graphical representation of the mood of songs based on Self-Organizing Maps. Parameters describing the mood of music are proposed, calculated and then analyzed by employing correlation with mood dimensions based on Multidimensional Scaling. A map is created in which...
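Purely as an illustrative sketch, not the paper's implementation, a minimal Self-Organizing Map placing songs on a 2-D mood grid could look like this; the grid size, learning schedule and mood descriptors are all assumptions.

```python
# Minimal Self-Organizing Map sketch for placing songs on a 2-D mood grid.
# Hypothetical: grid size, learning schedule and mood features are assumed.
import numpy as np

def train_som(data, grid=(8, 8), iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    weights = rng.random((grid[0], grid[1], data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(grid[0]), np.arange(grid[1]),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # best-matching unit: grid cell whose weight vector is closest to x
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), grid)
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # neighbourhood function
        weights += lr * h * (x - weights)
    return weights

# songs described by assumed mood features, e.g. [energy, valence, normalized tempo]
songs = np.random.rand(100, 3)
som = train_som(songs)
cell = np.unravel_index(np.argmin(((som - songs[0]) ** 2).sum(-1)), som.shape[:2])
print("song 0 maps to grid cell", cell)
```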
-
Affect-awareness framework for intelligent tutoring systems
Publication: The paper proposes a framework for the construction of Intelligent Tutoring Systems (ITS) that take into consideration student emotional states and make affective interventions. The paper provides definitions of 'affect-aware systems' and 'affective interventions' and describes the concept of the affect-awareness framework. The proposed framework separates emotion recognition from its definition, processing and making decisions on...
-
Methodology of Affective Intervention Design for Intelligent Systems
Publication: This paper concerns how intelligent systems should be designed to make adequate, valuable and natural affective interventions. The article proposes a process for choosing an affective intervention model for an intelligent system. The process consists of 10 activities that allow for step-by-step design of an affective feedback loop and takes into account the following factors: expected and desired emotional states, characteristics...
-
Robot-Based Intervention for Children With Autism Spectrum Disorder: A Systematic Literature Review
Publication: Children with autism spectrum disorder (ASD) have deficits in the socio-communicative domain and frequently face severe difficulties in the recognition and expression of emotions. Existing literature suggested that children with ASD benefit from robot-based interventions. However, studies varied considerably in participant characteristics, applied robots, and trained skills. Here, we reviewed robot-based interventions targeting...
-
Detection of Face Position and Orientation Using Depth Data
Publication: In this paper, an original approach is presented for real-time detection of the user's face position and orientation based only on the depth channel of a Microsoft Kinect sensor, which can be used for facial analysis in scenes with poor lighting conditions, where traditional algorithms based on the optical channel may fail. Thus the proposed approach can support, or even replace, algorithms based on the optical channel or on skeleton...
-
Agnieszka Landowska dr hab. inż.
People: Agnieszka Landowska works at Gdansk University of Technology, FETI, Department of Software Engineering. Her research concentrates on usability, accessibility and technology adoption, as well as affective computing methods. She initiated the Emotions in HCI Research Group and conducts research on User eXperience evaluation of applications and other technologies.
-
Emotions in polish speech recordings
Open Research Data: The data set presents emotions recorded in sound files that are expressions of Polish speech. Statements were made by people aged 21-23: young voices of 5 men. Each person said the following words /nie - no, oddaj - give back, podaj - pass, stop - stop, tak - yes, trzymaj - hold/ five times, representing a specific emotion, one of three: anger (a),...
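A hedged example of how such recordings might be turned into features for emotion classification (this is not part of the dataset documentation); the file name, sampling settings and MFCC count are assumptions.

```python
# Hypothetical feature extraction for Polish emotional speech recordings.
# File name, sampling settings and MFCC count are assumptions, not dataset specs.
import librosa
import numpy as np

def mfcc_features(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=None)  # keep the original sampling rate
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    # summarize the time axis so each recording becomes one fixed-length vector
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

features = mfcc_features("anger/oddaj_01.wav")  # hypothetical file name
print(features.shape)                           # (26,)
```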
-
Identification of Emotional States Using Phantom Miro M310 Camera
Publication: The purpose of this paper is to present the possibilities associated with the use of remote sensing methods in identifying human emotional states, and to present the results of the research conducted by the authors in this field. The studies presented involved the use of advanced image analysis to identify areas on the human face that change their activity along with emotional expression. Most of the research carried out in laboratories...
-
IMAGE CORRELATION AS A TOOL FOR TRACKING FACIAL CHANGES CAUSED BY EXTERNAL STIMULI
Publication: Expressions of the human face carry a lot of information and are a valuable source in the areas of computer vision, remote sensing and affective computing. For years, by analyzing the movement of the skin and facial muscles, scientists have been trying to create the perfect tool, based on image analysis, for recognizing the emotional states of human beings. To create a reliable algorithm, it is necessary to explore and examine...
-
A Concept of Automatic Film Color Grading Based on Music Recognition and Evoked Emotions
Publication: The article presents aspects of the final selection of the color of shots in film production based on the psychology of color. First, the elements of color processing, contrast, saturation and white balance in film shots are presented and the definition of color grading is given. In the second part of the article, an analysis of film music is conducted in the context of stimulating appropriate emotions while watching...
-
Michał Czubenko dr inż.
People: Michał Czubenko is a distinguished 2009 graduate of the Faculty of Electronics, Telecommunications, and Informatics at Gdańsk University of Technology, specializing in the discipline of automatic control and robotics. Currently, he serves as an adjunct in the Department of Robotics and Decision Systems at the same institution. In 2012, he embarked on a three-month internship at Kingston University London, broadening his horizons...
-
Neural oscillations induced by IAPS pictures (session 24)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 23)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 25)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 26)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Keystroke Dynamics Patterns While Writing Positive and Negative Opinions
Publication: This paper deals with the analysis of behavioural patterns in human–computer interaction. In the study, keystroke dynamics were analysed while participants were writing positive and negative opinions. A semi-experiment with 50 participants was performed. The participants were asked to recall their most negative and most positive learning experiences (subject and teacher) and write an opinion about them. Keystroke dynamics were captured and...
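As a purely illustrative sketch, not the study's actual processing, two features commonly derived from keystroke logs, dwell time and flight time, could be computed from timestamped key events as follows; the event format is an assumption.

```python
# Hypothetical computation of common keystroke-dynamics features.
# Event format (key, "down"/"up", timestamp in ms) is an assumption, not the paper's format.
from statistics import mean

events = [  # placeholder log
    ("g", "down", 0), ("g", "up", 95), ("o", "down", 180), ("o", "up", 260),
    ("o", "down", 340), ("o", "up", 430), ("d", "down", 520), ("d", "up", 600),
]

def dwell_and_flight(events):
    dwells, flights = [], []
    pending, last_up = {}, None
    for key, action, t in events:
        if action == "down":
            if last_up is not None:
                flights.append(t - last_up)      # previous release -> this press
            pending[key] = t
        else:
            dwells.append(t - pending.pop(key))  # press -> release of the same key
            last_up = t
    return mean(dwells), mean(flights)

print(dwell_and_flight(events))  # (mean dwell, mean flight) in milliseconds
```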
-
Emotion Monitoring – Verification of Physiological Characteristics Measurement Procedures
Publication: This paper concerns measurement procedures on an emotion monitoring stand designed for tracking human emotions in human-computer interaction with physiological characteristics. The paper addresses the key problem of physiological measurements being disturbed by motion typical for human-computer interaction, such as keyboard typing or mouse movements. An original experiment...
-
How to Design Affect-aware Educational Systems – the AFFINT Process Approach
Publication: Computer systems that support learning processes can adapt to the needs and states of a learner. The adaptation might directly address knowledge deficits, and most tutoring systems apply an adaptable learning path of that kind. Apart from the preliminary knowledge state, there are more factors that influence education effectiveness, among them fluctuating emotional states. The tutoring systems may recognize or...
-
Evaluation of affective intervention process in development of affect-aware educational video games
Publication: In this paper, initial experiences are presented on implementing a specific methodology of affective intervention design (AFFINT) for the development of affect-aware educational video games. In the described experiment, 10 student teams are to develop affect-aware educational video games using AFFINT to formalize the whole process. Although all projects are still in progress, first observations and conclusions may already be presented.
-
Emotion Monitor - Concept, Construction and Lessons Learned
Publication: This paper concerns the design and physical construction of an emotion monitor stand for tracking human emotions in human-computer interaction using a multi-modal approach. The concept of the stand, using cameras, behavioral analysis tools and a set of physiological sensors such as galvanic skin response, blood-volume pulse, temperature, breath and electromyography, is presented and followed...
-
Modeling emotions for affect-aware applications
Publication: The chapter concerns the representation and modeling of emotional states for software systems that deal with human affect. A review of emotion representation models is provided, including discrete, dimensional and componential models. The paper also provides an analysis of emotion models used in diverse types of affect-aware applications: games, mood trackers or tutoring systems. The analysis is supported with two design cases. The study...
-
Using neural networks for the synthesis of speech expressing emotions
Publication: This article presents an analysis of speech-based emotion recognition solutions and the possibilities of using them in emotional speech synthesis, employing neural networks for this purpose. The usefulness of parameters typically used for speech recognition in detecting emotions in singing and in distinguishing those emotions in both cases is also indicated. Current solutions for emotion recognition in speech and methods of synthesis...
-
Emotions in the software development process
Publication: This paper presents the results of a survey on the experience of emotions in the work of software developers. Quantitative analysis revealed information about emotions affecting programmers, their frequency and their impact on performance. The concept of emotional risk to productivity was also presented; it allows the emotional states that should be avoided to be identified. Furthermore, all collected data were analyzed with information...
-
FEEDB: A multimodal database of facial expressions and emotions
Publication: In this paper, the first version of the multimodal FEEDB database of facial expressions and emotions is presented. The database contains labeled RGB-D recordings of people expressing a specific set of expressions, recorded using a Microsoft Kinect sensor. Such a database can be used for classifier training and testing in face recognition as well as in the recognition of facial expressions and human emotions. Also, initial experiences...
-
An unorthodox view on the problem of tracking facial expressions
Publication: Recent developments in imaging cameras have opened a new way of analyzing facial expressions. We would like to take advantage of this new technology and present a method of imaging and processing images of the human face as a response to particular stimuli. The response in this case is represented by the facial expressions, and the stimuli are still images representing the six basic emotions according to Ekman. The working hypothesis...
-
Identification of Emotions Based on Human Facial Expressions Using a Color-Space Approach
Publication: HCI technology improves human-computer interaction. Such communication can be carried out with the use of emotions, which are visible on the human face from birth. In this paper, the Emotion system for detecting and recognizing facial expressions, developed in an MSc thesis, is presented. The system recognizes emotions from webcam video in real time. It is based on color segmentation and morphological operations. The system uses a...
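As a hedged illustration of the general idea, not the described Emotion system itself, skin-region segmentation in a colour space followed by morphological cleanup might look like the sketch below; the YCrCb thresholds and kernel size are assumptions.

```python
# Hypothetical sketch: colour-space skin segmentation + morphological cleanup.
# Thresholds and kernel size are assumptions, not values from the described system.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                  # webcam
kernel = np.ones((5, 5), np.uint8)
lower = np.array([0, 135, 85], np.uint8)   # assumed YCrCb skin range
upper = np.array([255, 180, 135], np.uint8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, lower, upper)                  # skin-coloured pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # remove small speckles
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)   # fill small holes
    cv2.imshow("skin mask", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```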
-
Neural oscillations induced by IAPS pictures (session 17)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 22)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 15)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 21)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 20)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 12)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 18)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 14)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 13)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 16)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 19)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
The set of 22 sessions of 14-channel eeg signals recorded during watching pictures
Open Research Data: The data were collected in order to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals collected from 10 persons within 22 sessions, during which 45 different random photos taken from the ImageNet...
-
Neural oscillations induced by IAPS pictures (session 10)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 1)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 9)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 5)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 2)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...
-
Neural oscillations induced by IAPS pictures (session 11)
Open Research Data: The data were collected to perform research on the possibility of controlling the content displayed on the monitor screen using human emotional states extracted from EEG signals. The dataset contains recordings of 14-channel EEG signals gathered from seven persons aged 23-35 within 26 sessions, during which 45 different random photos, taken from the...