Human emotion interpreter: a proposed multi-dimension system for emotion recognition

Research output: Contribution to conference › Paper

Abstract

Human emotion interpretation contributes greatly to human-machine interfaces (HMI), spanning applications in health care, education, and entertainment. Affective interactions can have the most influence when emotion recognition is available to both humans and computers. However, developing robust emotion recognizers is a challenging task in terms of modality, feature selection, and classifier and database design. Most leading research uses facial features, yet verbal communication is also fundamental for sensing affective state, especially when visual information is occluded or unavailable. Recent work deploys audiovisual data in bi-modal emotion recognizers, and adding further information, e.g. gesture analysis, event/scene understanding, and speaker identification, helps increase recognition accuracy. As the classification of human emotions can be considered a multi-modal pattern recognition problem, in this paper we propose the schematics of a multi-dimension system for automatic human emotion recognition.
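
The abstract frames multi-modal emotion recognition as a pattern recognition problem in which several cues (face, speech, gesture, scene, speaker identity) are combined. As a minimal sketch only, and not the system proposed in the paper, the following Python example illustrates one common combination strategy, decision-level (late) fusion, in which per-modality classifier scores are merged by weighted averaging; all modality names, weights, and scores here are illustrative assumptions.

"""Illustrative decision-level (late) fusion of per-modality emotion scores.

Minimal sketch only: modality names, weights, and scores are hypothetical
and do not come from the paper.
"""

from typing import Dict

EMOTIONS = ["happy", "sad", "angry", "neutral"]


def fuse_scores(modality_scores: Dict[str, Dict[str, float]],
                modality_weights: Dict[str, float]) -> Dict[str, float]:
    """Combine per-modality posterior scores by weighted averaging."""
    fused = {label: 0.0 for label in EMOTIONS}
    total_weight = sum(modality_weights.get(m, 0.0) for m in modality_scores)
    for modality, scores in modality_scores.items():
        weight = modality_weights.get(modality, 0.0)
        for label in EMOTIONS:
            fused[label] += weight * scores.get(label, 0.0)
    if total_weight > 0:
        fused = {label: value / total_weight for label, value in fused.items()}
    return fused


if __name__ == "__main__":
    # Hypothetical classifier outputs for one observation, one dict per modality.
    scores = {
        "face":    {"happy": 0.60, "sad": 0.10, "angry": 0.10, "neutral": 0.20},
        "speech":  {"happy": 0.40, "sad": 0.20, "angry": 0.25, "neutral": 0.15},
        "gesture": {"happy": 0.30, "sad": 0.15, "angry": 0.35, "neutral": 0.20},
    }
    # Example reliability weights, e.g. tuned on a validation set.
    weights = {"face": 0.5, "speech": 0.3, "gesture": 0.2}

    fused = fuse_scores(scores, weights)
    print("fused scores:", fused)
    print("predicted emotion:", max(fused, key=fused.get))

A fusion step of this kind is only one possible design choice; feature-level (early) fusion, where features from all modalities are concatenated before a single classifier, is an equally common alternative.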
Original language: English
Publication status: Published - 22 Sep 2016
Externally published: Yes
Event: SAI Intelligent Systems Conference 2016 - CentrEd at ExCeL, London, United Kingdom
Duration: 21 Sep 2016 - 22 Sep 2016
http://saiconference.com/Conferences/IntelliSys2016

Conference

Conference: SAI Intelligent Systems Conference 2016
Abbreviated title: IntelliSys 2016
Country: United Kingdom
City: London
Period: 21/09/16 - 22/09/16
Internet address: http://saiconference.com/Conferences/IntelliSys2016

Fingerprint

Schematic diagrams
Health care
Pattern recognition
Feature extraction
Classifiers
Education
Communication

Cite this

ElSayed, S. (2016). Human emotion interpreter: a proposed multi-dimension system for emotion recognition. Paper presented at SAI Intelligent Systems Conference 2016, London, United Kingdom.