Human emotion interpreter: a proposed multi-dimension system for emotion recognition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Human emotion interpretation contributes greatly to Human-Machine Interfaces (HMI), spanning applications in health care, education, and entertainment. Affective interactions have the most influence when emotion recognition is available to both humans and computers. However, developing robust emotion recognizers is a challenging task in terms of modality, feature selection, classifier design, and database design. Most leading research uses facial features, yet verbal communication is also fundamental for sensing affective state, especially when visual information is occluded or unavailable. Recent work deploys audiovisual data in bi-modal emotion recognizers. Adding further information, e.g. gesture analysis, event/scene understanding, and speaker identification, helps increase recognition accuracy. As classification of human emotions can be considered a multi-modal pattern recognition problem, in this paper we propose the schematics of a multi-dimension system for automatic human emotion recognition.
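The abstract frames emotion recognition as a multi-modal pattern recognition problem. As a purely illustrative sketch (not the system proposed in the paper), decision-level "late" fusion combines per-modality classifier outputs into one decision; the modality names, weights, and scores below are hypothetical:

```python
# Illustrative sketch: decision-level (late) fusion of per-modality
# emotion classifiers. All names, weights, and scores are hypothetical
# and are not taken from the paper.

EMOTIONS = ["anger", "happiness", "neutral", "sadness"]

def fuse_late(scores_per_modality, weights):
    """Weighted average of per-modality probability vectors.

    scores_per_modality: dict modality -> list of class probabilities
        (one per emotion, ordered as in EMOTIONS).
    weights: dict modality -> non-negative reliability weight.
    """
    total_w = sum(weights[m] for m in scores_per_modality)
    fused = [0.0] * len(EMOTIONS)
    for modality, probs in scores_per_modality.items():
        w = weights[modality] / total_w  # normalize so fused probs sum to 1
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

def predict(fused):
    """Return the emotion label with the highest fused probability."""
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

# Example: the visual channel is confident of happiness, while audio
# slightly favours neutral; fusion is weighted toward the visual channel.
scores = {
    "face":  [0.05, 0.80, 0.10, 0.05],
    "audio": [0.10, 0.30, 0.40, 0.20],
}
weights = {"face": 0.6, "audio": 0.4}
print(predict(fuse_late(scores, weights)))  # happiness
```

Late fusion is only one option; feature-level (early) fusion concatenates per-modality feature vectors before a single classifier, trading robustness to a missing modality for the ability to model cross-modal correlations.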
Original language: English
Title of host publication: Proceedings of SAI Intelligent Systems Conference (IntelliSys) 2016
Editors: Yaxin Bi, Supriya Kapoor, Rahul Bhatia
Publisher: Springer
Pages: 922-931
Number of pages: 10
Volume: 1
ISBN (Electronic): 9783319569949
ISBN (Print): 9783319569932
DOIs: 10.1007/978-3-319-56994-9
Publication status: Published - 2018
Externally published: Yes
Event: SAI Intelligent Systems Conference 2016 - CentrEd at ExCel, London, United Kingdom
Duration: 21 Sep 2016 - 22 Sep 2016
http://saiconference.com/Conferences/IntelliSys2016

Publication series

Name: Lecture Notes in Networks and Systems
Publisher: Springer
Volume: 15
ISSN (Print): 2367-3370
ISSN (Electronic): 2367-3389

Conference

Conference: SAI Intelligent Systems Conference 2016
Abbreviated title: IntelliSys 2016
Country: United Kingdom
City: London
Period: 21/09/16 - 22/09/16
Internet address: http://saiconference.com/Conferences/IntelliSys2016

Fingerprint

Schematic diagrams
Health care
Pattern recognition
Feature extraction
Classifiers
Education
Communication

Cite this

ElSayed, S. (2018). Human emotion interpreter: a proposed multi-dimension system for emotion recognition. In Y. Bi, S. Kapoor, & R. Bhatia (Eds.), Proceedings of SAI Intelligent Systems Conference (IntelliSys) 2016 (Vol. 1, pp. 922-931). (Lecture Notes in Networks and Systems; Vol. 15). Springer. https://doi.org/10.1007/978-3-319-56994-9