Title (eng)
Applied Emotion AI: Usage and Misuse
With an Example from Facial Expression Recognition in Video Conferences
Author
Description (eng)
Emotion AI, or Affective Computing, deals with the ability of machines to recognize human emotions. Our physical signals can be analyzed and categorized, which makes it possible to train machines to recognize emotions or to simulate them. This changes how we interact with technology, and it could also change how we interact with each other. Research activity in this field is growing, as is the number of companies and products on the market that apply Emotion AI. According to a recent forecast, emotion detection and recognition is a rapidly growing market that will be worth more than 42 billion USD by 2027. In this paper, we give an overview of Emotion AI companies and applications, with examples of its use as well as of its misuse.
As technology becomes ubiquitous in interpersonal interactions and activities, Emotion AI could make our tool-based interactions more human-like. Emotions play a central role in our communication as well as in our decision-making and should therefore receive more attention, even in business environments. Since Covid-19, more and more meetings have been held virtually, which has advantages but also many disadvantages. For example, the transmission of non-verbal signals becomes more difficult, which changes our interaction behavior. People also report exhaustion caused by the large number of video conferences, the so-called Zoom fatigue phenomenon.
In our previous research, we conducted small-scale user studies with Facial Expression Recognition (FER) in video conferences with the goal of making emotions more visible in virtual environments. We compared the analysis results of the FER tool with the assessments of human observers and of the participants themselves; the results obtained from the tool, the human observers, and the participants varied in certain situations. In this paper, we present a more advanced and human-centered approach, in which a set of different physical signals can be integrated in addition to facial expressions, and in which people can approve, reject, or stop the tool-based analysis. This semi-automated adaptive user interface would reduce inaccuracies produced by the tool and give participants greater control. Possible usage scenarios include making signs of emotions more visible, for example through enhanced facial expressions on an avatar or by depicting emotions using metaphors. This would allow people to convey emotions more clearly during video conferences. For this purpose, we use our SitAdapt system, an integrated software system that enables situation-aware real-time adaptations for web and mobile applications. It uses the APIs of different devices, such as eye-tracker, wristband, facial expression, and EEG signal recognition software, as well as metadata from the application, to collect data about the user. The included rule editor allows the definition and modification of situation rules, e.g., for specifying different user states and the resulting actions. The rule editor can use all input data types and attribute values, as well as their temporal changes, to formulate rule conditions. At runtime, the adaptation component triggers the rules and adapts the user interface whenever the conditions of one or more rules apply.
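To make the rule mechanism more concrete, the following minimal sketch shows how a situation rule with a condition and an adaptation action could be expressed and triggered at runtime. The class names, attributes, threshold values, and the show_emotion_metaphor action are illustrative assumptions for this sketch; they do not reflect the actual SitAdapt implementation or its APIs.

```python
# Minimal sketch of a situation rule and one runtime evaluation step.
# All names and values here are hypothetical, not the real SitAdapt API.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Situation:
    """Snapshot of the signals collected for one participant."""
    facial_emotion: str        # label reported by the FER tool, e.g. "joy"
    emotion_confidence: float  # confidence of the FER result (0..1)
    heart_rate: int            # value from the wristband API
    user_consent: bool         # participant approved the tool-based analysis


@dataclass
class SituationRule:
    """A rule: a condition over the situation and an action on the UI."""
    name: str
    condition: Callable[[Situation], bool]
    action: Callable[[Situation], None]


def show_emotion_metaphor(situation: Situation) -> None:
    # Placeholder adaptation: depict the recognized emotion on an avatar
    # or as a metaphor in the video-conference view.
    print(f"Displaying '{situation.facial_emotion}' cue in the meeting view")


rules: List[SituationRule] = [
    SituationRule(
        name="visible-joy",
        # Adapt only if the participant approved the analysis and the
        # FER result is sufficiently confident (threshold is illustrative).
        condition=lambda s: s.user_consent
        and s.facial_emotion == "joy"
        and s.emotion_confidence >= 0.8,
        action=show_emotion_metaphor,
    ),
]


def adaptation_step(situation: Situation, rule_set: List[SituationRule]) -> None:
    """Runtime loop body: trigger every rule whose condition applies."""
    for rule in rule_set:
        if rule.condition(situation):
            rule.action(situation)


# Example: one evaluation cycle with data collected from the device APIs.
adaptation_step(
    Situation(facial_emotion="joy", emotion_confidence=0.92,
              heart_rate=74, user_consent=True),
    rules,
)
```

The consent flag in the condition mirrors the semi-automated character of the approach: no adaptation is performed unless the participant has approved the tool-based analysis.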
In addition to presenting our approach with SitAdapt, we aim to give an overview of Emotion AI with its applications and examples of companies, products, and misuses, since these are fast-growing technologies in a fast-growing market, which raises important ethical questions.
Keywords (eng)
Emotion AI, Affective Computing, Emotion Recognition, Human-Centered AI, Business Context, Video Conferences, Virtual Collaboration, Adaptive Systems
Subject (eng)
ÖFOS 2012 -- 5080 -- Media and Communication Sciences
Subject (eng)
ÖFOS 2012 -- 202022 -- Information technology
Subject (eng)
ÖFOS 2012 -- 202002 -- Audiovisual media
Subject (eng)
ÖFOS 2012 -- 6040 -- Arts
Type (eng)
Language
[eng]
Persistent identifier
DOI
Publication
St. Pölten University of Applied Sciences, St. Pölten, 2024-11-27
Access rights (eng)
License
- Citable links: https://phaidra.fhstp.ac.at/o:7209
- Rights statement: open access
- Resource type: Text (PDF)
- Format: application/pdf
- Created: 09.07.2025 11:38:54 UTC