From signals to emotions: Applying emotion models to HM interactions

Maria Rita Ciceri, Stefania Balzarotti

Research output: Contribution to book › Chapter

Abstract

In this chapter we present a dimensional semantic model, the Multidimensional Emotional Appraisal Semantic Space (MEAS), which locates emotion in a four-axis space in an attempt to detect rules linking patterns of signals to the underlying emotional dimensions of novelty, valence, coping and arousal. Although the development of this set of rules is still in progress, the model is intended to offer guidance to the area of Affective Computing concerned with emotion decoding, i.e. the design and implementation of automatic emotion recognizers. Within this field, we address the subtask of semantic attribution: once the machine is able to capture and process the multimodal signals (and patterns of signals) exhibited by the human user during the interaction, how can an emotional meaning be attributed to them (in other words, how can they be labelled)?

First of all, it is important to note that despite the use of the term «rule», the MEAS scoring system is not meant as a set of fixed and stable laws to be rigidly applied to every type of HM interaction and context. That would contradict one of the principles on which the system is based (embodiment) and the more general conception of HM interaction proposed here. Concerning the former, as previously explained, the MEAS system is conceived as strictly linked to the context, that is, to the type of running task and to the actions performed by the human user. Concerning the latter, in our view the machine should use the user's emotional signals to tune to his or her emotional state (a process of attunement). Moreover, as clearly shown by our data, users themselves show emotional responses that are strongly influenced by, and congruent with, the type of eliciting stimuli and the way they are appraised. Therefore, the MEAS rules are flexible and may change, adjusting to these contextual elements.

Second, the MEAS system is designed to record the continuous modifications of emotional dimensions rather than the number of appearances of certain emotion categories, since it is based on a theoretical conception of emotion as a process rather than a state.
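Although the chapter presents no code, the general scheme outlined above (context-dependent rules that map observed signal patterns onto continuous values along the four appraisal dimensions) can be illustrated with a short sketch. The Python fragment below is an illustration under stated assumptions, not the MEAS scoring system itself; every identifier, signal name and threshold (AppraisalState, raised_brows_rule, brow_raise, the 0.5 cutoff) is hypothetical.

```python
# Minimal sketch (not from the chapter): one possible data layout for a
# MEAS-style annotation, tracking continuous values on the four appraisal
# dimensions instead of counting discrete emotion categories.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class AppraisalState:
    """A point in the four-axis appraisal space at a given time (seconds)."""
    time: float
    novelty: float = 0.0
    valence: float = 0.0
    coping: float = 0.0
    arousal: float = 0.0


# A "rule" here is a context-dependent mapping from an observed pattern of
# multimodal signals (one frame) to adjustments of the four dimensions.
Rule = Callable[[Dict[str, float], str], Dict[str, float]]


def raised_brows_rule(signals: Dict[str, float], task: str) -> Dict[str, float]:
    """Illustrative, hypothetical rule: a marked brow raise during a demanding
    task is read as increased novelty and arousal. Thresholds are placeholders."""
    if signals.get("brow_raise", 0.0) > 0.5 and task == "problem_solving":
        return {"novelty": 0.3, "arousal": 0.2}
    return {}


def apply_rules(frames: List[Dict[str, float]], task: str,
                rules: List[Rule], dt: float = 0.04) -> List[AppraisalState]:
    """Turn a stream of signal frames into a continuous appraisal trajectory."""
    trajectory: List[AppraisalState] = []
    state = AppraisalState(time=0.0)
    for i, frame in enumerate(frames):
        deltas: Dict[str, float] = {}
        for rule in rules:
            for dim, value in rule(frame, task).items():
                deltas[dim] = deltas.get(dim, 0.0) + value
        state = AppraisalState(
            time=i * dt,
            novelty=state.novelty + deltas.get("novelty", 0.0),
            valence=state.valence + deltas.get("valence", 0.0),
            coping=state.coping + deltas.get("coping", 0.0),
            arousal=state.arousal + deltas.get("arousal", 0.0),
        )
        trajectory.append(state)
    return trajectory
```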
Original language: English
Title of host publication: Affective Computing. Emotion Modelling, Synthesis and Recognition
Pages: 271-296
Number of pages: 26
Publication status: Published - 2008

Keywords

  • HM interaction
  • emotion
  • non-verbal communication
