Affective Computing

Also known as

  • Emotional computing
  • Emotion-oriented computing

History

Research into affect can be traced back to the 19th century; however, it was traditionally confined to living organisms. Interest in the computational aspects of affect (or emotion; the two terms appear to be used interchangeably) developed only in the last decade of the 20th century, largely due to the increased capacity of computers, which can now be used to model emotions. “Picard’s ‘Affective Computing’ in 1997 [1] established the field, and several EC projects made substantial contributions (NECA, ERMIS, SAFIRA, etc). The EC Network of Excellence HUMAINE now brings together many of the most active teams” (Cowie, 2005, p. 1).

Application Areas/Examples

Affective computing can play a role in any application area where humans and computers interact, i.e. in almost any application area imaginable. Interestingly, there seems to be little agreement in the affective computing community on what the eventual applications of this technology will be or should be; no “killer app” has emerged that would force a general breakthrough of the technology. A number of examples can be found in the literature, such as:

Emotional kitty (emotions in decision making)

BCI for profiling (data collection for affective computing)

Wearable brain imaging tool (detection of emotions)

Sensor systems (detection of emotions)

Robinson & el Kaliouby (2009) discuss a number of application areas. These include:

Supporting communication with humans who have difficulties with their affective system (e.g. autism spectrum disorder, anxiety and others)

Increasing the social-emotional intelligence of agents and robots

Facilitating real-world testing of models and theories of social cognition

Tao & Tan (2005) give a description of the main research groups in the area and their work, which covers a number of application examples. These include:

Incorporation of emotions into decision making

Creation and presentation of interactive drama

Emotion, Stress and Coping in Families with Adolescents: Assessing Personality Factors and Situational Aspects in an Experimental Computer Game

Research on architectures accounting for mental states

Social interaction with service robots

More detailed examples that lend themselves to an ethical analysis are:

Kitchen Robot[1]

“Imagine your robot entering the kitchen as you prepare breakfast for guests. The robot looks happy to see you and greets you with a cheery “Good morning.” You mumble something it does not understand. It notices your face, vocal tone, smoke above the stove, and your slamming of a pot into the sink, and infers that you do not appear to be having a good morning. Immediately, it adjusts its internal state to “subdued,” which has the effect of lowering its vocal pitch and amplitude settings, eliminating cheery behavioral displays, and suppressing unnecessary conversation. Suppose you exclaim unprintable curses that are out of character for you, yank your hand from the hot stove, rush to run your fingers under cold water, and mutter “… ruined the sauce.” While the robot’s speech recognition may not have high confidence that it accurately recognized all of your words, its assessment of your affect and actions indicates a high probability that you are upset and possibly hurt. At this moment it might turn its head with a look of concern, search its empathetic phrases and select, “Burn-Ouch … Are you OK?” and wait to see if you are, before selecting the semantically closest helpful response, “Shall I list sauce recipes?” As it goes about its work helping you, it watches for signs of your affective state changing – positive or negative. It may modify an internal model of you, representing what is likely to elicit displays such as pleasure or displeasure from you, and it may later try to generalize this model to other people, and to development of common sense about people’s affective states. It looks for things it does that are associated with improvements in the positive nature of your state, as well as things that might frustrate or annoy you. As it finds these, it also updates its own internal learning system, indicating which of its behaviors you prefer, so it becomes more effective in how it works with you.”

Affective Gaming

Games could take into account the players’ emotional state. Such technology could cover the following (Sykes & Brown, 2003):

- The ability to generate game content dynamically with respect to the affective state of the player.

Knowledge of the player’s affective state allows the game to deliver content at the most appropriate moment. For example, in ‘horror’ based games, the optimum effect of a loud noise will only occur if produced when the player is incredibly tense.

- The ability to communicate the affective state of the game player to third parties.

With the advent of on-line gaming it is more frequently the case that the player’s opponent is not physically present. However, it is often the emotional involvement of other players that shapes our enjoyment of a game. Affective Gaming technology can address this issue by having the on-screen persona reflect the player’s emotional state.

- The adoption of new game mechanics based on the affective state of the player.

An example of affective game mechanics can be found in Zen Warriors, a fighting game where, to perform their finishing move, the player has to switch from fast-paced aggression to a Zen-like state of inner calm.

Special controllers could allow the expression of emotions, e.g. controllers in the form of a toy. Haptic controllers could give emotion-related feedback. The character in the game could represent the emotions of the human player[2]. For more on Affective gaming, see also Hudlicka (2009).

Affective Mediation

There is a range of impairments and disorders that may involve affective communication deficits. Garay et al. (2006) state that “Within assistive technology, the Augmentative and Alternative Communication (AAC) research field is used to describe additional ways of helping people who find it hard to communicate by speech or writing. AAC helps them to communicate more easily.” (p. 55).

Making reference to Picard (1997), Garay et al (2006) further indicate that “Within affective computing, affective mediation uses a computer-based system as an intermediary among the communication of certain people, reflecting the mood the interlocutors may have. Affective mediation tries to minimize the filtering of affective information of communication devices; filtering results from the fact that communication devices are usually devoted to transmission of verbal information and miss nonverbal information.” (p. 58).

“Gestele [name of the system] was developed with the aim of assisting people with severe motor and oral communication disabilities. It is a mediation system that includes emotional expressivity hints through voice intonation and preconfigured small talk. The Gestele system is used for telephone communication […]. Gestele allows for different input options (directly or by scanning, depending on user characteristics) and makes holding telephone conversations involving affective characteristics possible for people with no speaking ability.” (Garay et al. 2006, p. 67).

Emotional Human-Machine Interaction

Human-computer interaction is a well-established field of research and practice. The interaction between humans and machines could benefit immensely if emotional states could be measured and displayed (Balomenos et al., 2004; Caridakis et al., 2006). Facial expression and hand gesture analysis, for example, could play a fundamental part in emotionally rich man-machine interaction (MMI) systems. Non-verbal cues can be used to estimate users’ emotional states, allowing systems to adapt their functionality to states such as anger or frustration. This adaptability would enhance usability and make interactions more attractive to users not accustomed to conventional interfaces. It might be implemented, for example, in automated telephone answering systems that react not only to the content of the spoken words but also to emotional cues: users displaying high levels of frustration or anger might be connected to human operators.
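The escalation idea just described can be sketched in a few lines. This is an illustrative sketch only: the `Caller` class, the frustration score and the threshold are hypothetical, and a real system would derive the score from acoustic and lexical analysis of the call rather than receive it directly.

```python
# Illustrative sketch: routing callers in an automated answering system
# based on an estimated frustration score. Names and threshold are
# hypothetical assumptions, not part of any described system.

from dataclasses import dataclass

@dataclass
class Caller:
    caller_id: str
    frustration: float  # 0.0 (calm) .. 1.0 (very frustrated), from a speech classifier

def route(caller: Caller, threshold: float = 0.7) -> str:
    """Escalate emotionally distressed callers to a human operator."""
    if caller.frustration >= threshold:
        return "human_operator"
    return "automated_menu"

print(route(Caller("c1", 0.9)))  # escalated to a human
print(route(Caller("c2", 0.2)))  # stays in the automated menu
```

The design choice here is deliberately simple: the affect classifier and the dialogue logic are decoupled, so the routing policy can be tuned without touching the recognition component.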

Affective Computing in Tele-home Health

Lisetti & LeRouge (2004) describe the design of an intelligent interface (MOUE) which is aimed at discerning emotional states from processing sensory modalities (or modes) input via various media and building (or encoding) a model of the user’s emotions. This interface is contextualised within the tele-home health care setting to provide the health care provider with an easy-to-use and useful assessment of the patient’s emotional state in order to facilitate patient care. They propose to investigate how emotional state assessments influence responses from health care professionals and how MOUE can be accepted into the health care environment.

Definition and Defining Features

Definition

The definition of affective computing arising from the analysis so far is that:

Affective computing refers to the use of ICT for the purposes of perceiving, interpreting or expressing emotions or other affective phenomena and the simulation or realisation of emotional cognition. This can be done using a range of different modes (e.g. speech, posture, facial expression) and for a range of purposes and applications.

Definitions of affective computing in the literature generally support the defining features identified below. A broad definition is offered by the MIT research group: “Affective Computing is computing that relates to, arises from, or deliberately influences emotion or other affective phenomena.” (MIT Media Lab). Building on this definition, Roddy Cowie (coordinator of the HUMAINE network of excellence) offers a definition that indicates why affective computing is perceived to be an important field: “Data suggest that less than 10% of human life is completely unemotional. The rest involves emotion of some sort [...], but probably less than a quarter consists of archetypal ‘fullblown emotions’ (brief episodes dominated by strong feelings). Most of it is coloured by ‘pervasive emotion’ (feelings, expressive behaviours, evaluations, and so on, that are inherent in most human activities). ‘Emotion-oriented computing’ is about technology that takes account of both types.” (Cowie 2005, p. 1). Similarly, Tao & Tan note: “Affective computing is trying to assign computers the human-like capabilities of observation, interpretation and generation of affect features. It is an important topic for the harmonious human-computer interaction, by increasing the quality of human-computer communication and improving the intelligence of the computer.” (Tao & Tan 2005, p. 981).

Defining Features

Emotion pervades human interaction, and sensitivity to emotions is fundamental to any communication. Affective computing can change the way humans interact with the world through:

Changes in human-computer interaction

Changes to technically mediated interaction between humans.

The application examples have shown that affective computing is complex because emotions themselves are complex. The examples support Cowie’s (2005) definition of affective computing as covering: “affect (feelings and physical changes associated with them); cognition and conation (shifts in perception, judgment, and selection of action); interaction (registering other people’s emotional status and conveying emotion to them); personality; culture; ethics; and much more”.

Defining features of affective computing are:

Perceiving emotions / affects

(i.e. sensing of physiological correlates of affect). The principle here is that humans send signals that allow others to infer the agent’s emotional state. Such emotional cues may be within the agent’s control, such as tone of voice, posture or facial expression. They may also be outside the agent’s control, as in the case of facial colour (blushing), heart rate or breathing. Humans are normally aware of such emotional cues and react easily to them. Most current technology does not react to such emotional cues, even though they are part of the illocutionary and perlocutionary function of a speech act.
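As a toy illustration of sensing one involuntary physiological correlate, the following sketch maps heart-rate elevation over a resting baseline onto an arousal score. The baseline and scaling constants are assumptions chosen for illustration, not validated parameters from any affect-sensing system.

```python
# Hypothetical sketch: deriving a crude arousal level from heart-rate
# samples, one of the involuntary emotional cues mentioned above.

def arousal_from_heart_rate(samples_bpm, baseline_bpm=65.0):
    """Map mean heart-rate elevation over a resting baseline to a 0..1 score."""
    mean_bpm = sum(samples_bpm) / len(samples_bpm)
    elevation = max(0.0, mean_bpm - baseline_bpm)
    return min(1.0, elevation / 40.0)  # saturate at +40 bpm over baseline

print(arousal_from_heart_rate([64, 66, 65]))     # at baseline: 0.0
print(arousal_from_heart_rate([100, 105, 110]))  # strongly elevated: 1.0
```

A real system would, of course, also account for context: a raised heart rate after climbing stairs signals exertion, not emotion.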

Expressive behaviour by computers / artificial agents

Whereas the previous point concerned the recognition of emotions by computers, this point refers to computers expressing themselves in ways that can be interpreted as displaying emotion. Affective expression can render interaction more natural and thereby easier for users. It can also be employed for persuasive purposes, as it increases the likelihood that computer-generated content will be accepted.

Emotional cognition

(agent’s empathy, understanding of emotional states). In humans, affects are closely related to, and arguably inseparable from, cognition. Emotional cognition is the next logical step after the recognition of emotions. It includes absorbing emotion-related information, modelling users’ emotional states, applying and maintaining a user affect model, and integrating this model with emotion recognition and expression. These activities offer the potential of personalising individual affect models and catering to individual preferences.
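The maintenance of a user affect model described above can be illustrated with a minimal sketch: a running, exponentially smoothed estimate of the valence a user displays after each system behaviour, much as the kitchen robot example learns which of its behaviours the user prefers. The class, behaviour names and smoothing factor are all hypothetical.

```python
# Minimal sketch of a per-user affect model: an exponential moving
# average of observed valence (-1 negative .. +1 positive) per behaviour.

class UserAffectModel:
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # smoothing factor for new observations
        self.valence = {}    # behaviour -> estimated user valence

    def update(self, behaviour, observed_valence):
        """Blend a new observation into the running estimate for a behaviour."""
        old = self.valence.get(behaviour, 0.0)
        self.valence[behaviour] = (1 - self.alpha) * old + self.alpha * observed_valence

    def preferred(self):
        """Return behaviours the user has, on balance, reacted positively to."""
        return [b for b, v in self.valence.items() if v > 0]

model = UserAffectModel()
model.update("cheery_greeting", -0.8)  # user reacted badly
model.update("subdued_tone", 0.6)      # user reacted well
print(model.preferred())               # ['subdued_tone']
```

The smoothing keeps the model stable against one-off reactions while still allowing preferences to drift over time.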

These three main functions of affective computing can refer to different modes of expression. Modes are ways in which emotion is signalled. Humans signal their emotional states in a number of ways.

Affective computing concentrates on the following modes for all of the different streams of activities:

Emotional speech processing

The measurement of acoustic features to assess emotions is used for the perception of emotions. Emotional expression can similarly make use of known features of speech, so that computer-generated speech can transport emotional messages (e.g. impatience, sadness).
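Two such acoustic features can be sketched as follows: short-time energy (related to loudness and arousal) and a crude pitch estimate via autocorrelation. This is a toy illustration on a synthetic tone; real emotional speech processing uses much richer feature sets and more robust estimators.

```python
# Illustrative sketch of two acoustic features often used in affect
# perception: short-time energy and a crude autocorrelation pitch estimate.

import math

def short_time_energy(frame):
    """Mean squared amplitude of a frame (relates to perceived loudness)."""
    return sum(x * x for x in frame) / len(frame)

def estimate_pitch_hz(frame, sample_rate, min_hz=80, max_hz=400):
    """Pick the autocorrelation lag with the strongest peak in the voice range."""
    best_lag, best_corr = None, float("-inf")
    for lag in range(sample_rate // max_hz, sample_rate // min_hz + 1):
        corr = sum(frame[i] * frame[i - lag] for i in range(lag, len(frame)))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return sample_rate / best_lag

# Synthetic 200 Hz tone sampled at 8 kHz stands in for a speech frame.
sr = 8000
frame = [math.sin(2 * math.pi * 200 * n / sr) for n in range(800)]
print(round(estimate_pitch_hz(frame, sr)))  # 200
print(short_time_energy(frame) > 0)         # True
```

Raised pitch and energy tend to accompany high-arousal states such as anger or joy, which is why even these simple features carry affective information.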

Facial expression

Human facial expressions can be measured to infer emotional states. Facial animation (e.g. of software agents on screens or on robots, famously KISMET) can be used to signal emotional states[3].

Body gesture and movement

Similar to facial expression, the entire body can be used to infer emotions or express them.

Multi-modal systems

These combine two or more of the above modes. For a stronger detection, modelling and display of emotions, multi-modality is desirable.
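A common way to combine modes is late fusion, sketched below as a weighted average of per-modality emotion distributions. The labels, scores and weights are illustrative assumptions; real multi-modal systems typically learn fusion weights from data rather than fixing them by hand.

```python
# Minimal late-fusion sketch: each modality yields a probability
# distribution over the same emotion labels; the fused estimate is
# their weighted average.

def fuse(modality_scores, weights):
    """Weighted average of per-modality emotion distributions."""
    labels = next(iter(modality_scores.values())).keys()
    total = sum(weights[m] for m in modality_scores)
    return {
        label: sum(weights[m] * scores[label] for m, scores in modality_scores.items()) / total
        for label in labels
    }

scores = {
    "speech":  {"anger": 0.7, "joy": 0.3},
    "face":    {"anger": 0.6, "joy": 0.4},
    "gesture": {"anger": 0.2, "joy": 0.8},
}
weights = {"speech": 0.5, "face": 0.3, "gesture": 0.2}
fused = fuse(scores, weights)
print(max(fused, key=fused.get))  # anger
```

The benefit of combining modes shows here: gesture alone would suggest joy, but the weighted evidence across all three modalities favours anger.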

A final way in which affective computing may have a profound influence on humans’ interaction with the world has to do with its theoretical underpinnings. Driven by research interests in affective computing, psychological theories of emotion are being developed further. This interest in modelling emotion has implications for the theoretical description of emotion and, as a consequence, for the way we perceive human emotion in general.

Assumptions

All examples and defining features of affective computing rest on the assumption that human emotions can be measured, recognised and classified. A further assumption is that there is a direct link between emotions and their external expression.

Time Line

Ongoing. There are few predictions on particular breakthroughs or applications. Affect measuring technologies are described as “becoming robust and reliable” (Robinson & el Kaliouby, 2009, p. 3442).

Relation to other Technologies

Linked to Artificial Intelligence (AI) (models a human property; close link to cognition).
Affects and emotions are part of the human cognitive process. While AI has traditionally concentrated on formal, expressive and propositional aspects of human intelligence, one can argue that a majority of human action is emotion-related. If AI is about perceiving, understanding, simulating and expressing human intelligent action, then the incorporation of emotion should be an integral part of AI.

Linked to Ambient Intelligence (AmI) (ubiquitous sensors, pervasive presence).
AmI could play an important part in perceiving as well as expressing human emotions. Ubiquitous sensor networks or other ambient interaction devices could provide rich data to draw inferences on humans’ emotions. An ambient intelligent environment could react well to these, for example by modifying environmental aspects (e.g. lighting, sound, connectivity) according to users’ emotional states.

Critical Issues

No critical issues were found in the database. Goldie et al. (2007) list the following ethical issues of affective computing:

Lack of conceptual clarity. Unlike other technologies, affective computing is based on concepts that are not well defined even in the research community.

Encroachment on humanity. Emotions are currently still confined to living beings, in particular humans. Can machines “have” emotions? What would that mean?

Reflexivity of ethics and affective computing. Affective computing elicits emotions, while at the same time ethical judgments about it themselves involve emotions.

Instrumental use of emotions, for example to persuade individuals to consume.

Ethics-related legal issues, such as liability.

References

Academic publications

Balomenos, T., Raouzaiou, A., Ioannou, S., Drosopoulos, A., Karpouzis, K., & Kollias, S. (2004). Emotion analysis in man-machine interaction systems. Machine Learning for Multimodal Interaction, Lecture Notes in Computer Science, 3361, 318–328

Caridakis, G., Malatesta, L., Kessous, L., Amir, N., Raouzaiou, A., & Karpouzis, K. (2006). Modeling naturalistic affective states via facial and vocal expressions recognition. In Proceedings of the 8th international conference on Multimodal interfaces. p. 154

Garay, N. et al. (2006). Assistive technology and affective mediation. Human Technology, 2(1), 55-83. Retrieved on 30 March 2010, from http://www.humantechnology.jyu.fi/articles/volume2/number1/2006/humantechnology-april-2006.pdf

Hudlicka, E. (2009). Affective game engines: motivation and requirements. In Proceedings of the 4th International Conference on Foundations of Digital Games (pp. 299-306). Orlando, Florida: ACM. doi:10.1145/1536513.1536565

Lisetti, C., & LeRouge, C. (2004). Affective computing in tele-home health. In Proceedings of the 37th Hawaii International Conference on System Sciences-2004 (pp. 1–8). Retrieved on 30 March 2010, from https://www.eurecom.fr/util/publidownload.fr.htm?id=1803

Picard, R. W. (1997). Affective Computing. Cambridge, Mass: MIT Press

Picard, R. W. (2007). Affective Computing. Entry in Scholarpedia. Retrieved on 30 March 2010, from http://www.scholarpedia.org/article/Affective_computing

Robinson, P., & el Kaliouby, R. (2009). Computation of emotions in man and machines. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3441-3447. doi:10.1098/rstb.2009.0198

Sykes, J., & Brown, S. (2003). Affective gaming: measuring emotion through the gamepad. In CHI ’03 extended abstracts on Human factors in computing systems (pp. 732-733). Ft. Lauderdale, Florida, USA: ACM. doi:10.1145/765891.765957

Tao, J., & Tan, T. (2005). Affective computing: A review. In Lecture Notes in Computer Science, Vol. 3784, pp. 981-995. Springer

Tao, J., Tan, T., & Picard, R. (2005). Affective Computing and Intelligent Interaction. Lecture Notes in Computer Science. Berlin: Springer. Retrieved on 30 March 2010, from http://dx.doi.org/10.1007/11573548

Associations / research centres

HUMAINE network http://emotion-research.net/:

Cowie, R. (2005). Emotion-Oriented Computing: State of the Art and Key Challenges. HUMAINE White Paper. Retrieved on 30 March 2010, from http://emotion-research.net/projects/humaine/aboutHUMAINE/HUMAINE%20white%20paper.pdf

Goldie, P., Cowie, P., Doring, S., Sneddon, I. & Carstens, B. (2007) White paper on ethical issues in emotion-oriented computing. Retrieved on 30 March 2010, from http://emotion-research.net/projects/humaine/deliverables/D10d%20final.pdf

MIT Affective Computing centre: http://affect.media.mit.edu/


[1] Picard’s 2007 Scholarpedia entry at http://www.scholarpedia.org/article/Affective_computing

[2] See the following sites on Affective gaming:

http://www.youtube.com/watch?v=XEuy7Zpo4IE (accessed 30.03.2010)

http://www.sics.se/gametheme/media/sentoy-fantasya.mpg (accessed 30.03.2010)

[3] http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html (accessed 30.03.2010)