This file is http://www.cs.bham.ac.uk/~axs/misc/affect.and.hci.html in Aaron Sloman's Web directory.

British HCI Group one-day meeting
in conjunction with University College London
AFFECTIVE COMPUTING: THE ROLE OF EMOTION IN HCI
Main Guest Speaker:
Rosalind Picard, MIT Media Lab.
Saturday 10th April 1999, the Lewis Lecture Theatre, the Windeyer Building
University College London

For latest details, full programme, travel information, etc. see the workshop web site.

SOME FURTHER DETAILS

Rosalind Picard's talk:

Title:
Toward Interfaces that Recognize and Respond to a User's Emotional Expression

Abstract:
Research in "affective computing" aims to give computers the skills of emotional intelligence, including the ability to recognize emotions, to express them appropriately, and to know how to interpret and respond to emotions expressed by other people, animals, and machines. In some cases machines (including agents and other computational devices) will also have internal mechanisms of emotion. Hence, additional skills of emotional intelligence will be needed to regulate those emotions and harness their use.

In this presentation I will highlight our recent efforts to give computers the ability to recognize and respond intelligently to a user's emotional expressions. I will show wearable computers with customized pattern recognition software, including eyeglasses that communicate expressions such as confusion or interest, and a wearable "StartleCam." The StartleCam records pictures based on the orientation response of the wearer, and hence is a first step toward a system that reduces information overload by recognizing and responding to the wearer's physiological signals.
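
By way of illustration only, here is a hypothetical Python sketch of this kind of trigger (not Picard's implementation): it assumes the physiological signal is a skin-conductance trace sampled at a fixed rate, and treats a sharp rise within a short window as a startle. The window size, threshold, and units are invented for the example.

    from collections import deque

    WINDOW = 10           # samples to inspect; size is illustrative
    RISE_THRESHOLD = 0.5  # rise (in microsiemens) counted as a startle; illustrative

    def detect_startles(conductance):
        """Return the indices at which the signal rises sharply."""
        recent = deque(maxlen=WINDOW)
        hits = []
        for i, sample in enumerate(conductance):
            recent.append(sample)
            if len(recent) == WINDOW and recent[-1] - recent[0] >= RISE_THRESHOLD:
                hits.append(i)   # a StartleCam-style device would save its image buffer here
                recent.clear()   # do not re-trigger on the same event
        return hits

    # Toy trace: a flat signal followed by one sharp rise
    signal = [2.0] * 20 + [2.0 + 0.15 * k for k in range(8)] + [3.05] * 12
    print(detect_startles(signal))   # -> [25]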

I will also describe new software that responds to frustrated users with a careful mix of empathy, sympathy, and other skills of emotional intelligence. This "emotionally savvy" software significantly improved users' willingness to interact with the system, as measured in a behavioral study involving 70 subjects, two control conditions, and a frustrating computer scenario. Among other things, we found that if a user gets particularly frustrated, it may even be beneficial for the computer to apologize. In short, it appears that computers can not only do a good job of frustrating people, but also actively help them reduce their frustration.
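
Purely as an illustration of the general idea, and not the system used in the study, such a response policy can be sketched in a few lines of Python; the frustration scale and the wording of the replies are invented.

    def respond(frustration):
        """Pick a reply for a self-reported frustration level on a 0-10 scale."""
        if frustration >= 8:
            # At high frustration an apology may help, as the study suggests.
            return "I'm sorry this has been so frustrating; that shouldn't have happened."
        if frustration >= 4:
            # Moderate frustration: acknowledge the feeling (empathy/sympathy).
            return "That sounds frustrating. Would you like to try again?"
        return "Thanks for the feedback. Let's continue."

    print(respond(9))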

Affective computing raises a number of potential concerns, but also offers potential benefits for fundamentally improving the nature of human-computer interaction. I will mention a few of these philosophical, social, and ethical issues, especially some that have already arisen in our experiments with "artificial empathy."

Rosalind W. Picard heads the Affective Computing Research Group at the MIT Media Laboratory, addressing computing that relates to, arises from, or deliberately influences emotions.

Aaron Sloman's talk:

Title:
Why can't a goldfish long for its mother? Architectural prerequisites for various types of emotions.

Abstract:
Our everyday attributions of emotions, moods, attitudes, desires, and other affective states implicitly presuppose that people are information processors. To long for something you need to know of its existence, its remoteness, and the possibility of being together again. Besides these semantic information states, longing also involves a control state: one who has a deep longing for X does not merely occasionally think it would be wonderful to be with X. In deep longing, thoughts are often uncontrollably drawn to X.

We need to understand the architectural underpinnings of the control of attention, so that we can see how control can be lost. Having control requires being able, to some extent, to monitor one's own thought processes, to evaluate them, and to redirect them. Only "to some extent", because both access and control are partial; we need to explain why. (In addition, self-evaluation can be misguided, e.g. after religious indoctrination!)

"Tertiary emotions" like deep longing are different from "primary" emotions (e.g. being startled or sexually aroused) and "secondary emotions" (e.g. being apprehensive or relieved) which, to some extent, we share with other animals. Can chimps, bonobos or human toddlers have tertiary emotions? To clarify the empirical questions and explain the phenomena we need a good model of the information processing architecture.

Conjecture: the various modules of the human mind (perceptual, motor, and more central) all have architectural layers that evolved at different times and support different kinds of functionality, including reactive, deliberative, and self-monitoring processes.

Different types of affect are related to the functioning of these different layers: primary emotions require only reactive layers; secondary emotions require deliberative layers (including "what if" reasoning mechanisms); and tertiary emotions (e.g. deep longing, humiliation, infatuation) involve additional self-evaluation and self-control mechanisms, which evolved late and may be rare among animals.
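
To make the conjecture concrete, here is a minimal hypothetical Python sketch (all class and method names invented): each class of emotion is tied to the layer whose mechanisms it requires, and the tertiary case models the partial loss of control of attention described above.

    class ReactiveLayer:
        """Fast automatic responses: sufficient for primary emotions."""
        def react(self, stimulus):
            return "startle" if stimulus == "loud noise" else None

    class DeliberativeLayer:
        """Adds 'what if' reasoning about possible futures: supports
        secondary emotions such as apprehension and relief."""
        def evaluate_prospect(self, predicted_outcome):
            if predicted_outcome == "possible harm":
                return "apprehension"
            if predicted_outcome == "harm avoided":
                return "relief"
            return None

    class MetaManagementLayer:
        """Adds (partial) monitoring and redirection of the agent's own
        thoughts: the layer whose loss of control shows up in tertiary
        emotions such as deep longing."""
        def attend(self, thoughts, redirect_to):
            # Try to redirect attention; in deep longing the attempt fails
            # because thoughts keep returning to the same object X.
            dominant = max(set(thoughts), key=thoughts.count)
            if thoughts.count(dominant) > len(thoughts) // 2:
                return f"attention repeatedly drawn back to {dominant}"
            return f"attention redirected to {redirect_to}"

    agent = MetaManagementLayer()
    print(agent.attend(["X", "work", "X", "X", "meal", "X"], "work"))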

An architecture-based framework can bring some order into the morass of studies of affect (e.g. the myriad definitions of "emotion"). It will also help us understand which kinds of emotions can arise in software agents that lack the reactive mechanisms required for controlling a physical body.

HCI designers need to understand these issues (a) if they want to model human affective processes, (b) if they wish to design systems that engage fruitfully with human affective processes, and (c) if they wish to produce teaching/training packages for would-be counsellors, psychotherapists, and psychologists.

These ideas are developed in the Cognition and Affect project papers, available at ftp://ftp.cs.bham.ac.uk/pub/groups/cog_affect/0-INDEX.html

Last updated: 18 Mar 1999
Aaron Sloman