Emotion Modeling - Soft Computing Lab.



Ch 11. Shallow and Inner Forms of Emotional Intelligence in Advisory Dialog Simulation

Soft Computing Lab

황금성


ABSTRACT

• A conversational agent

– Aspiring to be believable in an emotional domain

– Should be able to combine its rational and emotional intelligence

• We claim

– A cognitive model of emotion activation may contribute to this

– By providing knowledge to be employed in modeling emotion

• We show

– An XML markup language

– Ensuring independence between the agent’s body and mind

– Adapting the dialog to the user’s characteristics

• Example domain

– Eating disorder domain

1. Introduction

• There is certainly a link between

– The type of character

– The application domain to which it applies

– The category of users to which it is addressed

• Ex) Advisory dialogue (about eating habits)

– In the case of children,

 Cartoon suggesting a correct behavior in some domain

 Fun illustration of the effects of healthy/unhealthy eating

– In the case of adults

 The psychological problems which go with eating disorders require a different form of believability

 Give the impression of being expert and trustworthy, of understanding the interlocutors’ reasons, and of adapting to their needs


 We will focus our analysis on how emotions arising during the dialog might influence its dynamics


Research Of Believable Conversation

• The problem of simulating dialogs with humans has been studied by computational linguists [Wilks ’99]

• The Turing test already envisioned a computer dialog in the scope of a game – computer science [Turing ’50]

• The ability to show clear personality traits [Walker ’97, Loyall ’97, Castelfranchi ’98]

• To recognize doubt [Carberry ’02]

• To be polite [Ardissono ’99]

• To introduce a bit of humor [Stock ’02]

• To persuade with irrational arguments [Sillince ’91]

• Showing consistency between inner and outward aspects of behavior [Marsella ’03]

• A human-like character rather than a cartoon in medical applications [Marsella ’03]

• Flexible behavior [Prendinger ’03]


2. Advisory Dialogs

• Show an appropriate behavior

– By providing relevant information and suggestions

– By persuading users to follow them

• Dialogs may be emotional

– When the affective state of the user is influenced by information received

– When the expert reacts to the users’ answers by showing an empathic attitude

Asymmetry: the expert has to provide information while also asking questions

 Can be reduced by enabling users to drive information provision toward their needs

To behave believably

– Agent should show some form of emotional intelligence

 Recognizing and expressing emotions

 Regulating them

 Utilizing them to optimize the dialog


2. Advisory Dialogs (Cont.)

• We developed an emotion modeling method and tool

– How the agent reacts to the user’s moves

 Emotion triggering and decay

 Simulate a regulatory mechanism

– Our system

 Expresses emotions through face, gesture, and speech

 This is the most shallow form of emotional intelligence

• How the dialog is affected by the emotional state

– By some personality traits

– By their relationship [Pelachaud ’02]

– Research shows that emotions influence learning, decision making, and memory [Picard ’97]


2. Advisory Dialogs (Cont.)

• Problem issues in simulating affective dialogs

– Which personality factors may affect the dialog

– Which emotions the agent may feel

– How an emotion influences the course of dialog

– How an emotion affects the way that a particular goal is rendered

• 3 types of relationships

– Friendship

– Animosity

– Empathy

• Our advisory dialog simulator

– A domain-independent simulator of the agent’s mind

[Architecture figure: a Graphical Interface connects the Emotion Modeling module (Executable-Mind), the Dialog Manager (TRINDI), an automatic tagging module (MIDAS), and an Animation Engine (Festival + Greta, MS-Agent, or another character) that generates the agent’s body]


3. Two Example Application Domains

3.1 Travel Agency

• To prove domain independence of our system

• An extension of the kind of dialogs that were simulated in GoDiS, with the introduction of some small talk in the style of REA (a real-estate agent) [Bickmore ’99]

• Agent plays the role of a travel agent

– Provide suggestions about a holiday

• Small talk

– Is triggered when the agent is initially set in the empathic mode

– Wants to establish a friendly relationship with the user

By adding some comments about climate, traffic, tourism, etc.


GoDiS

• An experimental dialogue system built using the toolkit TrindiKit

• Explores and implements issue-based dialogue management

– Adapts Ginzburg’s KoS to a dialogue system (GoDiS) and implements it

• Extends theory to more flexible dialogue

1. Menu-based dialogue

– Action-oriented dialogue, VCR application

2. Multiple tasks, information sharing between tasks

3. Feedback and grounding

4. Accommodation, re-raising, clarification

5. Multi-linguality & multiple domains

6. Conditional responses (Ivana Kruijff-Korbayova)


What is TrindiKit

• A toolkit for

– Building and experimenting with dialogue move engines (DMEs) and systems,

– Based on the information state approach

• Not a dialogue system in itself

• What does TrindiKit provide

– High-level formalism and interpreter for implementing dialogue systems

 Promotes transparency, reusability, plug-and-play, etc.

 Allows implementation and comparison of dialogue theories

 Hides low-level software engineering issues

– GUI, WWW-demo

– Ready-made modules and resources

 Speech

 Interfaces to databases, devices, etc.

 Reasoning, planning

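The information-state idea can be illustrated with a small sketch. This is our own illustration, not TrindiKit code: the rule names and state fields below are invented for the example (Python 3.9+).

```python
# A minimal sketch of the information-state approach: a dialogue move engine
# applies selection and update rules to a shared information state.

info_state = {"agenda": ["ask(destination)", "ask(month)"], "qud": [], "shared": {}}

def select(state):
    """Selection rule: take the next action from the agenda and push its
    question onto the QUD (questions under discussion) stack."""
    action = state["agenda"].pop(0)
    state["qud"].insert(0, action.removeprefix("ask(").removesuffix(")"))
    return action

def integrate_answer(state, content):
    """Update rule: the user's answer resolves the topmost QUD question."""
    question = state["qud"].pop(0)
    state["shared"][question] = content

print(select(info_state))              # -> ask(destination)
integrate_answer(info_state, "Paris")
print(select(info_state))              # -> ask(month)
print(info_state["shared"])            # -> {'destination': 'Paris'}
```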

3. Two Example Application Domains

• Fig. 2. An example dialog in the Travel Agency domain


3. Two Example Application Domains

3.2 Advice About Eating Disorders

• Eating disorders are common among young people

• Chat lines or web sites

– Tend to be empathic, but are not tailored to the user

– Users cannot ask questions, make objections, request clarifications, or take the initiative to drive information provision

– Cannot recognize whether and how the emotional state of the user changes during interaction

– Emotions are static and rigid

• Aims of our agent

– Enable young users to interact with our agent

– Establish an atmosphere of empathic relationship between the two interlocutors

– Make the advice provided more effective

About Eating Disorders

Example Dialog Produced By Our System


G0: Hi, my name is Greta! I’m here to help you find a healthier way of feeding yourself. What’s your name?

U0: My name is Jana.

G1: What’s your problem, Jana?

U1: I’ve been eating too much for about one year and I realize that I’m not myself, that I’m not thinking quite right!

G2: Oh, I’m sorry. That’s a bad situation indeed. What is your age?

U2: I’m 17.

G3: Well, teenagers are notorious for being hungry frequently, because they need to eat more to support the major growth of their bodies.

U3: Oh no, it’s not a question of increased appetite or overeating from time to time: it’s a question of compulsive overeating, in my case. I feel out of control and powerless to stop!

G4: How awful! Have you already tried to do something to solve this problem?

U4: Yes, and I didn’t succeed. I now fear that stopping will not be possible for me! So, I don’t know why I’m here!

G5: Oh no! You have to be confident in yourself! If you want to get over this, I can help you.


About Eating Disorders

Example Dialog Produced By Our System (Cont.)

U5: What can I do?

G6: I have some ideas and experience on how to make a smooth, quick transition from overeating back to healthy living. If you wish, I can suggest what you might do.

U6: Yes, please, I want to make every possible effort to get out of this!

G7: Fine! This is the attitude I want to see from you! So, let’s start, first of all, from cleansing your home of all “binge foods”. Whatever your current food downfalls are, get rid of them!

U7: But I can’t stand to throw food away!

G8: Well, give it to someone, then! And if your relatives insist on keeping such foods in the house, ask them to keep it away from you for a while.

U8: That’s a good suggestion, I’ll try! What else?

G9: I need to know something more about the way you live, to go on. Do you take part in any kind of sport, for instance?

U9: Nothing, for a long time.

G10: So, plan to exercise! Choose something you enjoyed doing, even though you may not feel like doing it right now.

. . .


4. Emotion Modeling

• Algorithmic aspects of the model and prototype implementation

• In this modeling method, particular attention is paid to

– How emotions change in intensity with time

– How they are mixed up

– How each of them prevails

– In a given situation, according to the agent’s personality

– To the social context in which the dialog occurs

• We focused our attention on event-driven emotions in Ortony et al.’s theory

– Which includes positive and negative emotions

– Triggered by present or future desirable or undesirable events

We adopted Oatley and Johnson-Laird’s theory

– Positive and negative emotions are activated by the belief that some goal is achieved or threatened as a consequence of some event

4. Emotion Modeling (Cont.)


The cognitive model of emotions

– Represents the system of beliefs and goals behind emotion activation

– Gives the agent the ability to guess the reason why it feels a particular emotion, and to justify it if needed

– Shows how the agent’s system of goals is revised as a consequence of feeling an emotion, and how this revision influences the dialog dynamics

We apply a Dynamic Belief Network (DBN)

– As a goal monitoring method

– It employs observational data in the time interval (Ti, Ti+1)

– To generate a probabilistic model of the agent’s mind

– And to reason about the consequences of the observed event on the monitored goals

• We calculate the intensity of emotions as a function

– Of the uncertainty of the agent’s belief that its goal will be achieved

– Of the utility assigned to achieving this goal

• We combine these variables to measure the variation in the intensity of an emotion, as sketched below
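A hedged reading of this combination, for illustration only: the multiplicative form and the numbers below are our assumption, not the exact formula used in the prototype.

```python
def emotion_intensity(p_goal_before, p_goal_after, utility):
    """Intensity of an event-driven emotion as (change in belief) x utility.

    p_goal_before / p_goal_after: P(the monitored goal will be achieved)
    at times Ti and Ti+1, read off the Mind-BN; utility: the value the
    agent assigns to achieving that goal."""
    return abs(p_goal_after - p_goal_before) * utility

# e.g. hope rises when an achievement goal becomes more likely after an event
print(emotion_intensity(0.3, 0.7, utility=5.0))   # -> 2.0
```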

4. Emotion Modeling

A Portion Of The DBN: Triggering Of “Sorry-for”

[Figure: a Mind-BN at time Ti, an Event-BN over the interval (Ti, Ti+1), and a Mind-BN and Emotion-BN at time Ti+1. The user’s moves “Say U not(Desirable E)” and “Say U (Occ E U)” produce the beliefs BelG not(Desirable E), BelG(Occ E U) and BelG GoalU not(Occ E U); together with BelG(FriendOf G U), GoalG not(Occ E U) and BelG(UnsatisfFor G U E), these support BelG(Thr-GoodOf U) at Ti+1, which triggers the node Feels G(SorryFor U).]
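To make the figure concrete, here is a hand-rolled stand-in for the inference it depicts. The real prototype does this with belief networks through the HUGIN API; all probabilities and the simple multiplicative propagation below are invented.

```python
def p_sorry_for(p_friend_of_user, said_event_occurred, said_event_undesirable):
    """Crude propagation from the user's moves to Feels G(SorryFor U)."""
    # Event-BN evidence: the user reported that an undesirable event occurred.
    p_bel_occ = 0.9 if said_event_occurred else 0.1              # BelG(Occ E U)
    p_bel_undesirable = 0.9 if said_event_undesirable else 0.1   # BelG not(Desirable E)
    # Mind-BN at Ti+1: belief that the good of the user is threatened.
    p_thr_good_of_user = p_bel_occ * p_bel_undesirable           # BelG(Thr-GoodOf U)
    # Emotion-BN: sorry-for needs both a friendly relationship and that belief.
    return p_friend_of_user * p_thr_good_of_user

print(p_sorry_for(0.8, True, True))   # -> 0.648
```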


4. Emotion Modeling

An Example: Slow And Fast Decay Of Emotions
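The decay figure is not reproduced here. The sketch below only illustrates the idea of per-emotion decay rates; the exponential form and the half-life values are our own assumption.

```python
import math

def decayed_intensity(initial, elapsed, half_life):
    """Intensity after `elapsed` turns, halving every `half_life` turns."""
    return initial * math.exp(-math.log(2) * elapsed / half_life)

for t in range(5):
    fast = decayed_intensity(1.0, t, half_life=1.0)   # fast decay (e.g. surprise)
    slow = decayed_intensity(1.0, t, half_life=4.0)   # slow decay (e.g. hope)
    print(t, round(fast, 2), round(slow, 2))
```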

4. Emotion Modeling

4.1 Two Versions Of Our Emotion Simulator


• An emotion simulation tool, applied in two different versions

1. Mind-Testbed

– Create and test models

– Supported files

 the agent’s mind: Mind-BN

 the events that may occur in every considered domain: Event-BNs

 the relationships between goals in Mind-BN and emotions modeled: Emotion-BNs

 The personalities the agent may take

 The contexts in which simulation may occur

2. Executable-Mind

– When the calling program inputs a user’s move

 analyzes the acts

 activates emotions in the agent

 updates the emotion intensity table with the new entry

 sends it back to the calling program

– User’s move: combination of communicative acts
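A rough sketch of this call cycle; class, method, and field names are invented for illustration, not taken from the prototype’s actual interface.

```python
from typing import Dict, List

class ExecutableMind:
    def __init__(self, personality: str, context: str, domain: str):
        self.settings = (personality, context, domain)
        self.intensity_table: List[Dict[str, float]] = []   # one row per user move

    def process_move(self, communicative_acts: List[str]) -> Dict[str, float]:
        """Analyze the acts, activate emotions, update the table, return it."""
        activated = self._trigger(communicative_acts)
        self.intensity_table.append(activated)
        return activated                      # sent back to the calling program

    def _trigger(self, acts: List[str]) -> Dict[str, float]:
        # placeholder for the BN-based triggering step
        return {"sorry-for": 0.6} if "report-problem" in acts else {}

mind = ExecutableMind("empathic", "adoptive", "eating-disorders")
print(mind.process_move(["report-problem"]))   # -> {'sorry-for': 0.6}
```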


4. Emotion Modeling

The Graphical Interface Of Mind-testbed


4. Emotion Modeling

4.2 A Simulation Example

• Fig. 6. Emotions triggered in the example dialog, with four agent personalities

5. Regulation And Display Of Emotions

• The agent also has to regulate and display its emotions

• An emotion E may be hidden or displayed

– This “decision” may be influenced by

 Personality factors

 The interaction context

• The emotional behavior is modeled by means of rules that regulate the activation of display goals [Carolis ’02]

• For example,

– This rule activates the goal of hiding the fear felt at time T5, because the agent has an adoptive relationship with the user


– This rule activates the goal of showing, at move G7 of the example dialog, the hope felt at time T7


Emotion Display

• When an emotion has to be displayed

– An affective tag is automatically added to the agent’s move

Fine! This is the attitude I want to see from you! So, let’s start, first of all, from cleansing your home of all binge foods. Whatever your current food downfalls are, get rid of them!
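A hypothetical illustration of what the tagged move might look like; the tag and attribute names are invented for the example and are not the actual APML schema. The tagged move is then handed to the animation engine, which maps the affective meaning onto signals (next slide).

```python
# Wrap move G7 in an invented affective markup element for illustration.
def tag_move(move_text, performative, emotion):
    return (f'<performative type="{performative}" affect="{emotion}">'
            f'{move_text}</performative>')

print(tag_move("Fine! This is the attitude I want to see from you! ...",
               performative="encourage", emotion="hope"))
```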


Character’s Action Rules

• To support interfacing with cartoon-like characters (MS-Agents)

– We define the meaning-signal correspondence of a character

– In an XML Meaning-Signal translation file

 Rules of the form: meaning → signal

• Some examples with the MS-Agent Ozzar, each mapping a meaning to an animation or speech feature
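A sketch of such a meaning-signal table; the animation names and speech parameters below are invented, not Ozzar’s real animation set.

```python
MEANING_TO_SIGNAL = {
    "affect=hope":       {"animation": "Smile",     "pitch": "+10%"},
    "affect=sorry-for":  {"animation": "Sad",       "rate": "-10%"},
    "performative=warn": {"animation": "RaiseHand", "volume": "+5%"},
}

def signals_for(meaning):
    """Look up the animation/speech feature associated with a meaning."""
    return MEANING_TO_SIGNAL.get(meaning, {"animation": "RestPose"})

print(signals_for("affect=hope"))   # -> {'animation': 'Smile', 'pitch': '+10%'}
```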


6. Dialog Simulation

• Emotions have to be utilized

– To drive reasoning

– To regulate it

• Simulating affective dialogs requires

– Modeling how emotional states influence the course of dialog

 Priority of communicative goals

 Dialog plans

 Surface realization of communicative acts

• Dialog manager has to solve

– How should the agent behave?

 after discovering the emotional state of the user

 after feeling an emotional state of its own

– How should these emotions affect the dialog dynamics?


6. Dialog Simulation (Cont.)

• The idea is

– Agent has an initial list of goals

 That she aims to achieve during the dialog

 Each with its own priority

 Some of these goals are inactive

– The agent knows

 How every goal may be achieved in a given context

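An illustrative representation of such a goal list; the goal names echo the eating-disorders plan, while the priorities and the inactive "console-user" goal are our own examples.

```python
goals = [
    {"name": "situation-assessment",      "priority": 0.9, "active": True},
    {"name": "describe-eating-disorders", "priority": 0.7, "active": True},
    {"name": "suggest-solution",          "priority": 0.6, "active": True},
    {"name": "console-user",              "priority": 0.5, "active": False},  # woken by emotions
]

def next_goal(goals):
    """Pick the active goal with the highest priority."""
    active = [g for g in goals if g["active"]]
    return max(active, key=lambda g: g["priority"])

print(next_goal(goals)["name"])   # -> situation-assessment
```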

6.1 Agent And User Models

• An agent and a user model are stored

– With the interaction history

– In the information state of the dialog manager

• These models include two categories of factors

– Long term settings

 Agent’s personality, its role, and its relationship with the user

 Stable during the dialog

 Influence the initial priorities of goals, plans, initiative handling, and behavior

– Short-term settings

 Beliefs and emotional state of the agent

 Evolve during the dialog and influence goal priority change and plan evolution


6.1 Agent And User Models (Cont.)

• The agent’s goals g_i can be linked by one of the following relations

• Priority

– g_i < g_j: g_i is more important than g_j, so g_i will be achieved before g_j

• Hierarchy

– H(g_i, (g_i1, g_i2, …, g_in)): the complex goal g_i may be decomposed into simpler subgoals g_i1, g_i2, …, g_in, which contribute to achieving it

• Causal relation

– Cause(g_i, g_j): executing the plan achieving the source goal g_i is a precondition for executing the plan achieving the destination goal g_j
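One possible encoding of these three relations (an assumption, not the prototype’s data structures); the goal names are taken from the example plan and dialog, and their grouping here is illustrative.

```python
priority = [("situation-assessment", "suggest-solution")]          # g_i before g_j
hierarchy = {"suggest-solution": ["remove-binge-foods",             # H(g_i, (g_i1 ... g_in))
                                  "plan-to-exercise"]}
cause = [("situation-assessment", "describe-eating-disorders")]     # Cause(g_i, g_j)

def executable(goal, achieved):
    """A goal's plan may run only after the plans of its causal sources ran."""
    return all(src in achieved for src, dst in cause if dst == goal)

print(executable("describe-eating-disorders", achieved=set()))                     # False
print(executable("describe-eating-disorders", achieved={"situation-assessment"}))  # True
```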


6.2 Plans

• Our dialog manager does not include a planner

• Plans are represented as recipes that the agent can use to achieve its goals

• Our agent adopts the typical planning sequence of advisory systems

– Situation-assessment

– Describe-eating-disorders

– Suggest-solution

– Persuade-to-follow-suggestion

• The default plan is outlined below


The Discourse Plan In The Eating Disorders Domain
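Since the plan figure is not reproduced, here is a rough sketch of recipe-style plans; only the four top-level steps come from the previous slide, the sub-recipes are inferred from the example dialog.

```python
RECIPES = {
    "advise": ["situation-assessment", "describe-eating-disorders",
               "suggest-solution", "persuade-to-follow-suggestion"],
    "situation-assessment": ["ask-problem", "ask-age", "ask-habits"],
    "suggest-solution": ["suggest-removing-binge-foods", "suggest-exercise"],
}

def expand(goal):
    """Expand a goal into the leaves of its recipe (lookup only, no planning)."""
    steps = RECIPES.get(goal)
    return [goal] if steps is None else [leaf for s in steps for leaf in expand(s)]

print(expand("advise"))
```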

6.3 Reaction Rules


• In the case of urgent events

– Reduce the detail of information

– Upgrade the priority of the “most relevant” subgoals

– Downgrade the priority of details

• When feeling altruistic social emotions

– Display them by verbal and non-verbal means

– Give them the highest priority

– Downgrade the priority of other goals

– Hide egoistic social emotions

• When feeling positive emotions

– Express them with non-verbal means

– Leave the priority of other goals unvaried

• When feeling negative emotions

– Activate behavior control goals

– Avoid displaying any emotional reaction by activating repair goals
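A condensed, illustrative encoding of these rules; the four categories mirror the slide, but the concrete priority adjustments and goal names are assumptions.

```python
def react(emotion_kind, goals):
    if emotion_kind == "urgent-event":
        for g in goals:                                  # fewer details, keep essentials
            g["priority"] += 0.2 if g.get("most_relevant") else -0.2
    elif emotion_kind == "altruistic-social":            # e.g. sorry-for: display it first
        goals.insert(0, {"name": "display-empathy", "priority": 1.0, "active": True})
    elif emotion_kind == "positive":                     # non-verbal display, priorities unvaried
        pass
    elif emotion_kind == "negative":                     # hide it, activate control/repair goals
        goals.insert(0, {"name": "control-behavior", "priority": 1.0, "active": True})
    return goals
```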


Effects Of Reaction Rules

• Reaction rules may produce the following effects on the dynamics of plan activation

– Add details

– Reduce details

– Abandon a plan temporarily and activate a new subplan

– Abandon a subplan

– Substitute a generic subplan with a more specific and situation-adapted one

– Revise the sequencing of plans, to respond to the user’s request to “take the initiative”


7. Module Integration

• Graphical interface

– Interacts with the user and activates the modules

– Enables user to follow the dialog both in natural language and with the selected embodied agent

– Shows the agent’s emotional situation in graphical form

– Several agents have been linked to the system

 Various versions of Greta [Pelachaud ’02]

 Some MS-Agents

• Users may set the simulation conditions

– Agent’s personality

– Its relationship with the user

– Character’s body

– Application domain

• The dialog manager: TRINDIKIT

• Emotion triggering module: HUGIN API


The Graphical Interface Of Our Emotional Dialog Simulator

• Two characters are displayed in the right frame

– Greta [Pelachaud ’02] and Ozzar (an MS-Agent)


The Information Exchanges Between Modules

• Executable-Mind

– Receives information about the setting conditions

– Selects personality, context, and domain files

– Receives interpreted user moves

– Sends back a list of emotion intensities

• Trindi

– Receives an interpreted user input and a list of activated emotions

– Generates an agent’s move which is displayed in natural language in the left frame

• Midas

– Produces an APML file

• Animation engine

– Receives as input an APML file

– Using the meaning-signal translation file, animates the selected character
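An end-to-end sketch of one turn through these modules; every class below is a trivial stand-in, not the real component’s API, and the APML tag is invented.

```python
class Mind:                                   # Executable-Mind
    def process_move(self, acts): return {"sorry-for": 0.6}

class Trindi:                                 # dialog manager
    def next_move(self, acts, emotions): return "Oh, I’m sorry. What is your age?"

class Midas:                                  # APML generator (tag name invented)
    def tag(self, move, emotions): return f'<affective type="sorry-for">{move}</affective>'

class Engine:                                 # Greta / MS-Agent animation engine
    def play(self, apml): print("animating:", apml)

def one_turn(user_acts):
    emotions = Mind().process_move(user_acts)          # list of emotion intensities
    move = Trindi().next_move(user_acts, emotions)     # agent move in natural language
    Engine().play(Midas().tag(move, emotions))         # APML -> embodied behaviour
    return move

one_turn(["report-problem"])
```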

8. Conclusions And Future Work


• Our reaction rules are similar to social rules in Jack and Steve

• Our personality traits enable the representation of a larger variety of situations than McCrae’s trait model

• The assumption behind our emotion activation method is the same as in Emile, although the formalism is not the same

• The main limit of our prototype is in the asymmetry of the dialog modality

– Not natural

– Needs a refined speech recognizer that detects the emotional states of users

Open questions

– Should multiple emotions be summed up into overall emotional states?

– Should they be stored and treated separately ?

– Should an agent always behave like a human ?

– Should it be planned to dominate its emotions in a larger number of circumstances ?

– Are emotions always useful in producing appropriate decision making?
