Recognition and Expression of Emotions by a Symbiotic Android Head

Daniele Mazzei, Abolfazl Zaraki, Nicole Lazzeri and Danilo De Rossi
Presentation by: Kaixi Wu
“Social Robots”
• Humans are fascinated by robots that
can understand and express emotions
• Social Robots should be believable and
acceptable, so as not to betray the
expectations of the humans interacting
with them
• The FACE robot conveys emotion
through facial expressions
The FACE Humanoid Project
This paper examines the research conducted on the FACE humanoid
robot. The robot uses sensors to perceive the outside world and follow
conversations, and a range of expressions and behaviours to react to
social cues.
The FACE Robot
• FACE: Facial Automaton for Conveying Emotions
• Built by David Hanson
• Skull is 3-D printed in ABS (Acrylonitrile Butadiene Styrene, a plastic
material)
• Skin is made of Frubber (a skin-like silicone)
• Motor anchor points placed under skin, connected to metal cables
and servo motors to control motion
32 Total Servo Motors
Servo motors are actuators that
can precisely control rotational or
linear motion
• Face: 25 servo motors
• Neck: 4 servo motors
• Eyes: 3 servo motors
Facial Action Coding System
• Developed by psychologists Paul Ekman and Wallace Friesen in 1978
• Action Units (AUs) represent the contraction or relaxation of one or
more facial muscles
• Distinct AU configurations encode the six universally accepted emotions
(happiness, anger, sadness, disgust, fear, and surprise), as sketched below
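A minimal Python sketch of this mapping, using commonly cited EMFACS-style AU combinations; the exact AU sets used for FACE may differ:

# Commonly cited FACS Action Unit combinations for the six basic
# emotions (EMFACS-style; the exact AU sets used by FACE may differ).
BASIC_EMOTION_AUS = {
    "happiness": [6, 12],                  # cheek raiser, lip corner puller
    "sadness":   [1, 4, 15],               # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  [1, 2, 5, 26],            # brow raisers, upper lid raiser, jaw drop
    "fear":      [1, 2, 4, 5, 7, 20, 26],  # brow/lid action plus lip stretcher and jaw drop
    "anger":     [4, 5, 7, 23],            # brow lowerer, lid tighteners, lip tightener
    "disgust":   [9, 15, 16],              # nose wrinkler, lip corner and lower lip depressors
}

def aus_for(emotion: str) -> list[int]:
    """Return the Action Units that encode a basic emotion."""
    return BASIC_EMOTION_AUS[emotion]

print(aus_for("happiness"))  # -> [6, 12]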
AU Configuration for FACE
Hybrid Engine for Facial Expressions Synthesis
• HEFES is a service of FACE that takes the basic emotions as input and
interpolates them on the “emotional plane”
• Allows for the generation of more realistic and less stereotypical facial
expressions (one plausible interpolation scheme is sketched below)
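The paper does not spell out the interpolation formula here, so the following is a minimal sketch of one plausible scheme: inverse-distance weighting of servo configurations positioned on a valence-arousal plane. The coordinates and 3-element servo vectors are toy assumptions (FACE drives 32 servos).

import numpy as np

# Each basic expression: a point on the valence-arousal plane plus a
# servo-position vector (toy values; the coordinates are assumptions).
EXPRESSIONS = {
    "happiness": ((0.8, 0.5), np.array([0.9, 0.7, 0.1])),
    "sadness":   ((-0.7, -0.4), np.array([0.1, 0.2, 0.8])),
    "anger":     ((-0.6, 0.7), np.array([0.3, 0.9, 0.6])),
}

def blend_expression(valence: float, arousal: float) -> np.ndarray:
    """Inverse-distance-weighted blend of the basic expressions at a
    requested point on the emotional plane."""
    target = np.array([valence, arousal])
    weights, vectors = [], []
    for point, servos in EXPRESSIONS.values():
        d = np.linalg.norm(target - np.array(point))
        if d < 1e-9:             # exactly on a basic expression
            return servos
        weights.append(1.0 / d)
        vectors.append(servos)
    w = np.array(weights) / sum(weights)
    return (w[:, None] * np.array(vectors)).sum(axis=0)

# A mildly positive, calm blend of the three basic expressions:
print(blend_expression(0.3, 0.0))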
Gaze Controlling System
• Human gaze is a strong non-verbal cue to a person’s mental and
emotional states
• Attention module selects most prominent target
• Gaze-control system continuously adjusts the head and eye
movements
• To mimic human motions, the eyes always start to move slightly before the
head does (see the sketch below)
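A minimal sketch of this “eyes lead, head follows” pattern; the delay and speed parameters are assumptions, not the paper’s controller:

# Eyes saccade toward the target immediately, the head starts after a
# short delay and moves more slowly, and the eyes counter-rotate so
# that gaze = head + eye stays on target (parameters are assumptions).
def gaze_step(target, head, eye, dt, t, head_delay=0.1, head_speed=60.0):
    gaze_error = target - (head + eye)
    eye += gaze_error                      # saccade: eyes jump first
    if t >= head_delay:                    # head starts after the delay
        step = max(-head_speed * dt, min(head_speed * dt, target - head))
        head += step
        eye -= step                        # counter-rotate to hold gaze
    return head, eye

head, eye, t = 0.0, 0.0, 0.0
for _ in range(20):                        # 1 s of simulated motion
    head, eye = gaze_step(30.0, head, eye, dt=0.05, t=t)
    t += 0.05
print(round(head, 3), round(eye, 3))  # -> 30.0 0.0 (head carries the shift)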
FACE Cognitive Architecture
• Based on Antonio Damasio’s Theory:
• Inputs from sensors are converted to knowledge structures
• Knowledge structures allow reasoning
• Reasoning processes result in internal or external actions and new generated
knowledge
• Actions and new knowledge drive emotions and behaviors
• CLIPS (C Language Integrated Production System): a rule-based system
that allows for quick reactions (a Python analogue of this cycle is sketched below)
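A Python analogue of the perceive, reason, act cycle. The real system expresses this as CLIPS rules; the fact and action names below are illustrative assumptions:

# Illustrative analogue of the Damasio-inspired cycle: sensor inputs
# become knowledge structures (facts), rules reason over the facts,
# and firing a rule yields actions plus new generated knowledge.
facts = set()

def perceive(sensor_events):
    """Convert raw sensor events into knowledge structures."""
    facts.update(sensor_events)

def reason():
    """Match rules against the facts; return actions, assert new knowledge."""
    actions = []
    if "person-in-view" in facts:
        actions.append("set-expression-neutral")
        facts.add("tracking-subject")       # new generated knowledge
    else:
        actions.append("look-at-salient-point")
    return actions

perceive({"person-in-view"})
print(reason())  # -> ['set-expression-neutral']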
Experiment #1:
FACE Expressive Believability
Key Questions
• “Is a humanoid robot able to convey expressions as well as humans?”
• “Are there differences between facial expressions observed as 2D
photos, 3D models or performed by a physical humanoid robot?”
• “Studies investigating the recognition of different facial expressions
state that positive emotions are recognized faster and may be visually
simpler than negative facial expressions. Is this theory still valid with a
humanoid robot?”
• “Does the interpretation of humanoid robot expressions induce
different psychophysiological states in comparison with 2D photos and
3D models?”
The Experiment
• 15 subjects aged 19-31 years were recruited for the experiment.
• Stepwise Protocol: Subjects exposed to gradually more realistic
stimuli (2D pictures, 3D models, and the actual robot)
• 2D photos and 3D models were created for FACE robot and a human
face, for each of the 6 expressions
• Subjects were asked to recognize the emotional states
• The subjects’ psychophysiological signals were also analyzed for signs
of nervous system activity indicating challenging or strenuous tasks
The Results
• Subjects tended to recognize the
physical robot’s expressions better
than the 2D photos and 3D models
of either the robot or the human
• Interpreting the FACE robot’s facial
expressions did not alter the
subjects’ psychophysiological states
differently from interpreting human
2D photos and 3D models
Experiment #2:
FACE Gaze and Tracking
The Experiment
• 11 subjects, aged 22-35, were recruited for the
experiment
• Showed participants videos of two-person social
interactions
• Tracked eye movements with a professional eye tracker
• Obtained gaze points with corresponding times
• Same videos were shown to FACE robot
• Compared human gaze behavior with FACE gaze
behavior
The Results
The FACE gaze control system was able to replicate the human gaze behavior 89% of the time.
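One plausible way such agreement could be scored (the paper’s exact metric may differ): the fraction of time-aligned frames in which the robot attended to the same target as the human observers.

# Fraction of time-aligned frames where the robot's gaze target
# matches the human's (an assumed scoring scheme, for illustration).
def gaze_agreement(human_targets, robot_targets):
    matches = sum(h == r for h, r in zip(human_targets, robot_targets))
    return matches / len(human_targets)

human = ["speaker-A", "speaker-A", "speaker-B", "speaker-B", "speaker-A"]
robot = ["speaker-A", "speaker-A", "speaker-B", "speaker-A", "speaker-A"]
print(f"{gaze_agreement(human, robot):.0%}")  # -> 80%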
Experiment #3:
FACE Behavioral Control
The Experiment
Preliminary tests were conducted
in various social scenes to observe
the robot’s reactions to social cues
Behavioral Model
• “If no subjects are present in the robot’s field of view, the robot is
annoyed and looks at the most salient point (in terms of colours and
shapes) of the perceived scene”
• “If someone is present in the scene, the robot’s facial expression
becomes neutral and the robot starts to follow the most important
subject identified according to a ranking of the following social cues:
hand gesture, distance, speaking probability, facial expressions”
• “If someone invades the robot’s intimate space, it changes its facial
expression to dislike while keeping the attention on the subject” (the three
rules are sketched in code below)
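A minimal Python sketch of the three rules above; the ranking weights, field names, and intimate-space threshold are assumptions for illustration, not the paper’s values:

# Select an expression and a gaze target from the perceived scene,
# following the three behavioural rules (all constants are assumed).
def select_behavior(scene):
    if not scene["subjects"]:
        return "annoyed", "most-salient-point"
    def rank(s):  # weight the listed social cues (weights are assumed)
        return (2.0 * s["hand_gesture"] + 1.5 * s["speaking_prob"]
                + 1.0 * s["expressiveness"] - 0.5 * s["distance_m"])
    subject = max(scene["subjects"], key=rank)
    if subject["distance_m"] < 0.5:        # intimate-space threshold (assumed)
        return "dislike", subject["id"]
    return "neutral", subject["id"]

scene = {"subjects": [
    {"id": "p1", "hand_gesture": 1, "speaking_prob": 0.8,
     "expressiveness": 0.4, "distance_m": 1.2},
    {"id": "p2", "hand_gesture": 0, "speaking_prob": 0.1,
     "expressiveness": 0.9, "distance_m": 2.0},
]}
print(select_behavior(scene))  # -> ('neutral', 'p1')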
Demo Video
https://www.youtube.com/watch?v=-6FVZsaDLVg
Conclusion
& Future Developments
Conclusion: The Development Process
• Sensory apparatus to perceive and interpret social world and social
cues
• Robot facial expression generation system to convey emotion through
facial expressions
• Human-inspired gaze model to track subjects in a scene
• Hybrid control center that allows real-time control of robot behavior
and provides a user-friendly interface
Conclusion: Remaining Challenges
• Limitations of the input sensors used
• Environmental noise
• Data communication and synchronicity
Conclusion: Future Developments
• Better perception through improved sensors:
• Wide FOV (Field of View) Vision Sensors
• Touch sensors
• A social behavioral interpretation system for better human-robot
interaction
• User assessment of robot behavior to evaluate how natural the robot
appears to humans
• Integrating the FACE head with a robot body to show emotion
through gestures
Any questions?