Making digital agents persuasive: the ethics of using similarity.

Jilles Smids
Eindhoven University of Technology
Postbus 513, 5699 MB Eindhoven
Den Dolech 2, Eindhoven
IPO 1.10
T +31 40 2475174
J.Smids@tue.nl
Digital agents can be made persuasive by tapping into the sources of social influence as it manifests itself between humans. We can call such agents persuasive technology (PT): technology that is intentionally designed to change behavior, attitude, or both, without using coercion or deception, since persuasion implies voluntary change (Fogg, 2003, pp. 1, 15–16). An important source of social influence is similarity. For example, humans automatically mimic each other's posture, speech patterns, and facial expressions, and are usually unaware of doing so. Mimicry is typically an automatic and unconscious process, which leads to higher liking and trust and facilitates human interaction (Maddux et al., 2008). This automatic and non-reflective character of the underlying psychological processes is typical of other types of similarity as well.
Therefore, making digital agents similar to human users in one way or another in order to influence them raises the worry of manipulation (understood as the intentional distorting or bypassing of a person's reflection and decision-making in order to maneuver that person). In this paper I investigate how we should evaluate the use of similarity in digital agents from an ethical point of view, and whether and how designers can make responsible use of similarity. This ethical reflection is urgent, since research in social psychology and human-computer interaction reveals a growing potential for enhancing the persuasiveness of agents, and possibilities for application in, for example, commerce, education, and health care are abundant. People place more trust in a character whose face is digitally morphed to resemble their own (DeBruine, 2002), regard digital agents that mimic their head movements as more persuasive (Bailenson and Yee, 2005), and perceive a chat robot that mimics their response time as more intelligent (Kaptein et al., 2011).
In the paper, I develop guidelines for the responsible use of similarity, based on the following main theoretical considerations. First, manipulation is prima facie wrong, a view supported by many of the main ethical theories. Second, from social psychology I take the insight that social influence through similarity is often functional: it is beneficial and serves a purpose for the interacting humans. Third, an agent can be interpreted as establishing a communication relation with its human users; it should therefore meet the legitimate expectations elaborated in Habermas' ethics of communication (Habermas, 1992).
The following guidelines should, together, make sure that designers’ use of similarity in
digital agents is responsible.
Design guideline 1: If designers incorporate similarity in PT with the aim to design for usability, then it
should be possible to justify the way they proceed solely on the basis of usability considerations.
Design guideline 2: The degree of similarity incorporated in PT, whether of a single type or of multiple types, should not (significantly) exceed the degree of similarity between average humans.
Design guideline 3: If designers incorporate similarity in PT, they should also include the functionality
which this type of similarity has in human-human interaction into their design.
References
1. Bailenson, Jeremy N., and Nick Yee. 2005. "Digital Chameleons: Automatic Assimilation of Nonverbal Gestures in Immersive Virtual Environments". Psychological Science 16(10): 814–819.
2. DeBruine, Lisa M. 2002. "Facial resemblance enhances trust". Proceedings of the Royal Society of London. Series B: Biological Sciences 269(1498): 1307–1312.
3. Fogg, B. J. 2003. Persuasive Technology: Using Computers to Change What We Think and Do. Amsterdam; Boston: Morgan Kaufmann Publishers.
4. Habermas, J. 1992. The Theory of Communicative Action, Vol. 1: Reason and the Rationalisation of Society. Boston: Beacon Press.
5. Kaptein, M. et al. 2011. "Two acts of social intelligence: the effects of mimicry and social praise on the evaluation of an artificial agent". AI & Society 26: 261–273.
6. Maddux, W. W. et al. 2008. "Chameleons bake bigger pies and take bigger pieces". Journal of Experimental Social Psychology 44: 461–468.