Laughter: Interaction and Perception

Dr. Harry J Griffin - UCL Interaction Centre
These projects are related to the ILHAIRE project (www.ilhaire.eu).
Laughter is a ubiquitous and complex social signal that remains relatively under-investigated in
psychology and human-computer interaction (HCI). The ILHAIRE project aims to bridge the gap
between our knowledge of the psychology of laughter and its use by avatars, thus creating sociable
conversational agents that sound and look natural. At UCLIC we are investigating the body
movements associated with laughter, laughter contagion in pairs and larger groups, and how
people’s perception of the quality of laughter is influenced by the laughter’s parameters and the
context. Previous studies at UCLIC have identified body movements that are significant in
determining how laughter is perceived. We have also investigated how haptic devices
might serve as a physical channel for communicating laughter.
We offer one or two projects that will extend this work and break new ground in our
understanding of laughter. The details can be discussed, but possible ideas include:
• Modelling laughter contagion in conversation and human-computer interaction.
• Perception of emotional laughter quality in multimodal displays.
• Perception of laughter intensity in multimodal displays.
• Tolerance to multimodal desynchrony (lag between visual, auditory and haptic channels) in laughter perception.
• Effects of laughter on ongoing task-related movements.
Some knowledge of MATLAB would be beneficial, as would solid statistical skills. Good interpersonal
skills will also be important for creating an atmosphere conducive to laughter in data collection
sessions.
Suggested reading:
DiLorenzo, P. C., Zordan, V. B., & Sanders, B. L. (2008). Laughing out loud: Control for modeling anatomically inspired
laughter using audio. ACM Transactions on Graphics, 27(1).
Fukushima, S., Hashimoto, Y., Nozawa, T., & Kajimoto, H. (2010). Laugh enhancer using laugh track synchronized with the
user's laugh motion. In Proceedings of the 28th International Conference on Human Factors in Computing Systems (CHI
'10), pp. 3613-3618.
Kleinsmith, A., & Bianchi-Berthouze, N. (2007). Recognizing affective dimensions from body posture. In Proceedings of the
2nd International Conference on Affective Computing and Intelligent Interaction, LNCS 4738, Springer-Verlag, pp. 48-58,
Lisbon, Portugal.
Roether, C. L., Omlor, L., Christensen, A., & Giese, M. A. (2009). Critical features for the perception of emotion from gait.
Journal of Vision, 9(6):15, 1-32. http://journalofvision.org/9/6/15/, doi:10.1167/9.6.15
Ruch, W., & Ekman, P. (2001). The expressive pattern of laughter. In A. W. Kaszniak (Ed.), Emotion, qualia, and
consciousness (pp. 426-443). Tokyo: World Scientific.
Urbain, J., Niewiadomski, R., Bevacqua, E., Dutoit, T., Moinet, A., Pelachaud, C., Picart, B., Tilmanne, J., & Wagner, J.
(2010). AVLaughterCycle: Enabling a virtual agent to join in laughing with a conversational partner using a similarity-driven
audiovisual laughter animation. Journal of Multimodal User Interfaces, 4(1), 47-58.