Proceedings 19th Triennial Congress of the IEA, Melbourne 9-14 August 2015
Evaluating physical discomfort associated with prolonged, repetitive use of gesture:
a comparison of subjective rating protocols
Minseok Son, Haeseok Jung, Hyunkyung Yang, Woojin Park
Department of Industrial Engineering, Seoul National University, Seoul, South Korea
1. Objective
The main objective of this study was to empirically determine the number of gesture repetitions needed to
obtain subjective ratings data that reflect physical discomforts associated with prolonged, repetitive use of
gestures.
2. Background
As gesture recognition technologies become more mature and available, three-dimensional gestures
involving upper-extremity postures and movements are increasingly being utilized for human–machine
interactions (HMI) (Lai et al., 2012; Qian et al., 2013; Mauser and Burgert, 2014). Currently, applications of
such gesture-based interaction include machine/device control, data exploration in virtual reality, artistic
creation, musical instrument playing, games, etc. (Bleiweiss et al., 2010; Gerling et al., 2012; Soltani et al.,
2012; Alvarez-Santos et al., 2014; Bai et al., 2014; Liu et al., 2014; Lu et al., 2014; Posti et al., 2014). Some
of these applications involve prolonged, repetitive use of three-dimensional upper-extremity (3D UE)
gestures. Such use may give rise to physical discomfort, especially when the designed gestures are awkward
and stressful, and is also known to increase the risk of musculoskeletal disorders (Dul et al., 1994; Milner,
1986; Nag, 1991; Putz-Anderson and Galinsky, 1993; Kee and Lee, 2012; Hamberg-van Reenen et al.,
2008).
Several previous studies on gesture design have used the method of subjective discomfort rating to
quantify physical discomforts associated with the use of gestures (Nielsen et al., 2004; Stern et al., 2006;
Stern et al., 2008; Wachs et al., 2008; Wobbrock et al., 2009; Fikkert et al., 2010; Morris et al., 2010; Chae et
al., 2012; Choi et al., 2012; Piumsomboon et al., 2013); however, none of these studies precisely specified
the rating method, in particular the number of repetitive gesture executions to be performed prior to each
subjective rating and the pace of the repetitions. It is suspected that, without clear
instructions, the human evaluators in these studies performed each gesture once or a few times prior to the
subjective rating. As one execution or a few repetitions of a gesture may not be a significant physical burden,
a subjective rating score obtained in this manner may not properly represent physical discomforts resulting
from prolonged, repetitive use of gesture. A certain large number of gesture repetitions may be needed to
elicit such discomfort. Currently, it is not known how many repetitions are needed.
3. Method
Four subjective rating protocols differing in the number of gesture repetitions (1, 5, 10 and 20) performed
prior to rating were examined; these were termed the Discomfort Rating after 1, 5, 10 and 20 executions
(DRafter1, DRafter5, DRafter10 and DRafter20), respectively. Forty participants (ten for each protocol) evaluated physical
discomforts of twenty 3D UE gestures. To comparatively evaluate the four protocols in terms of their utility,
baselines representing the gestures’ discomforts resulting from prolonged, frequent use were needed. In this
study, a RULA-based measure and an empirical measure were devised and used as such baselines.
The RULA-based measure quantifies discomfort levels of designed gestures on the basis of Rapid
Upper Limb Assessment (RULA), which is a widely used ergonomics postural stress analysis tool
(McAtamney and Corlett, 1993), and thus, is termed the RULA-based gestural discomfort evaluation (RGDE)
measure. The authors computed the RGDE scores of the twenty gestures.
The empirical measure also provided such baseline measurements: to obtain a baseline
measurement for a given gesture, a human subject, starting from a rested condition, successively repeats
the gesture at a natural pace up until the level of perceived discomfort has reached a predetermined
significant discomfort level, say, 5 (“strong”) on the 10-point scale. This number of repetitions prior to
significant discomfort (abbreviated as NRPSD, or NRP5 when the predetermined discomfort level is 5 on
the 10-point scale) becomes the baseline measurement. The NRPSD measure indicates how many times a
gesture can be successively repeated before giving rise to significant discomfort, and allows comparing
different gestures’ discomfort levels in the context of prolonged, frequent use of gesture; a more stressful
gesture results in a smaller value of NRPSD than a less stressful one. Sixteen participants in their 20s and
30s (13 males and 3 females) repeated each of the twenty gestures and the NRP5s were determined.
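The NRP5 measurement procedure described above can be sketched as a simple counting loop. This is an illustrative sketch only, not code from the study; the function names (measure_nrpsd, ask_discomfort) and the simulated subject are hypothetical.

```python
def measure_nrpsd(ask_discomfort, threshold=5, max_reps=1000):
    """Return the number of repetitions prior to significant discomfort.

    ask_discomfort(rep) -> perceived discomfort (0-10 scale) reported
    by the subject after the rep-th execution of the gesture.
    The subject starts from a rested condition and repeats the gesture
    at a natural pace until discomfort reaches the threshold
    (5, "strong", on the 10-point scale for NRP5).
    """
    for rep in range(1, max_reps + 1):
        if ask_discomfort(rep) >= threshold:
            return rep
    return max_reps  # safety cap: threshold never reached

# Example with a simulated subject whose discomfort grows by 0.1 per
# repetition: the threshold of 5 is reached at the 50th repetition.
nrp5 = measure_nrpsd(lambda rep: 0.1 * rep)
```

A more stressful gesture drives the reported discomfort up faster, so it reaches the threshold in fewer repetitions and yields a smaller NRP5, consistent with the measure's intended interpretation.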
Then, correlation analyses were performed to examine the relationships among the four subjective rating
protocols, the RGDE scores and the NRP5s.
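A correlation analysis of this kind can be sketched as follows. The Pearson coefficient implementation and the toy data are illustrative assumptions, not the study's data or code.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data (hypothetical): per-gesture mean discomfort ratings and mean
# NRP5s. A negative r is expected, since a more stressful gesture gets a
# higher rating but can be repeated fewer times before discomfort.
ratings = [2, 3, 5, 6, 8]
nrp5s = [90, 70, 50, 40, 20]
r = pearson(ratings, nrp5s)
```

In practice a statistics package would also report the p-value for each coefficient, as done for Table 1 below.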
4. Results
The mean DRafter1, DRafter5, DRafter10 and DRafter20 scores, the RGDE scores and the mean NRP5s
were computed for each of the twenty gestures.
The intercorrelations among the measures are provided in Table 1. All correlation coefficients were
statistically significant (p < 0.05).
Table 1. Results of the correlation analyses

Measure            RGDE Score    Mean NRP5
Mean DRafter1         0.68         -0.85
Mean DRafter5         0.70         -0.61
Mean DRafter10        0.75         -0.74
Mean DRafter20        0.83         -0.84
5. Conclusion
This study empirically investigated the number of gesture executions required prior to each subjective
discomfort rating for accurate estimation of physical discomfort from prolonged, repetitive use of 3D UE
gestures. The study’s findings indicate that the DRafter20 measure, which showed the strongest overall
correlations with the two baselines, has the potential to be useful. The DRafter20 measure also has the
advantage that its data collection is not time-consuming.
The DRafter20 measure may be useful for a variety of prolonged, repetitive physical tasks besides
gestural interaction. A future investigation of its applicability to other physical tasks seems warranted.
References
Alvarez-Santos, V., Iglesias, R., Pardo, X. M., Regueiro, C. V., & Canedo-Rodríguez, A. 2014. “Gesture-based
interaction with voice feedback for a tour-guide robot.” Journal of Visual Communication and Image Representation,
25(2), 499-509.
Bai, H., Lee, G. A., Ramakrishnan, M., & Billinghurst, M. 2014. “3D gesture interaction for handheld augmented reality.”
In SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications (p. 7). ACM.
Bleiweiss, A., Eshar, D., Kutliroff, G., Lerner, A., Oshrat, Y., & Yanai, Y. 2010. “Enhanced interactive gaming by blending
full-body tracking and gesture animation.” In ACM SIGGRAPH ASIA 2010 Sketches, 34.
Chae J., Lee S., & Cho K. 2012. “A study of gesture vocabulary design for MP3 player depending on situations.” HCI
Conference, 653-656.
Choi, E., Kwon, S., Lee, D., Lee, H., & Chung, M. 2012. “Design of Hand Gestures for Smart Home Appliances based on
a User Centered Approach.” Journal of the Korean Institute of Industrial Engineers, 38(3), 182-190.
Dul, J., Douwes, M. and Smitt, P. 1994. “Ergonomic guidelines for the prevention of discomfort of static postures on
endurance data.” Ergonomics, 37(5), 807-815.
Fikkert, W., van der Vet, P., & Nijholt, A. 2010. “Gestures in an Intelligent User Interface.” In Multimedia Interaction and
Intelligent User Interfaces, 215-242.
Gerling, K., Livingston, I., Nacke, L., & Mandryk, R. 2012. “Full-body motion-based game interaction for older adults.” In
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1873-1882.
Hamberg-van Reenen, H. H., van der Beek, A. J., Blatter, B. M., van der Grinten, M. P., van Mechelen, W., & Bongers, P.
M. 2008. “Does musculoskeletal discomfort at work predict future musculoskeletal pain?” Ergonomics, 51(5), 637-648.
Kee, D., & Lee, I. 2012. “Relationships between subjective and objective measures in assessing postural stresses.”
Applied Ergonomics, 43(2), 277-282.
Lai, K., Konrad, J., & Ishwar, P. 2012. “A gesture-driven computer interface using Kinect.” In Image Analysis and
Interpretation (SSIAI), 2012 IEEE Southwest Symposium, 185-188.
Liu, Y., Dong, C., Zhang, M., Chen, X., & Li, Y. 2014. “A Novel Music Player Controlling Design Based on Gesture
Recognition.”
Lu, Z., Chen, X., Li, Q., Zhang, X., & Zhou, P. 2014. “A Hand Gesture Recognition Framework and Wearable
Gesture-Based Interaction Prototype for Mobile Devices.”
Mauser, S., & Burgert, O. 2014. “Touch-Free, Gesture-Based Control of Medical Devices and Software Based on the
Leap Motion Controller.” Medicine Meets Virtual Reality 21: NextMed/MMVR21, 196, 265.
McAtamney, L., Corlett, E.N. 1993. “RULA: a survey method for the investigation of work-related upper limb disorders.”
Appl. Ergon. 24(2), 91-99.
Milner, N. 1986. “Modelling fatigue and recovery in static postural exercise.” Clinical Biomechanics, 1(1), 29.
Morris, M. R., Wobbrock, J. O., & Wilson, A. D. 2010. “Understanding users' preferences for surface gestures.” In
Proceedings of graphics interface 2010, 261-268.
Nag, P. K. 1991. “Endurance limits in different modes of load holding.” Applied ergonomics, 22(3), 185-188.
Nielsen, M., Störring, M., Moeslund, T. B., & Granum, E. 2004. “A procedure for developing intuitive and ergonomic
gesture interfaces for HCI.” In Gesture-Based Communication in Human-Computer Interaction, 409-420.
Piumsomboon, T., Clark, A., Billinghurst, M., & Cockburn, A. 2013. “User-defined gestures for augmented reality.” In
Human-Computer Interaction–INTERACT 2013, 282-299.
Posti, M., Ventä-Olkkonen, L., Colley, A., Koskenranta, O., & Häkkilä, J. 2014. “Exploring Gesture Based Interaction with
a Layered Stereoscopic 3D Interface.” In Proceedings of the International Symposium on Pervasive Displays (p. 200).
ACM.
Putz-Anderson, V., & Galinsky, T. L. 1993. “Psychophysically determined work durations for limiting shoulder girdle
fatigue from elevated manual work.” International Journal of Industrial Ergonomics, 11(1), 19-28.
Qian, K., Niu, J., & Yang, H. 2013. “Developing a Gesture Based Remote Human-Robot Interaction System Using
Kinect.” International Journal of Smart Home, 7(4).
Soltani, F., Eskandari, F., & Golestan, S. 2012. “Developing a gesture-based game for deaf/mute people using Microsoft
kinect.” In Complex, Intelligent and Software Intensive Systems (CISIS), 2012 Sixth International Conference, 491-495.
Stern, H. I., Wachs, J. P., & Edan, Y. 2006. “Human factors for design of hand gesture Human-Machine Interaction.” In
Systems, Man and Cybernetics, 2006. SMC'06. IEEE International Conference, Vol. 5, 4052-4056.
Stern, H. I., Wachs, J. P., & Edan, Y. 2008. “Designing hand gesture vocabularies for natural interaction by combining
psycho-physiological and recognition factors.” International Journal of Semantic Computing, 2(01), 137-160.
Wachs, J., Stern, H., & Edan, Y. 2008. “A Holistic Framework for Hand Gestures Design.” In Proceedings of 2nd Annual
Visual and Iconic Language Conference, 24-34.
Wobbrock, J. O., Morris, M. R., & Wilson, A. D. 2009. “User-defined gestures for surface computing.” In Proceedings of
the SIGCHI Conference on Human Factors in Computing Systems, 1083-1092.