Response to HHS-OPHS-2011-0005 by Members
of the Field of Human-Computer Interaction
October 25, 2011
Summary
We the undersigned, academics and practitioners in the field of Human-Computer Interaction (HCI),
strongly support the goals of modernizing and revising current regulations for
protecting human subjects who participate in research, as expressed in HHS-OPHS-2011-0005, “Human
Subjects Research Protections: Enhancing Protections for Research Subjects and Reducing Burden,
Delay, and Ambiguity for Investigators.” We applaud the efforts to address the diversity of research
practices and disciplines subject to IRB rules.
We agree with the stated concerns that, for the kinds of activities that are central to our practices, the
current regulations are “not adequately calibrating the review process to the risk of research”; that
“some IRBs spend considerable time reviewing minimal risk research, and that some IRBs have a
tendency to overestimate the magnitude and probability of reasonably foreseeable risks”; and that “[t]he
nature of the possible risks to subjects is often significantly different in many social and behavioral
research studies as compared to biomedical research, and … the difference is not adequately reflected in
the current rules.” While we support the reforms proposed in the document, our comments below
focus on the questions relevant to these points as they most directly impact our work.
More precisely, we strongly support the proposed Excused category of research, and urge the addition
of certain common activities used in human-computer interaction research to the list of activities that
would qualify for this category, to be included explicitly alongside “educational tests, surveys,
interviews, and similar procedures.” We also support the proposed revisions to regulations regarding
expedited review and informed consent. Finally, we comment on the proposed changes regarding data
security and information risk.
Background
HCI is a field of study that involves designing and evaluating computer systems, services, applications
and interfaces. User interface design is critical to the successful use of technology, and a bedrock goal of
HCI is to improve the interactions between computers and people, resulting in more successful and
satisfactory experiences. Technology must be designed to take into account how people act and think,
to reduce errors, and to increase efficiency and enjoyment of work. Interface design is an iterative
process that requires working closely with prospective users, both to determine what they need and to
determine how successfully a design meets those needs. Therefore, research in the field of HCI
requires interaction with human participants to evaluate and improve the technology and the practices
under study.
HCI research draws on software engineering practice, and many HCI studies incorporate user needs
assessment, requirements analysis, design, prototyping, and usability testing.
HCI is an interdisciplinary field that integrates theories and methodologies from computer science,
information science, design, and social and cognitive sciences. It is closely related to the field of Human
Factors. Two professional societies that are prominent in this field are ACM SIGCHI (www.sigchi.org) and
the Usability Professionals Association (www.upassoc.org). Although many of the undersigned are
members of these organizations, we are signing as individuals and do not represent these organizations.
The Excused Category of Research
Question 14: This question asks: Are these expansions in the types of studies that would qualify for this
Excused category appropriate? We strongly support the proposed Excused category. We believe these
proposed changes are entirely appropriate and will improve the timeliness of research and reduce
paperwork without increasing risk to potential research participants. It is unlikely that the proposed
changes would reduce protections for research participants or discourage individuals from participation.
In particular:

• We support the suggested practice of having researchers who plan to do work that qualifies for
the Excused category submit a one-page notice to their IRB. More specifically, we
support the proposed process for filing an Excused study. We agree that, “the proposed auditing
requirement would provide institutions with information needed to assess their compliance with
the new Excused category without unnecessarily subjecting all such research to either
prospective review, or even routine review sometime after the study is begun.” This system
appropriately puts more responsibility for identifying riskier studies on the researcher.
Researchers should be expected to know and abide by their profession’s standards of ethical
behavior, and understand when a particular research study has exceeded the limits of the
Excused category. Additional training, code of conduct statements from professional societies,
and ongoing review of emerging methods by a federal panel (see below) would support these
standards.

• We support the proposed consent rules for Excused research and encourage that “user needs
assessment, requirements analysis, design, prototyping, and usability studies” be added to the
list of practices subsumed in this statement: “… oral consent without written documentation
would continue to be acceptable for many research studies involving educational tests, surveys,
focus groups, interviews, and similar procedures.” These additions are necessary because some
institutional review boards may be unclear as to whether interacting with a computer or other
common electronic devices should be considered a separate category of methods.
Question 15: This question asks: Beyond the expansions under consideration, are there other types of
research studies that should qualify for the Excused category? We believe that human-computer
interaction studies involving user needs assessment, requirements analysis, design, prototyping, and
usability studies should be added to the Excused category. These are well-established practices that
usually involve virtually no risk to participants.
Question 17: This question asks which specific social and behavioral methodologies should be included
in the Excused category. It also asks under what circumstances deception should be allowed. We
propose that the following methods be included in the Excused category:

• Asking participants to use computer systems or other technologies that are intended to be used
by end-users. This includes existing or experimental systems, prototypes, mockups, and
sketches. Assigning participants tasks when using those technologies, or asking them to develop
their own tasks. Asking users to analyze, critique or otherwise respond to these systems.

• Brainstorming and other creative techniques for eliciting or refining user needs, usage scenarios
and system requirements; identifying trade-offs and benefits; exploring design alternatives; and
exploring alternative solutions to design problems, etc.

• Monitoring or recording in conjunction with the above methods using a variety of capture
technologies. These should include, but not be limited to, recording: audio, video, screen
capture, interactions with the computer equipment such as mouse movements, touch screen
taps, and tracking of eye movement using eye trackers. (Note that current eye tracking
technology requires no physical connection with the participant.) This may be done locally or
remotely. This assumes that monitoring is done with participants’ knowledge and consent in the
context of a study.

• Observation, measurement and analysis of the responses, behaviors and views of people for the
above methods. This assumes that observation is done with participants’ knowledge, or that
standards for collecting and protecting data about public behavior are being followed.
We feel that the above methods are appropriate for the Excused category because it is common for
people to use (and critique) software programs, web sites and other technology in everyday life. Like
surveys, focus groups, etc., these are “methodologies which are very familiar to people in everyday life
and in which verbal or similar responses would be the research data being collected.” These methods
involve benign interventions that pose virtually no risk to participants. These methods can be clearly
defined and operationalized to ensure that researchers and IRBs interpret them consistently.

• We support extending the mandate of the proposed standard federal panel
(http://www.federalregister.gov/articles/2011/07/26/2011-18792/human-subjects-research-protections-enhancing-protections-for-research-subjects-and-reducing-burden#p-77). In
addition to reviewing and updating the list of research activities that qualify for Expedited
review on a regular schedule, we propose that the same panel be charged with reviewing and
updating the list of research studies that qualify for Excused status. This would provide a way for
new techniques to be added to the list as they mature and become widely accepted and
understood.

• This panel should also be charged with articulating circumstances under which studies using
these methods should not qualify as Excused due to psychological or information risks. Two such
circumstances might be when particularly vulnerable populations are involved, or when devices
are tested for possible negative physical effects. Use of deception is another circumstance
where the panel might conclude that research using an ‘Excused’ method would warrant a
higher level of review. Deception is occasionally used in human-computer interaction research.
For instance, in order to develop a user interface for a web browser that will adequately warn
users when they are about to visit a fraudulent (“phishing”) web site, it may be necessary to deceive
users into believing they are viewing a legitimate web site when they are not. That said, we
believe studies using deception should be approved under an Expedited review process rather
than Excused.

• This panel is particularly encouraged to begin developing guidelines for research using public
text and other data gathered on the Internet. IRB regulations have not kept up with the vast
increase in research using this kind of public data, and ethical standards vary enormously
between and even within institutions. The issues in this area are numerous and complex, going
well beyond the current ANPRM, and warrant a more concerted effort involving a broad
representation of researchers and stakeholders.
Question 19: This question asks if there should be a brief waiting period, such as one week, before a
researcher may commence research after submitting the one-page registration form, to allow
institutions to look at the forms and determine if some studies should not be Excused. We suggest that
this be made available to institutions as an option, should it suit their needs. It is useful
to give institutions flexibility, but within some short limit, e.g., one week. Without that constraint, they
may feel the need to establish an arbitrarily long waiting period out of an overabundance of caution.
The rules should make clear that this is not the intent.
Question 22: We believe that a retrospective audit mechanism can effectively protect participants. A
properly designed and implemented procedure is unlikely to result in a greater burden for either the
researcher or the institution.
Question 29: This question asks: “… IRBs sometimes engage in activities beyond those that are required
by the regulations. For example, an IRB might review some studies for the purpose of determining
whether or not they qualify for exemption (the new Excused category), or might review studies involving
the analysis of data that is publicly available. Would it be helpful, in furtherance of increased
transparency, to require that each time an IRB takes such an action, it must specifically identify that
activity as one that is not required by the regulations?” We support requiring IRBs to report, in a public
forum, each time they take an action outside of those required by the regulations, in order to reduce
such occurrences to only those cases where there is a strong reason to depart from the regulations.
Expedited Review
Question 7: This question asks which research activities should be added to the published list of
activities that can be used in a study that qualifies for expedited review. We suggest that all activities
covered under the Excused category should be added here.
Informed Consent
Question 35: This question asks: What factors contribute to the excessive length and complexity of
informed consent forms, and how might they be addressed? We suggest that at least two factors
contribute to the complexity and length of informed consent forms. First, institutional caution leads to
the insertion of language that attempts to protect against even very low probability, low risk issues.
Second, it is easier to get approval for a protocol from an IRB if it looks like one they previously
approved, so investigators have an incentive to make as few changes as possible once they have an
approved form. They add sections to address new issues, without removing outdated sections. This
leads to more complex, less readable forms that are easier to get through IRB review. It might help to
provide specific guidance that the form must balance readability and brevity with appropriate
disclosure.
Question 37: The proposed changes appear likely to improve the overall quality of consent forms, and
we support them.
Data Security and Information Protection
In general, we support the proposal to strengthen data protections and reduce information risk. We
support the idea of separating data security issues from the IRB approval process, as IRB experts are not
necessarily data storage experts, nor are they intimately familiar with the evolving potential for
recovering individual identities from de-identified data. Providing a consistent standard that is “scaled
appropriately to the level of identifiability of the data” could improve protections for participants and
reduce redundant review by IRBs. We do note several concerns, though.
Question 59: The question asks in part: Are [HIPAA rules] appropriate not just for studies involving health
information, but for all types of studies, including social and behavioral research? The use of HIPAA as a
model may cause problems for HCI research because two data elements that are treated as personally
identifiable information are URLs and IP addresses. HIPAA does not appear to distinguish between
URLs and IP addresses that could easily identify an individual (e.g., the IP address of a participant’s
computer) and those that would not (e.g., the URL or IP address of a web site). There may be a need for
separate guidelines for anonymization of electronically logged data from non-medical research.
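To illustrate the distinction, consider how electronically logged study data might be anonymized: client IP addresses, which can identify a participant’s machine, are replaced with salted one-way hashes, while visited URLs are reduced to their hostnames, which typically identify only a web site. The following sketch (in Python, with hypothetical field and salt names of our own choosing) is illustrative only and not a proposed standard:

```python
import hashlib
from urllib.parse import urlparse

# Hypothetical per-study salt, stored separately from any released data.
STUDY_SALT = b"per-study-secret"

def pseudonymize_ip(ip: str) -> str:
    """Replace a client IP address (potentially identifying) with a salted
    one-way hash, so records can be linked without revealing the address."""
    return hashlib.sha256(STUDY_SALT + ip.encode()).hexdigest()[:16]

def generalize_url(url: str) -> str:
    """Keep only the hostname of a visited site; paths and query strings
    may embed usernames, search terms, or session tokens."""
    return urlparse(url).hostname or ""

# An illustrative log record before and after anonymization.
record = {
    "client_ip": "192.0.2.17",  # could identify a participant's machine
    "url": "https://example.com/profile/jsmith?q=rare+condition",
}
anonymized = {
    "client_ip": pseudonymize_ip(record["client_ip"]),
    "url": generalize_url(record["url"]),  # -> "example.com"
}
print(anonymized)
```

Guidance along these lines would let investigators retain the site-level information that a blanket HIPAA-style rule would discard, while still removing the elements that could actually identify individuals.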
We anticipate a need for more specific regulation regarding data sharing between research groups. It
seems likely that the rules for data sharing will be harder to write than for data storage and protection.
AOL’s release of inadequately anonymized search data in 2006 illustrates the challenge of truly
de-identifying data for public release. Although it is desirable to share data from HCI research, many studies
do not need to do so, and could be inadvertently limited if data sharing standards and data release
standards are conflated. For that reason, we encourage HHS to clearly distinguish between data
storage/protection and data release in its rulemaking.
Signatures
1. Bill Kules, Assistant Professor, The Catholic University of America, School of Library and Information Science
2. Nathan Bos, Senior Research Associate, Johns Hopkins University Applied Physics Laboratory
3. Scott Klemmer, Associate Professor, Stanford University
4. Marti Hearst, Professor, University of California, Berkeley
5. Ryan Aipperspach, Director of Engineering and Design, GoodGuide.com
6. Cecilia Aragon, Associate Professor, University of Washington
7. Martin Schedlbauer, Clinical Professor, Northeastern University
8. Louise Barkhuus, Dr.
9. Juan Pablo Hourcade, Assistant Professor, University of Iowa
10. Mark W. Newman, Assistant Professor, University of Michigan
11. Dragomir Radev, Professor, University of Michigan
12. Gary Marchionini, Dean and Professor, School of Information and Library Science, UNC-Chapel Hill
13. Loren Terveen, Professor, University of Minnesota
14. Bradley Hemminger, Associate Professor, University of North Carolina
15. Zachary O. Dugas Toups, Dr., TEEX/TCAT Crisis Response Innovative Technologies Lab
16. Thomas A. Finholt, Professor and Associate Dean, School of Information, University of Michigan
17. David Mendonca, Ph.D.
18. Paul Conway, Associate Professor, University of Michigan
19. Terry Winograd, Professor of Computer Science, Stanford University
20. Hong Zhang, Assistant Professor, University of Kentucky
21. James (Bo) Begole, Principal Scientist, Palo Alto Research Center
22. Brad A. Myers, Professor, Carnegie Mellon University
23. Serge Egelman, UC Berkeley
24. Nicholas J. Belkin, Professor, Rutgers University
25. Doug Bowman, Associate Professor, Virginia Tech
26. Jeffrey Bigham, Assistant Professor, University of Rochester
27. Chris North, Associate Professor, Virginia Tech
28. Susanne Jul, President/CEO, Amaryllis Consulting, LLC
29. Jacob O. Wobbrock, Associate Professor, University of Washington
30. Ed H. Chi, Staff Research Scientist, Google Research
31. Josh Tenenberg, Professor, University of Washington
32. Bjoern Hartmann, Assistant Professor, University of California, Berkeley - Computer Science Division
33. Sameer Patil, Postdoctoral Research Fellow, Indiana University
34. Andrea Kavanaugh, Senior Research Scientist, Virginia Tech
35. Harry Hochheiser, Assistant Professor, University of Pittsburgh
36. Mihaela Vorvoreanu, Assistant Professor, Purdue University
37. Joel Ross, PhD Candidate, University of California, Irvine
38. Dan Jurafsky, Professor, Stanford University
39. Clayton Lewis, Professor of Computer Science, University of Colorado
40. David Wagner, Professor, University of California, Berkeley
41. Ben Bederson, Professor, University of Maryland
42. Shumin Zhai
43. Ruy Cervantes, University of California, Irvine
44. Jennifer Golbeck, University of Maryland, Human-Computer Interaction Lab
45. Aditi Shrikumar, Ph.D. Candidate, University of California, Berkeley
46. Eelke Folmer, Assistant Professor, University of Nevada, Reno
47. Derek Hansen, Assistant Professor, Brigham Young University
48. Jon Froehlich, Graduate Student, University of Washington
49. Scott Robertson, Associate Professor of Information and Computer Sciences, University of Hawaii at Manoa
50. Clifford Nass, Professor, Stanford University
51. James D. Hollan, Professor, UCSD
52. Frank Ritter, Prof., Penn State
53. Athena Hoeppner, Electronic Resources Librarian, University of Central Florida
54. Amy Bayes, Human Systems Engineer/Training Systems, Johns Hopkins University Applied Physics Laboratory
55. Janet Deery, Usability Engineer, Johns Hopkins University Applied Physics Laboratory
56. Elle He
57. Razvan Orendovici, Research Assistant, PSU
58. James Hays, Assistant Professor, Brown University
59. Mandy Natter, Senior Professional Staff, Johns Hopkins University Applied Physics Laboratory
60. S. Shyam Sundar, Distinguished Professor, Penn State University
61. Christine Robson, Researcher, IBM Research - Almaden
62. Al Liikkanen, Stanford University
63. Helena Mentis, Microsoft Research
64. Catherine Dumas, PhD student, SUNY Albany
65. James Lin, Software engineer, Google
66. Aaron Houssian
67. Janet Davis, Assistant Professor of Computer Science, Grinnell College
68. Paul Dourish, Professor, University of California, Irvine
69. William Fitzpatrick, Senior Human Factors Scientist, JHU/APL
70. Greg Boone, MA Student, Georgetown University
71. Celine Latulipe, Assistant Professor, UNC Charlotte
72. Wendy Kellogg, Philips Research
73. Rebecca Fiebrink, Assistant Professor, Princeton University
74. Gregory Norcie, Doctoral Student, Indiana University
75. Steve Howard, Professor, The University of Melbourne
76. Jofish Kaye, Senior Research Scientist, Nokia Research Center, Palo Alto
77. Andy Dearden, Professor of Interactive Systems Design, Sheffield Hallam University, UK
78. Christopher Hoadley, Associate Professor and Program Director, New York University
79. Matthew Bietz, Project Scientist, University of California Irvine
80. Matthew Bietz, Assistant Project Scientist, University of California Irvine
81. Yong Ming Kow, Postdoctoral Researcher, UC Humanities Research Institute
82. James Landay, Professor, CSE, University of Washington
83. Alan Wexelblat
84. Shaun Kane, Assistant Professor, University of Maryland Baltimore County
85. Bree McEwan, Assistant Professor, Western Illinois University
86. Lorrie Faith Cranor, Associate Professor, Carnegie Mellon University
87. Heather Leary, Dr., University of Colorado-Boulder
88. A.J. Brush, Senior Researcher, Microsoft Research
89. Ed H. Chi, Staff Research Scientist, Google Research
90. Jeff Huang, PhD student, University of Washington
91. Michael Zarro, Ph.D. Student, Drexel University
92. Julie Kientz, Assistant Professor, University of Washington
93. Michael J Cole, Rutgers University
94. Michael Bernstein, Graduate Student, MIT
95. Andrea Forte, Assistant Professor, Drexel University
96. Francisco Servant, PhD Student, University of California, Irvine
97. Edward Alan Fox, Professor, Virginia Tech
98. Nicholas F. Polys, Director of Visual Computing, Virginia Tech
99. Terry Winograd, Professor of Computer Science, Stanford University
100. Micah Lande, Assistant Professor, Arizona State University
101. Andreas Paepcke, Senior Research Scientist, Stanford University
102. Sara Kiesler, Hillman Professor of Computer Science and Human-Computer Interaction, Carnegie Mellon University
103. Steven Dow, Assistant Professor, Carnegie Mellon
104. Justine Cassell, Director, HCII, CMU
105. Mark D Gross, Professor, Carnegie Mellon
106. Robert Capra, Assistant Professor, University of North Carolina at Chapel Hill
107. Daniel Horn, Ph.D.
108. Maneesh Agrawala, Associate Professor, UC Berkeley
109. Dominik Kaeser, Technical Director, Berkeley Alumnus, Pixar Animation Studios
110. Peter Brusilovsky, Professor, University of Pittsburgh
111. Peggy Chi
112. Nicholas Kong
113. Kimiko Ryokai, Assistant Professor, University of California, Berkeley
114. Manas Mittal
115. Andy Carle, PhD Candidate, University of California, Berkeley
116. Celeste Roschuni, Doctoral Candidate, University of California at Berkeley
117. Gopinaath Kannabiran, UC Berkeley
118. William Ryan
119. Elizabeth Buie, Assistant Professor, Ithaca College
120. Wesley Willett, Ph.D. Candidate, University of California, Berkeley - Computer Science Division
121. Joseph L Gabbard, Assistant Research Professor, Virginia Tech
122. Michael Christel, Research Professor, Carnegie Mellon University
123. Elizabeth Thiry, PhD Candidate, Pennsylvania State University
124. Mudit Mittal, Indiana University
125. Samantha Merritt, Graduate Student, Indiana University
126. Melissa Rodriguez
127. Rachel K. E. Bellamy
128. Karl Fast, Assistant Professor, Kent State University
129. Tarun Gangwani, Indiana University
130. Bonnie E. John, Professor, Carnegie Mellon University
131. Reid Priedhorsky, Postdoc, Los Alamos National Laboratory
132. Alice Agogino, Professor, University of California at Berkeley
133. Justin Donaldson, Dr., Informatics
134. Laura Devendorf, PhD Student, UC Berkeley iSchool
135. Leif Berg, Research Assistant
136. Tapan Parikh, Assistant Professor, UC Berkeley
137. Susan Dray, President/CEO, Dray & Associates, Inc.
138. Scott Hudson, Professor, Carnegie Mellon University
139. Ryan Allen Kirk, HCI Researcher, Iowa State University
140. Haakon Faste, Visiting Assistant Professor
141. Karen Lin
142. Jason Hong
143. Ken Koedinger, Associate Professor, Carnegie Mellon University
144. Matthew Kam, Assistant Professor, Carnegie Mellon University
145. Noboru Matsuda, Systems Scientist, Carnegie Mellon University
146. Daniel P. Siewiorek, University Professor, Carnegie Mellon University
147. Kathleen Surfus
148. William E. Marsh, Ph.D. Candidate, Iowa State University
149. Roni Rosenfeld, Professor, Carnegie Mellon University
150. Jodi Forlizzi
151. Chung-Ching Huang
152. Sanjay Kairam, Graduate Student, Stanford University
153. James Herbsleb, Professor, Carnegie Mellon University
154. Robert Kraut, Herbert A. Simon Professor of Human-Computer Interaction, Carnegie Mellon University
155. Valkyrie Savage, PhD Student, UC Berkeley
156. Eric Paulos, Associate Professor, Carnegie Mellon University
157. Pablo Paredes, PhD Student, UC Berkeley
158. Kurtis Heimerl
159. Mike Kuniavsky, CEO, ThingM Corp
160. Stefan P. Carmien, Staff Scientist, ACM SIGCHI, RESNA
161. Ian Roberts, Interaction Designer
162. Oriana Love, Researcher, UC Berkeley
163. Eric Bell, NLP Scientist
164. Jean Scholtz, Chief Scientist, HCI researcher
165. Russ Burtner, Senior User Experience Research Scientist
166. Michael Madison, Web Application Designer
167. Jennifer A. Rode