Artificial Intelligence for Engineering Design, Analysis and Manufacturing (2009), 23, 221–223. Printed in the USA.
Copyright © 2009 Cambridge University Press 0890-0604/09 $25.00
doi:10.1017/S0890060409000195
GUEST EDITORIAL
Back to the real world: Tangible interaction for design
ELLEN YI-LUEN DO¹ AND MARK D. GROSS²

¹ACME Lab, College of Architecture and College of Computing, Georgia Institute of Technology, Atlanta, Georgia, USA
²Computational Design Lab, School of Architecture, Carnegie Mellon University, Pittsburgh, Pennsylvania, USA

Reprint requests to: Ellen Yi-Luen Do, ACME Lab, College of Architecture and College of Computing, Georgia Institute of Technology, Atlanta, GA 30332-0155, USA. E-mail: ellendo@gatech.edu
1. INTRODUCTION

1.1. What is tangible interaction? Why should we care?

After several decades in which design computing has been almost exclusively the domain of software, today many investigators are building hybrid systems and tools that in one way or another bridge the divide between physical "real-world" artifacts and computational artifacts. On the one hand, the rise of mass customization, rapid prototyping, and manufacturing raises questions about the kinds of software systems and tools that will make these hardware technologies useful in designing. On the other hand, advances in microcontroller and communications technologies have led to a wave of embedding computation in physical artifacts and environments. Both trends are described in Gershenfeld's (1999, 2005) popular books.

Tangible interaction is a growing field that draws technology and methods from disciplines as diverse as human–computer interaction (HCI), industrial design, engineering, and psychology. If the idea of ubiquitous computing (Weiser, 1991) is computation integrated seamlessly into the world in different forms for different tasks, "tangibility" gives physical form and meaning to computational resources and data. The HCI community terms this seamless interaction variously "tangible bits" (Ishii & Ullmer, 1997) or "embodied interaction" (Dourish, 2001). Tangible interaction is simply the coupling of digital information to physical representations. A tangible user interface (TUI) extends the traditional on-screen graphical user interface into the everyday physical world to realize the old goal of "direct manipulation" of data (Shneiderman, 1983).

Aish (1979) and Frazer et al. (1980) were pioneers in tangible interaction for design; both developed instrumented physical objects as input devices for computer-aided design. However, the widespread adoption of the mouse in the first commercial personal computers overshadowed other forms of interacting with computers, such as the pen and TUIs. The development of TUIs based on augmented reality and embedded computing proceeded independently, and it was many years before the HCI community rediscovered these early efforts (Sutphen et al., 2000).

Coupling physical features to digital information can be perceptually mediated by the human senses (sight, touch, smell, etc.), leveraging the affordances of things (blocks are stackable, balls are bouncy) and familiar control mechanisms (squeezing toothpaste out of the tube or turning a knob). Before the days of fast CPUs, programmers could read the changing patterns of lights on the mainframe console (known colloquially as "das Blinkenlights") to help debug their code, an early instance of bringing processes within the machine into the physical realm. More recent tangible interaction research makes virtual objects "graspable" (Fitzmaurice et al., 1995) by using physical "bricks" as handles and letting people "take the digital out" into the real world to manipulate it physically.

Blocks, for example, are a popular physical form for tangible interaction. We rotate Navigational Blocks (Camarata et al., 2002) to query digital content at a tourist spot; flip and rotate blocks to scroll a map on Z-Agon (Matsumoto et al., 2006); stack up Computational Building Blocks (Anderson et al., 2000) to model in three dimensions; or snap together Active Cubes (Watanabe et al., 2004) to interact with the virtual world. Alternatively, consider projection: projection and spatial augmented reality have been employed with multitouch surfaces and computer vision in the reacTable (Jordà et al., 2007) for hands-on musical collaboration, and to bring dinosaurs to life in the Virtual Showcase (Bimber et al., 2003). Coming full circle is the "rich interaction camera" (Frens, 2006), which applies the form, interaction, and functionality of a conventional 35-mm film camera to improve the usability of its digital counterpart.
Since its first issue, AI EDAM has published the best work at the frontiers of engineering design and computing; today, tangible interaction is one of those frontiers. At first glance tangible interaction might seem a mere conceit: interfaces are inherently superficial. However, new input and output modes are the technological force behind tangible interaction, and in the history of computing new input and output modes have yielded big changes: the move from batch processing to personal computing, high-resolution bitmap graphics, and the mouse. Today, embedded computing, low-cost video cameras and projection, novel sensors, and new responsive materials drive new modes of interaction. As this Special Issue shows, these new modes of interaction, in conjunction with previous research on artificial intelligence in engineering design, analysis, and manufacture, have the potential to change how we practice, teach, learn, and investigate engineering design.

This Special Issue of AI EDAM populates the space of hybrid computational–physical systems with articles specifically about the use of tangible interaction in design. The seven articles accepted for this Special Issue are the design cousins of the other tangible interaction work mentioned above. The articles here fall into three categories: the lay of the land (a survey of the field that serves as a lens for examining existing work); teaching with tangibles (tangible interaction in design education); and tangible tools (tangible applications and tools for designing).
2. THE LAY OF THE LAND
In the first article, “Framing Tangible Interaction Frameworks,” Mazalek and van den Hoven take a broad look at
the field of tangible interaction. In earlier days, individual
systems and their particular characteristics defined the field.
As the field matured, researchers continued to build interesting and innovative interfaces, but they also began to identify
frameworks within which to position their work. Now researchers in the field have a broad array of tangible interaction
frameworks to choose from. Mazalek and van den Hoven’s
article orders the diverse array of frameworks that have arisen
over the past decade or so of research on TUIs.
3. TEACHING WITH TANGIBLES
The next two articles offer insights and reflections on tangible
interaction in design education. In “Tangible Interactions in a
Digital Age: Medium and Graphic Visualization in Design
Journals," Oehlberg, Lau, and Agogino examine the changing nature of design journals as digital and tangible media begin to mix. The article describes a study of students' design journals in UC Berkeley's Mechanical Engineering Project course. Some students keep traditional paper-and-pencil journals, whereas others keep their journals in a digital online form; still others adopt a hybrid format. Their study, which spans journals from four years of the project course, developed protocols for analyzing the design content of the journals as well as for coding the use of media.
Shaer, Horn, and Jacob’s article, “Tangible User Interface
Laboratory: Teaching Tangible Interaction Design in Practice," describes the structure and implementation of an introductory university-level course in tangible interaction, drawing on the authors' experience over several years at Tufts
University’s Computer Science Department. They describe
strategic decisions they took in designing the course and
some consequences of these decisions. Their article also includes examples of tangible interaction designs that students
in the course produced.
4. TANGIBLE TOOLS
The last four articles report on tangible tools for design. The first gives a glimpse of how researchers at an industrial laboratory go about prototyping a tangible tool. The second describes a tool that projects video onto physical artifacts to visualize patterns and textures; this provides real-time feedback for collaboration and communication among stakeholders. The third examines the paradigm of projection-based interactive augmented reality to explore opportunities for using such tools. The fourth and last article concerns using tangible tools to foster children's storytelling and movie making.
“Prototyping a Tangible Tool for Design: Multimedia
E-Paper Sticky Notes" by Back, Matsumoto, and Dunnigan
presents a tangible sticky note prototype device with display
and sensor technology. The idea is that people can use tangible e-paper tools to sort, store, and share information with the
combined affordances of physical features and digital handling of visual information. The authors describe the design
and development of different interaction modalities (e.g.,
flipping, sticking, stacking) to control a multimedia memo
system and suggest applications in project brainstorming
and storyboarding for design. The article also shows how to
use today’s off-the-shelf technology to prototype behaviors
of tomorrow’s technology.
In "A Tangible Design Tool for Sketching Materials in Products," Saakes and Stappers describe a tool they built that uses augmented prototyping for ceramics design, projecting two-dimensional imagery onto physical shapes. Their tool enables
ceramics designers to quickly and fluidly experiment with the
surface features and graphic design of ceramic products such
as plates, bowls, and tiles. Their article describes the task the
tool supports, the tool itself, and experiences with master
designers using the tool in practice.
“Analyzing Opportunities for Using Interactive Augmented Prototyping in Design Practice” by Verlinden and Horvath
outlines the prospects for using augmented reality systems
and rapid prototyping in industrial design. They examined
data from three case studies in different domains (the design
of a tractor, an oscilloscope, and a museum interior) from
which they derived a set of hints or guidelines for adopting
interactive augmented prototyping in industrial design.
In “Play-It-By-Eye! Collect Movies and Improvise Perspectives With Tangible Video Objects,” Vaucelle and Ishii
report on a series of iterative design case studies. They describe four video editing systems they built for children that
use tangible interaction in various ways, embedding movie
making as part of play. For each system they describe the
underlying design motivation and the specific implementation, their experience testing the design with children, and
the lessons they learned, which informed the next design
iteration.
5. THE LARGER LANDSCAPE
These seven articles represent a spectrum of different perspectives, yet they hardly cover the entire landscape of tangible interaction for design. For example, we have no representative from “programmable matter” or modular robotics,
research that is sure to have profound implications for tangible interaction and design (Goldstein et al., 2005). Nor have
we a paper from the active community of do-it-yourself engineering, in which the availability of new materials and desktop manufacturing technologies is enabling citizens to design
and build quite complex engineered systems (Buechley et al.,
2008). Despite Gershenfeld’s promise of “things that think”
(1999), most work to date on tangible interaction for design
has little, if any, artificial intelligence. Perhaps a future issue
will include these and other aspects of tangible interaction in
design.
The seven articles here were selected from 17 submissions through two rounds of review. In the first round, all submitted articles were blind reviewed by multiple reviewers. (Each article received at least three reviews, and nine articles received five or
more reviews.) In the second round, the Guest Editors reviewed and edited the revised manuscripts. We thank all of
the authors and reviewers for their diligent work and timely
responses. We also thank Editor-in-Chief Professor David
C. Brown for giving us the opportunity to edit this Special
Issue and Cambridge Senior Project Managing Editor Nancy
Briggs Shearer for taking care of logistics.
REFERENCES
Aish, R. (1979). 3D input for CAAD systems. Computer-Aided Design 11(2),
66–70.
Anderson, D., Frankel, J.M., Marks, J., Agarwala, A., Beardsley, P., Hodgins, J., Leigh, D., Ryall, K., Sullivan, E., & Yedidia, J.S. (2000). Tangible interaction + graphical interpretation: a new approach to 3D modeling. Proc. 27th Conf. Computer Graphics and Interactive Techniques
(SIGGRAPH), pp. 393–402.
Bimber, O., Encarnação, L.M., & Schmalstieg, D. (2003). The virtual showcase as a new platform for augmented reality digital storytelling. Proc.
EuroGraphics Workshop on Virtual Environments (EGVE), pp. 87–95.
Buechley, L., Eisenberg, M., Catchen, J., & Crockett, A. (2008). The LilyPad
Arduino: using computational textiles to investigate engagement, aesthetics, and diversity in computer science education. Proc. 26th Conf. Human
Factors in Computing Systems (SIGCHI), pp. 423–432.
Camarata, K., Do, E.Y.-L., Johnson, B.R., & Gross, M.D. (2002). Navigational blocks: navigating information space with tangible media. Proc.
7th Int. Conf. Intelligent User Interfaces (IUI), pp. 31–38.
Dourish, P. (2001). Where the Action Is: The Foundations of Embodied Interaction. Cambridge, MA: MIT Press.
Fitzmaurice, G.W., Ishii, H., & Buxton, W. (1995). Bricks: laying the foundations for graspable user interfaces. Proc. 13th Conf. Human Factors in
Computing Systems (SIGCHI), pp. 442–449.
Frazer, J.H., Frazer, J.M., & Frazer, P.A. (1980). New developments in intelligent modeling. Proc. Computer Graphics 80, pp. 139–154.
Frens, J.W. (2006). A rich user interface for a digital camera. Personal and
Ubiquitous Computing 10(2–3), 177–180.
Gershenfeld, N. (1999). When Things Start to Think. New York: Henry Holt & Co.
Gershenfeld, N. (2005). FAB: The Coming Revolution on Your Desktop—From Personal Computers to Personal Fabrication. New York: Basic Books.
Goldstein, S.C., Campbell, J.D., & Mowry, T.C. (2005). Programmable matter. IEEE Computer 38(6), 99–101.
Ishii, H., & Ullmer, B. (1997). Tangible bits: towards seamless interfaces between people, bits and atoms. Proc. 15th Conf. Human Factors in Computing Systems (SIGCHI), pp. 22–27.
Jordà, S., Geiger, G., Alonso, M., & Kaltenbrunner, M. (2007). The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. Proc. First Int. Conf. Tangible and Embedded Interaction (TEI), pp. 139–146. Accessed at http://mtg.upf.edu/reactable/
Matsumoto, T., Horiguchi, D., Nakashima, S., & Okude, N. (2006). Z-Agon:
mobile multi-display browser cube. Proc. Extended Abstracts Human
Factors in Computing Systems (SIGCHI), pp. 351–356.
Shneiderman, B. (1983). Direct manipulation: a step beyond programming
languages. IEEE Computer 16(8), 57–69.
Sutphen, S., Sharlin, E., Watson, B., & Frazer, J. (2000). Reviving a tangible
interface affording 3D spatial interaction. Proc. 11th Western Canadian
Computer Graphics Symp., pp. 155–166.
Watanabe, R., Itoh, Y., Asai, M., Kitamura, Y., Kishino, F., & Kikuchi, H.
(2004). The soul of ActiveCube—implementing a flexible, multimodal,
three-dimensional spatial tangible interface. Proc. Int. Conf. Advances
in Computer Entertainment Technology (ACE), pp. 173–180. Accessed
at http://www-human.ist.osaka-u.ac.jp/ActiveCube/
Weiser, M. (1991). The computer for the twenty-first century. Scientific
American 265(3), 94–104.
Ellen Yi-Luen Do is an Associate Professor of design computing and design cognition at Georgia Institute of Technology, with joint appointments in the College of Architecture
and the College of Computing. She is also an affiliate faculty
member at the Health Systems Institute, GVU Center, and the
Center for Music Technology. Dr. Do is interested in building better tools for people, from understanding the human intelligence involved in the design process to improving the interfaces through which people work with computers. Her research explores new interaction modalities, as well as physical and virtual worlds that push the current boundaries of computing environments for design.
Mark D. Gross is a Professor of computational design in the
School of Architecture at Carnegie Mellon University. Dr.
Gross has worked on constraint programming environments
for design, pen-based interaction, sketch and diagram recognition, tangible and embedded interfaces, and computationally enhanced design construction kits.