CHI 2008 Workshop Proceedings
Sunday, April 6th 2008
User Interface Description Languages for
Next Generation User Interfaces
Orit Shaer
Robert J.K. Jacob
Mark Green
Kris Luyten
Tufts University
Tufts University
University of Ontario Institute of Technology
Hasselt University and
Transnationale Universiteit Limburg
www.cs.tufts.edu/~oshaer/workshop/
User Interface Description Languages
for Next Generation User Interfaces
Orit Shaer
Computer Science Department
Tufts University
Medford, MA 02155 USA
oshaer@cs.tufts.edu

Robert J.K. Jacob
Computer Science Department
Tufts University
Medford, MA 02155 USA
jacob@cs.tufts.edu

Mark Green
University of Ontario Institute of Technology
Ontario, Canada
mark.green@uoit.ca

Kris Luyten
Expertise Centre for Digital Media
Hasselt University and transnationale Universiteit Limburg
Diepenbeek, Belgium
kris.luyten@uhasselt.be

Copyright is held by the author/owner(s).
CHI 2008, April 5 – April 10, 2008, Florence, Italy
ACM 1-xxxxxxxxxxxxxxxxxx.

Abstract
In recent years HCI researchers have developed a
broad range of new interfaces that diverge from the
"window, icon, menu, pointing device" (WIMP)
paradigm, employing a variety of novel interaction
techniques and devices. Developers of these next
generation user interfaces face challenges that are
currently not addressed by state of the art user
interface software tools. As part of the user interface
software community's effort to address these
challenges, the concept of a User Interface Description
Language (UIDL) has reemerged as a promising approach.
To date, the UIDL research area has demonstrated
extensive development, mainly targeting multi-platform
and multi-modal user interfaces. However, many open
questions remain regarding the usefulness and
effectiveness of UIDLs in supporting the development of
next generation interfaces.

The aim of this workshop is to bring together both
developers of next generation user interfaces and UIDL
researchers in an effort to identify key challenges
facing this community, to jointly develop new
approaches aimed at solving these challenges, and
finally to consider future spaces for UIDL research.

Keywords
User Interface Description Language (UIDL), User
Interface Management System (UIMS), Next
Generation User Interfaces.
ACM Classification Keywords
H5.m. Information interfaces and presentation
Motivation
In the last decade, new classes of devices for accessing
information have emerged along with increased
connectivity. In parallel to the proliferation of these
devices, new interaction styles have been explored.
Among these new styles are virtual reality, mixed
reality, 3D interaction, tangible user interfaces,
context-aware interfaces and recognition-based
interfaces. As a result of this increasing diversity of
devices and interaction styles, developers of next-generation interfaces experience difficulties such as the
lack of appropriate interaction abstractions, the need to
create different design variations of a single user
interface, and the integration of novel hardware. As part
of the user interface software research community's
effort to address these difficulties, the concept of UIDL,
which has its foundations in user interface management
systems and model-based authoring, has reemerged as
a promising approach. UIDLs allow user interface
designers to specify a user interface, using high-level
constructs, which abstract away implementation
details. UIDL specifications can then be automatically or
semi-automatically converted into concrete user
interfaces or user interface implementations. Several
UIDLs, mostly using XML as the underlying language,
have been developed in recent years in order to
simplify the development of next generation interfaces.
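As a purely illustrative sketch (in a made-up notation, not the syntax of UIML, UsiXML, or any other actual UIDL), such a specification might declare an abstract interactor once and defer its concrete presentation to per-platform mappings:

  <!-- Hypothetical UIDL fragment, illustrative only: an abstract
       "trigger" interactor is declared once; the concrete widget
       used to render it is chosen per platform by separate mappings. -->
  <interface id="mediaPlayer">
    <abstract-ui>
      <interactor id="play" role="trigger">
        <label>Play</label>
        <behavior event="activate" action="startPlayback"/>
      </interactor>
    </abstract-ui>
    <mappings>
      <map interactor="play" platform="desktop" widget="push-button"/>
      <map interactor="play" platform="phone" widget="soft-key"/>
      <map interactor="play" platform="voice" widget="spoken-command"/>
    </mappings>
  </interface>

A generator or run-time interpreter would then resolve each mapping to produce the final user interface for a given target.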
However, despite the advancements demonstrated by
the UIDL research community (see workshop on
developing user interfaces with XML at AVI 2004 [6],
and the adoption of this approach in commercial-level
applications), many questions regarding the usefulness
and effectiveness of UIDLs for next generation user
interfaces remain open: What models are required for
specifying the dynamic behavior of next generation
interfaces that are characterized by continuous,
physical and multi-user interactions? How can UIDLs
be made understandable and useful to user interface
developers from different disciplinary backgrounds?
How should UIDLs be evaluated? What UIDL
approaches will result in powerful design and run-time
services? And finally, how will collaboration between
user interface developers and UIDL researchers affect
the UI architectural framework of the next generation
of user interfaces?
Workshop Goals
The first objective of this workshop is to reach a
common understanding of the UIDL approach, its
potential and shortcomings. The second objective is to
identify a set of common challenges that impact
emerging and future UIDL research by understanding
the perspectives of both user interface developers from
different disciplines and UIDL researchers. During the
workshop, user-interface developers and UIDL
researchers will work together in teams. Each team will
collaborate around an emerging interaction style,
leveraging the members' various perspectives, with the
goal of forming requirements for a UIDL that supports
this interaction style, and proposing a solution that
satisfies these requirements. The strengths and
weaknesses of the various solutions will then be
compared. Together, this research community will
identify common challenges and propose new concepts
to solve them. Our last objective is to consider future
spaces for UIDL research. This will help the UIDL
research community to focus its attention on
supporting the CHI community in its effort to develop
the next generation of user interfaces, as well as to
recognize opportunities for collaboration.
Participants and Expected Community
Interest
A key goal of this workshop is to foster collaboration
between developers of the next generation of user
interfaces and user interface software researchers. In
particular, the workshop will welcome both participants
working in areas such as virtual and augmented
reality, ubiquitous, pervasive, and handheld interaction,
as well as tangible user interfaces, and participants that
are or were involved in an effort to develop, use and
evaluate UIDLs.
A number of workshops were held in recent years on
topics relevant to subgroups of this community: an AVI
2004 workshop on XML-based User Interface
Description Languages [6], a CHI 2005 workshop on The
Future of User Interface Design Tools [9] and, finally, a
CHI 2006 workshop on What is the Next Generation of
Human Computer Interaction? [3]. We believe that the
time is ripe to connect researchers from these areas in
order to identify key challenges facing this community
at large, to jointly develop new approaches aimed
at solving these challenges, and to consider future spaces
for UIDL research.
Background
Historical Roots
In the early 1980s, the concept of a user interface
management system (UIMS) was an important focus
area for the then-forming user interface software
research community [8]. A UIMS allows designers to
specify interactive behavior in a high-level user
interface description language (UIDL) that abstracts the
details of input and output devices. This specification
would be automatically translated into an executable
program or interpreted at run time to generate a
standard implementation of the user interface. The
choice of a UIDL model and methods is a key ingredient
in the design and implementation of a UIMS. The goal
of user interface management systems was not only to
simplify the development of user interfaces but also to
promote consistency across applications as well as the
separation of user interface code from application logic.
However, the standardization of user interface elements
in the late 1980s on the desktop paradigm made the
need for abstractions from input and output devices
mostly unnecessary. In addition, user interface
developers were seeking control of the user interface
look and feel. Thus, although a promising concept, the
UIMS approach has been challenged in practice [8].
Subsequently, in the last decade, as a result of the
proliferation of new devices and interaction techniques,
some of the challenges facing the developers of next
generation user interfaces are similar to those that
faced GUI developers in the early 1980s. Thus, as part
of the user interface software research community's
effort to address these difficulties, the concept of UIDL
reemerged as a promising approach.
Emerging UIDLs
Several UIDLs have been developed in recent years.
Most of them are XML-based. As described in [6], the
goals of these emerging UIDLs are:

• To capture the requirements for a user interface as an abstract definition that remains stable across a variety of platforms.

• To enable the creation of a single user interface design for multiple devices and platforms.

• To improve the reusability of a user interface.

• To support evolution, extensibility and adaptability of a user interface.

• To enable automated generation of user interface code.
To date, we have witnessed extensive development
of UIDLs and frameworks that address the development
of user interfaces for multiple platforms, contexts and
user profiles. Examples include Plastic User Interfaces
[12], UIML [1], XIML [10], UsiXML [5] and the TERESA
XML [7]. However, only a few UIDLs currently address
the development of next generation user interfaces,
supporting interaction styles such as virtual reality
(VR), mixed reality, ambient intelligence and tangible
user interfaces (TUIs): InTml [2] describes VR
applications in a platform-independent and toolkit-independent manner. PMIW [4] describes the structure
of non-WIMP user interfaces while directly capturing
continuous relationships. TUIML [11] draws upon the
PMIW approach and aims at supporting the
development of TUIs while explicitly describing
continuous and parallel interactions.
In this workshop we aim to harness the potential
demonstrated by the UIDL research area in supporting the
development of multi-platform and multi-modal
interfaces to address the challenges facing the
developers of the next generation of user interfaces.
References
[1] Ali, M.F., Perez-Quinones, M.A. and Abrams, M. Building Multi-Platform User Interfaces with UIML. In Seffah, A. and Javahery, H. (eds.), Multiple User Interfaces, John Wiley and Sons, UK, 2004, 95-116.
[2] Figueroa, P., Green, M. and Hoover, H.J. InTml: a description language for VR applications. In Proc. Seventh International Conference on 3D Web Technology, Tempe, Arizona, 2002.
[3] Jacob, R.J.K. What is the Next Generation of Human-Computer Interaction? Workshop abstract, ACM Press, 2006.
[4] Jacob, R.J.K., Deligiannidis, L. and Morrison, S. A Software Model and Specification Language for Non-WIMP User Interfaces. ACM Transactions on Computer-Human Interaction, 1999, 1-46.
[5] Limbourg, Q., Vanderdonckt, J., Michotte, B., Bouillon, L. and Lopez, V. UsiXML: a Language Supporting Multi-Path Development of User Interfaces. In Proc. 9th IFIP Working Conf. on Engineering for Human-Computer Interaction jointly with 11th Int. Workshop on Design, Specification, and Verification of Interactive Systems (EHCI-DSVIS 2004), Hamburg, Germany, 2004.
[6] Luyten, K., Abrams, M., Vanderdonckt, J. and Limbourg, Q. Developing User Interfaces with XML: Advances on User Interface Description Languages. Workshop at Advanced Visual Interfaces (AVI) 2004, Gallipoli, Italy, 2004.
[7] Mori, G., Paternò, F. and Santoro, C. Design and Development of Multidevice User Interfaces through Multiple Logical Descriptions. IEEE Transactions on Software Engineering, 2004, 507-520.
[8] Myers, B., Hudson, S.E. and Pausch, R. Past, Present, and Future of User Interface Software Tools. ACM Transactions on Computer-Human Interaction, 2000, 3-28.
[9] Olsen, D.R. and Klemmer, S.R. The Future of User Interface Design Tools. Workshop at CHI 2005: ACM Conference on Human Factors in Computing Systems, Portland, Oregon, 2005.
[10] Puerta, A. and Eisenstein, J. XIML: A Common Representation for Interaction Data. In Proc. IUI 2002: Sixth International Conference on Intelligent User Interfaces, 2002.
[11] Shaer, O. and Jacob, R.J.K. A Visual Language for Programming Reality-Based Interaction. In IEEE Symposium on Visual Languages and Human-Centric Computing, Doctoral Consortium, 2006.
[12] Thevenin, D., Coutaz, J. and Calvary, G. A Reference Framework for the Development of Plastic User Interfaces. In Multiple User Interfaces, John Wiley & Sons, UK, 2004, 29-49.
Workshop Papers
A. Frameworks and Paradigms
Gaelle Calvary, Joel Coutaz, Lionel Balme,
Alexandre Demeure, Jean-Sebastien Sottet
Université Joseph Fourier
The Many Faces of Plastic User Interfaces
Clemens Klokmose†, Michel Beaudouin-Lafon‡
†University of Aarhus, ‡LRI - Univ. Paris-Sud
From Applications to Ubiquitous Instrumental Interaction
Thomas Pederson†‡, Antonio Piccinno‡, Dipak
Surie†, Carmelo Ardito‡, Nicholas Caporusso‡,
Lars-Erik Janlert†
†Umea University, ‡Universita degli Studi di Bari
Framing the Next-Generation 'Desktop' using Proximity and Human Perception
Erik Stolterman†,Youn-kyung Lim‡
†Indiana University, ‡Korea Advanced Institute of
Science and Technology
A Model of Interaction
B. Requirements and Considerations for Future UIDLs
Alexander Behring, Andreas Petter, Felix
Flentge, Max Mühlhäuser
TU Darmstadt
Towards Multi-Level Dialogue Refinement for User Interfaces
Alexandre Demeure, Gaelle Calvary
University of Grenoble
Requirements and Models for Next Generation UI Languages
Jeff Dicker, Bill Cowan
University of Waterloo
Platforms for Interface Evolution
Michael Horn
Tufts University
Passive Tangibles and Considerations for User Interface Description
Languages
C. UIDLs for Multimodal and Ubiquitous Computing
Mir Ali, Dale Russell, Kibum Kim, Zhuli Xie
Motorola Labs
Dynamic User Interface Creation based on Device Descriptions
Cristian Bogdan†‡, Hermann Kaindl‡, Jürgen
Falb‡
†Royal Institute of Technology (KTH), ‡Vienna
University of Technology
Discourse-based Interaction Design for Multi-modal User Interfaces
Jean-Francois Ladry, Philippe Palanque,
Sandra Basnyat, Eric Barboni, David
Navarre
IRIT University Paul Sabatier
Dealing with Reliability and Evolvability in Description Techniques
for Next Generation User Interfaces
Bruno Dumas, Denis Lalanne, Rolf Ingold
University of Fribourg
Prototyping Multimodal Interfaces with the SMUIML Modeling
Language
Jair Leite, Antonio Cosme
UFRN
XSED: notations to describe status-event ubiquitous computing
systems
Fabio Paternò, Carmen Santoro
ISTI-CNR
UIDLs for Ubiquitous Environments
D. UIDLs for Task Analysis and Virtual Environments
Gerrit Meixner, Nancy Thiels
University of Kaiserslautern
Tool Support for Task Analysis
Volker Paelke
Leibniz Universitaet Hannover, IKG
Spatial Content Models and UIDLs for Mixed Reality Systems
Chris Raymaekers, Lode Vanacken, Joan De
Boeck, Karin Coninx
Hasselt University
High-Level Descriptions for Multimodal Interaction in Virtual
Environments
Chadwick Wingrave
Virginia Tech
Chasm: A Tiered Developer-Inspired 3D Interface Representation
Workshop Participants
Orit Shaer
(Organizer)
Tufts University
oshaer (at) cs.tufts.edu
Robert Jacob
(Organizer)
Tufts University
jacob (at) cs.tufts.edu
Kris Luyten
(Organizer)
Hasselt University and transnationale Universiteit Limburg
kris.luyten (at) uhasselt.be
Mark Green
(Organizer)
University of Ontario Institute of Technology
mark.green (at) uoit.ca
Mir Ali
Motorola Labs
farooqalim (at) gmail.com
Lionel Balme
Université de Grenoble
lionel.balme (at) imag.fr
Michel Beaudouin-Lafon
Univ. Paris-Sud
mbl (at) lri.fr
Alexander Behring
TU Darmstadt
behring (at) tk.informatik.tu-darmstadt.de
Cristian Bogdan
Royal Institute of Technology (KTH)
cristi (at) nada.kth.se
Gaelle Calvary
Université Joseph Fourier, Grenoble
gaelle.calvary (at) imag.fr
Nicholas Caporusso
Università di Bari
ncaporusso (at) gmail.com
Joelle Coutaz
Universite Joseph Fourier, Grenoble
joelle.coutaz (at) imag.fr
Alexandre Demeure
Hasselt University
alexandre.demeure (at) uhasselt.be
Jeff Dicker
University of Waterloo, Canada
jadicker (at) cs.uwaterloo.ca
Bruno Dumas
University of Fribourg
bruno.dumas (at) unifr.ch
Juergen Falb
Vienna University of Technology
falb (at) ict.tuwien.ac.at
Jean-Francois Ladry
IRIT
dgef213 (at) gmail.com
Clemens Klokmose
University of Aarhus
clemens (at) daimi.au.dk
Michael Horn
Tufts University
michael.horn (at) tufts.edu
Denis Lalanne
University of Fribourg
denis.lalanne (at) unifr.ch
Jair Leite
UFRN
jaircleite (at) gmail.com
Youn-kyung Lim
KAIST
younlim (at) gmail.com
Gerrit Meixner
University of Kaiserslautern
meixner (at) mv.uni-kl.de
Jeffrey Nichols
(Program Committee)
IBM Almaden Research Center
jwnichols (at) us.ibm.com
Volker Paelke
Leibniz Universitaet Hannover, IKG
volker.paelke (at) ikg.uni-hannover.de
Fabio Paternò
(Program Committee)
ISTI-CNR
fabio.paterno (at) isti.cnr.it
Thomas Pederson
Umea University, Sweden
top (at) cs.umu.se
Erik Stolterman
Indiana University
estolter (at) indiana.edu
Chris Raymaekers
Hasselt University
chris.raymaekers (at) uhasselt.be
Chadwick Wingrave
Virginia Tech
cwingrav (at) vt.edu
Jamie Zigelbaum
(Guest)
MIT Media Lab
zig (at) media.mit.edu
The Many Faces of Plastic User
Interfaces
Gaëlle Calvary
Université Joseph Fourier
Grenoble Informatics Lab.
BP 53, 38041 Grenoble Cedex 9, France
gaelle.calvary@imag.fr

Joëlle Coutaz
Université Joseph Fourier
Grenoble Informatics Lab.
BP 53, 38041 Grenoble Cedex 9, France
joelle.coutaz@imag.fr

Lionel Balme
Université Joseph Fourier
Grenoble Informatics Lab.
BP 53, 38041 Grenoble Cedex 9, France
lionel.balme@imag.fr

Alexandre Demeure
Université Joseph Fourier
Grenoble Informatics Lab.
BP 53, 38041 Grenoble Cedex 9, France
alexandre.demeure@imag.fr

Jean-Sebastien Sottet
Université Joseph Fourier
Grenoble Informatics Lab.
BP 53, 38041 Grenoble Cedex 9, France
jean-sebastien.sottet@imag.fr

Abstract
In this paper, we discuss the problem of UI adaptation
to the context of use. To address this problem, we
propose to mix declarative languages as promoted in
Model Driven Engineering (MDE) with a "code-centric"
approach where pieces of code are encapsulated as
service-oriented components (SOA), all of this within a
unified software framework that blurs the distinction
between the development stage and the runtime
phase. By doing so, we support UI adaptation where
conventional WIMP parts of a user interface can be
(re)generated from declarative descriptions at the level
of abstraction decided by the designer, and linked
dynamically with hand-coded parts that correspond to
the post-WIMP portions of the UI whose interaction
nuances are too complex to be expressed with a UIDL.
We have experimented with different methods for mixing MDE
with SOA at multiple levels of granularity.

Copyright is held by the author/owner(s).
CHI 2008, April 5 – April 10, 2008, Florence, Italy
ACM 1-xxxxxxxxxxxxxxxxxx.

Keywords
Plastic User Interface, User Interface adaptation,
context-sensitive user interface, Model Driven
Engineering, Service Oriented Architecture, User
Interface Description Language.
ACM Classification Keywords
H5.2. [Information interfaces and presentation (e.g., HCI)]: User Interfaces – User Interface Management Systems (UIMS).

Introduction
With the move to ubiquitous computing, it is
increasingly important that user interfaces (UI) be
adaptive or adaptable to the context of use (user,
platform, physical and social environment) while
preserving human-centered values [3]. We call this “UI
plasticity”. From the software perspective, UI plasticity
goes far beyond UI portability and UI translation.
As discussed in [4], the problem space of plastic UI is
complex: clearly, it covers UI re-molding, which
consists in reshaping all (or parts) of a particular UI to
fit the constraints imposed by the context of use. It
also includes UI re-distribution (i.e. migration) of all (or
parts) of a UI across the resources that are currently
available in the interactive space. UI plasticity may
affect all of the levels of abstraction of an interactive
system, from cosmetic surface-level rearrangements
to deep re-organizations at the functional
core and task levels. When appropriate, UI re-molding
may be concerned with all aspects of the CARE
properties, from synergistic-complementary
multimodality (as in "put-that-there") and post-WIMP
UI's, to mono-modal GUI. Re-molding and re-distribution
should be able to operate at any level of
granularity, from the interactor level to the whole UI,
while guaranteeing state recovery at the user's action
level. Because we are living in a highly heterogeneous
world, we need to support multiple technological
spaces1 simultaneously, such that a particular UI may
be a mix of, say, Tcl/Tk, Swing, and XUL. And all of
this should be deployed dynamically, under the
appropriate human control, by way of a meta-UI [4].

1 "A technological space is a working context with a set of
associated concepts, body of knowledge, tools, required skills,
and possibilities." [5]

Observations
Our approach to the problem of UI plasticity is based on
the following observations:
(1) The software engineering community of HCI has
developed a refinement process that now serves as a
reference model for many tools and methods: from a
task model, an abstract UI (AUI) is derived, and from
there, the Concrete UI (CUI) and the Final UI (FUI) are
produced for a particular targeted context of use. The
process is sound but cannot cope with ambient
computing where task arrangement may be highly
opportunistic and unpredictable.
(2) Software tools and mechanisms tend to make a
dichotomy between the development stage and the
runtime phase, making it difficult to articulate run-time
adaptation based on semantically rich design-time
descriptions. In particular, the links between the FUI
and its original task model are lost. As a result, it is
very hard to re-mold a particular UI beyond the
cosmetic surface.
(3) Pure automatic UI generation is appropriate for
simple (not to say simplistic, “fast-food”) UI’s. As
mentioned above, the nuances imposed by high-quality
multi-modal UI's and post-WIMP UI's call for powerful
specifications whose complexity might be as high as
programming the FUI directly with the appropriate
toolkit. In addition, conventional UI generation tools
are based on a single target toolkit. As a result, they
are unable to cross multiple technological spaces.
(4) Software adaptation has been addressed using
many approaches over the years, including Machine
Learning, Model-Driven Engineering (MDE), and
service-oriented components. These paradigms have
been developed in isolation and without paying
attention to UI-specific requirements. Typically, a
“sloppy” dynamic reconfiguration at the middleware
level is good enough if it preserves system autonomy.
It is not “observable” to the end-user whereas UI remolding is! Thus, UI re-molding adds extra constraints
such as making explicit the transition between the
source and the target UI’s so that, according to
Norman, the end-user can evaluate the new state.
Based on these observations, we propose the following
key principles that we have put into practice using a
combination of MDE and SOA. The exploration of
Machine Learning is under way.
Three Principles for UI Plasticity
Principle #1: Close-adaptiveness must cooperate with
open-adaptiveness. By design, an interactive system
has an “innate domain of plasticity”: it is close-adaptive
for the set of contexts of use for which this
system/component can adapt on its own. In ubiquitous
computing, unplanned contexts of use are unavoidable,
forcing the system to go beyond its domain of
plasticity. Then the interactive system must be
open-adaptive so that a tier infrastructure can take over the
adaptation process. The functional decomposition of
such an infrastructure is described in [1].
Principle #2: An interactive system is a set of graphs of
models that express different aspects of the system at
multiple levels of abstraction. These models are related
by mappings and transformations, which in turn, are
models as well. As a result, an interactive system is not
limited to a set of linked pieces of code. The models
developed at design-time, which convey high-level
design decisions, are still available at runtime for
performing deep, rationale-informed adaptation. In addition,
considering transformations and mappings as models is
proving very effective for controlling the adaptation
process [6].
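As a minimal sketch of what this principle implies (the element names below are ours, not those of an actual framework), the mapping between the task model and the abstract UI is itself a first-class model that an adaptation engine can inspect and rewrite at runtime:

  <!-- Hypothetical notation: tasks, abstract UI spaces, and the
       mapping between them are all models; the mapping carries its
       own rationale so the adaptation process can reason about it. -->
  <interactiveSystem id="photoBrowser">
    <taskModel>
      <task id="browse"/>
      <task id="annotate"/>
    </taskModel>
    <abstractUI>
      <space id="browseSpace"/>
      <space id="annotateSpace"/>
    </abstractUI>
    <mappingModel id="tasksToAUI">
      <mapping source="browse" target="browseSpace" rationale="primary task"/>
      <mapping source="annotate" target="annotateSpace" rationale="secondary task"/>
    </mappingModel>
  </interactiveSystem>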
Principle #3: By analogy with the slinky meta-model of
Arch, increasing principle #1 allows you to decrease
principle #2 and vice-versa. At one extreme, the
interactive system may exist as one single task model
linked to one single AUI graph, linked to a single CUI
graph, etc. This application of Principle #1 indeed does
not leave much flexibility to cope with unpredictable
situations unless it relies completely on a tier
infrastructure that can modify any of these models on
the fly, then trigger the appropriate transformations to
update the Final UI. This is the approach we have
adopted for MARI [7]. In its current implementation,
MARI provides a reasonable answer to Observations
#1, #2, and #3: (a) the “fast-food” UI portions are
generated from a task model. The corresponding AUI
and CUI are automatically generated and maintained by
MARI: they are not imposed on the developer; (b) In
addition, hand-coded service-oriented components can
be dynamically plugged and mapped to sub-tasks
whose UI cannot be generated by transformations.
MARI has been applied to develop a photo browser
that includes an augmented multi-touch table.
Alternatively, the various perspectives of the system
(task models, AUI, FUI, context model, etc.) as well as
the adaptation mechanisms may be distributed across
distinct UI service-oriented components, each one
covering a small task grain that can be run in different
contexts of use. We have adopted this approach to
implement the Comet toolkit [2]. Basically, a Comet is
a plastic micro-interactive system whose architecture
pushes forward the separation of concerns advocated
by PAC and MVC. The functional coverage of a comet is
left open (from a plastic widget such as a control panel,
to a complete system such as a PowerPoint-like slide
viewer). Each Comet embeds its own task model, its
own adaptation algorithm, as well as multiple CUI’s and
FUI’s, each one adapted to a particular context of use.
FUI’s are hand-coded possibly using different toolkits to
satisfy our requirements for fine-grained
personalization and heterogeneity (Observation #3).
From the infrastructure point of view, a Comet is a
service that can be discovered, deployed and integrated
dynamically into the configuration that constitutes an
interactive environment.
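To make the Comet idea more concrete, the hypothetical descriptor below sketches how such a plastic micro-interactive system might advertise its embedded task model, adaptation policy, and hand-coded FUIs for different contexts of use (the element names are ours, not the actual Comet toolkit interface):

  <!-- Hypothetical Comet descriptor: one small task grain with its
       own adaptation policy and several hand-coded FUIs, each bound
       to a context of use and possibly a different toolkit. -->
  <comet id="controlPanel" taskGrain="adjust-settings">
    <taskModel ref="adjustSettingsTask"/>
    <adaptation policy="local-first" fallback="tier-infrastructure"/>
    <fui context="desktop" toolkit="Swing" impl="ControlPanelSwing"/>
    <fui context="pda" toolkit="Tcl/Tk" impl="controlpanel.tcl"/>
    <fui context="wall-display" toolkit="OpenGL" impl="ControlPanelGL"/>
  </comet>

Because such a descriptor travels with the component, an infrastructure can discover a Comet as a service and select the FUI matching the current context of use.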
Conclusion
The community has a good understanding about the
nature of the meta-models for describing the high-level
aspects of plastic interactive systems (e.g., task and
domain-dependent concepts). Blurring the distinction
between the design and the runtime phases provides
humans (e.g., designers, installers, end-users) with full
potential for flexibility and control. On the other hand,
we still fall short at describing fine-grained post-WIMP
multimodal interaction and at supporting situations that
could not be predicted at design time. For these cases,
we claim that hand-coding and a tier service-oriented
infrastructure are unavoidable. This is what we have
done with MARI and the Comets, two very different
ways of applying our principles for UI plasticity.
Acknowledgements
This work has been partly supported by Project EMODE
(ITEA-if4046) and the NoE SIMILAR (FP6-507609).
References
[1] Balme, L., Demeure, A., Barralon, N., Coutaz, J.,
Calvary, G.: CAMELEON-RT: A Software Architecture
Reference Model for Distributed, Migratable, and Plastic
User Interfaces. In Proc. EUSAI 2004, LNCS Vol. 3295,
Springer-Verlag (Publ.) (2004), 291-302.
[2] Calvary, G., Coutaz, J., Dâassi, O., Balme, L.,
Demeure, A. Towards a new generation of widgets for
supporting software plasticity: the “comet”. In Proc. of
Joint EHCI-DSV-IS (2004).
[3] Calvary, G., Coutaz, J., Thevenin, D. A Unifying
Reference Framework for the Development of Plastic
User Interfaces. In Proc. Engineering HCI’01, Springer
Publ., LNCS 2254 (2001), 173-192
[4] Coutaz, J. Meta-User Interface for Ambient Spaces.
In Proc. TAMODIA’06, Springer LNCS (2006), 1-15.
[5] Kurtev, I., Bézivin, J. & Aksit, M. Technological
Spaces: an Initial Appraisal. In Proc. CoopIS, DOA'2002
Federated Conferences, Industrial track, Irvine (2002).
[6] Myers, B., Hudson, S.E. & Pausch, R.: Past,
Present, and Future of User Interface Software Tools.
TOCHI, ACM Publ., Vol 7(1), (2000) 3-28
[7] Sottet, J.S., Calvary, G., Coutaz, J., Favre, J.M. A
Model-Driven Engineering Approach for the Usability of
Plastic User Interfaces. In Proc. Engineering Interactive
Systems 2007 (DSV-IS 2007), Springer.
From Applications to
Ubiquitous Instrumental Interaction

Clemens Nylandsted Klokmose
Department of Computer Science, University of Aarhus
Åbogade 34, 8200 Århus N, Denmark
clemens@daimi.au.dk

Michel Beaudouin-Lafon
LRI, Univ. Paris-Sud
Bât 490 - Orsay, F-91405, France
mbl@lri.fr

Abstract
This paper shows the limitations of the current
application-centric approach to user interfaces when
considering interaction in Ubicomp environments, and
presents an alternative paradigm based on
instrumental interaction. The paper then discusses the
elements of a description language for ubiquitous
instrumental interaction and outlines some challenges
related to the sharing and distribution of instruments.

Keywords
HCI, ubiquitous computing, instrumental interaction,
description languages

ACM Classification Keywords
H.5 Information interfaces and presentation

Copyright is held by the author/owner(s).
CHI 2008, April 5 – April 10, 2008, Florence, Italy
ACM 1-xxxxxxxxxxxxxxxxxx.

Introduction
Ubiquitous interaction – interaction in pervasive,
ubiquitous, tangible or ambient computing
environments, including interaction with multiple,
dynamic, and distributed interfaces – is an area where
there currently is limited coherent theory to guide
design. Unlike the desktop metaphor for office work,
there is as yet no well-established metaphorical
approach or conceptual model to guide the design of
ubiquitous interaction. Ubiquitous interaction challenges
the traditional assumptions of one device / one
interface / one user in many ways: multiple users can
interact with multiple devices through a variety of
interfaces, including interfaces spanning multiple
devices. The WIMP1 paradigm and the desktop
metaphor do not scale to these new situations.

1 Windows, Icons, Menus and Pointing

In addition, the software tools used for creating
interfaces are tightly bound to the platform hosting
them and to the WIMP interaction style. These tools
typically do not support the multiplicity, dynamism,
heterogeneity and distribution that characterize
Ubiquitous Interaction, making it particularly difficult to
develop interfaces for Ubicomp environments.
An Instrumental Paradigm
One of the central goals in creating interfaces for
Ubicomp environments is to support fluid interaction in
distributed interfaces and interaction in dynamic
configurations of interfaces distributed across
stationary and mobile devices. Two major challenges
that we want to address are: How to support the reuse
and the quality of learning across different devices [4],
and how to technically support the continuity and
distribution of work across multiple devices.
We believe that one approach to address this problem
is to deconstruct applications rather than simply create
scaled-down versions of PC applications to run on, e.g.,
Personal Digital Assistants (PDAs). Scaling down
applications leads to many problems: the tool has to be
implemented on each device, an exact copy of
functionality is hard to achieve, and the user has to
learn to use the alternate implementation. Moreover,
the user must rely on the vendor to actually support
the tools he needs.
The notion of application is indeed very artificial, with
no direct equivalent in the real world. While
applications can be seen as collections of tools
dedicated to a certain task in the physical world, such
as the architect’s or the painter’s tools, applications
lack the dynamics of such collections. A painter can
freely add or remove brushes from his collection, pass
them around, etc., but a brush in a drawing application
can rarely be removed and used in another context.
Instead, applications typically have a predefined set of
tools that is difficult or impossible to reconfigure to
adapt to one’s taste. This lack of flexibility limits the
mobility, distribution and customizability of interfaces.
It also typically results in large and complex
applications, built for general-purpose personal
computers, with dozens and sometimes hundreds of
tools to cover all possible needs. Such applications are
not suitable for smaller devices, or devices with
different kinds of inputs. This in turn creates the need
for specific versions of the applications for different
devices, which might be radically different across
different platforms and technologies.
By contrast, the physical tools we use in the real world
may have limited properties, but they can be put to
unlimited uses and are usually easy to understand and
adapt to one's needs. For example, almost any surface,
not just paper, affords being written on with a pen.
Collections of physical tools can assist in achieving
complex goals, and they can be split or recomposed to
address specific situations. For example, the architect
can bring the drawing, a pencil and an eraser with him
to a meeting. He can use the same pencil from the
drawing table to write his grocery list, or he can use the
pen with which he signed the check at the restaurant to
draw a sketch on a paper napkin. In other words the
tools are not restricted to a specific domain, they can
be used across domains – and across people.
Letting the nature of the physical world’s relationships
between tools and objects be the ultimate goal would
be naïve and overly idealistic. Nevertheless the kind of
implicit relationships between us, our tools and the
objects we interact with, which Gibson [5] calls
affordances, can be approximated through the concept
of Instrumental Interaction [2]. In this model, the
coupling between tools and objects is not type-based,
e.g. a given application for a given filetype; instead, it is
property-based, mapping one or more instruments to
each object property or set of properties.
Grace is a graphics designer in a
small advertising bureau, and is
presenting a poster she has
developed on her workstation for
a client at the client’s office.
When seeing the poster on print
the client asks if it would be
possible to create flyers matching
the design. Grace asks for a
minute and while the client is
watching she rearranges, trims
and scales the poster into a flyer
on her PDA using a subset of the
tools she had used on the
workstation and brought with her.
The client is satisfied and Grace
returns to her office workstation
to give the finishing touches to
the flyer.
Textbox 1. Interaction Scenario
Beaudouin-Lafon [2] introduced Instrumental
Interaction to model WIMP and post-WIMP interfaces.
The key idea is a conceptual separation between
physical instruments (the input devices), logical
instruments and domain objects. Logical instruments
are software mediators or two-way transducers
between the user and domain objects. The user acts on
the instrument, which transforms the user's actions into
commands that affect the relevant target domain
objects and provide feedback to the instrument.
Inspired by this separation between instrument and
domain object and by the notion that interaction is
mediated by instruments, we suggest extending
Instrumental Interaction to a general paradigm for
ubiquitous and distributed interaction. We suggest an
approach where functionality is implemented by
instruments with limited properties, decoupled from the
specific types of domain objects they operate on. The
implicit relations between the properties of the object
and the instruments operationalize, in a sense, Gibson’s
affordances. As a result, an instrument can operate on
an object even if it has not been explicitly designed for
it. For example, an object that has the property of
being a 2D surface can be used by any instrument that
operates on a 2D surface.
We further suggest decoupling the physical
instruments – the input devices – from the logical
instruments. This decoupling is necessary to support
distribution of instruments or tools across devices. The
logical instrument is not bound to a specific set of input
devices but rather to any input device that has the
properties described by the logical instrument, such as
providing 2D input vs. text input.
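As a sketch of what such a decoupled description could look like (in a notation we invent here for illustration), a logical instrument would declare the input it needs and the object properties it can operate on, leaving the binding to concrete devices and objects to a separate matching step:

  <!-- Hypothetical logical instrument description: the paintbrush
       needs any device supplying 2D coordinates, and can operate on
       any object exposing a "2d-surface" property. -->
  <instrument id="paintbrush">
    <needs>
      <input kind="2d-coordinates"/>
      <input kind="button" optional="true"/>
    </needs>
    <capabilities>
      <operatesOn property="2d-surface" action="paint"/>
      <operatesOn property="color" action="select"/>
    </capabilities>
  </instrument>

A mouse, a stylus, or a touch surface would all satisfy the 2d-coordinates need, and any canvas-like object would satisfy the 2d-surface property.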
To give an example of the kind of interaction we
imagine, consider the simple scenario in textbox 1 to
the left. The scenario illustrates the fluidity of
interaction when using instrumental interaction in an
Ubicomp environment.
The decomposition of user interfaces into simpler
commands is not new. Jacob [6] describes, in one of
the initial attempts at creating UIDLs for modern direct
manipulation interfaces, interaction in direct
manipulation systems as composed by a set of simple
dialogues. Beaudouin-Lafon [2] emphasizes mediation
rather than direct engagement with the system,
inspired by activity theory [3]. A mediated relationship
between the user and the objects of interest, in
contrast to dialog-based interaction, is particularly
attractive for ubiquitous interaction since the notion of
system or computer – the other participant in the
dialogue – becomes blurry and changing. The notion of
mediation is also useful when dealing with detached
instruments that act on different objects in different
contexts.
Descriptive Challenges in Realization
The dynamic and distributed nature of ubiquitous
interaction requires both descriptions and state of
components to be easily shared among devices. Hence
high-level description languages seem to be an
appropriate approach.
We now give an overview of the descriptive challenges
in implementing the ubiquitous instrumental interaction
that we have outlined. The challenges can roughly be
divided into object, need/capability mapping and
state/behavior descriptions.
The components that need descriptions are the objects
of interest, the instruments, the devices (resources and
i/o), and the representations of the objects or views.
First, a language is needed to describe the object
model (general or domain specific). This is a fairly
common and well-understood problem.
Second, we need to describe (logical) instruments:
their input needs, e.g., text input vs. 2D coordinates,
and their capabilities for manipulating certain properties
of the domain objects. The needs of the instruments
will have to match the capabilities of the input devices,
e.g. a mouse for 2D coordinates. The behavior of the
instrument could be described using state machines as
in Jacob's dialogue descriptions [6] or à la SwingStates
[1]. An interesting challenge is to support distributed
interactions, e.g., a paintbrush instrument with the
canvas on one device and the palette on another, which
requires sharing or distributing the state machines
among the interested parties, in a dynamic way.
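As a minimal sketch of such a behavior description, loosely inspired by Jacob's state-machine dialogues and SwingStates but written in a notation of our own invention:

  <!-- Hypothetical state machine for the paintbrush instrument; in a
       distributed setting, the palette and canvas devices would need
       a shared or synchronized copy of this machine. -->
  <behavior instrument="paintbrush" initial="idle">
    <state id="idle">
      <transition event="button-press" target="painting"/>
    </state>
    <state id="painting">
      <transition event="move" target="painting" action="paintAt(x,y)"/>
      <transition event="button-release" target="idle"/>
    </state>
  </behavior>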
Third, we need to describe views, taking into account
the possible migration of objects among surfaces
managed by different devices. Finally, on top of the
description of the components, a language is required
to describe the various mappings between instruments
and input devices, between instruments and objects
and between views and output devices.
Conclusion
This article has given a brief overview of ubiquitous
instrumental interaction and has outlined the elements
of a description language. We have illustrated some
non-trivial challenges arising from migrating and
sharing instruments across devices. A first prototype is
being developed in the first author's lab in order to
experiment with the concept of instrumental interaction
in an Ubicomp environment. We will also use this
prototype to assess the requirements for tools and
languages to develop such interfaces.

Acknowledgements
We thank Susanne Bødker, Olav Bertelsen and Pär-Ola
Zander for valuable input and discussions and give
acknowledgements to Rasmus Berlin for being the
critical programmer.
References
[1] Appert, C. and Beaudouin-Lafon, M. SwingStates: Adding State Machines to the Swing Toolkit. In Proceedings of the ACM Symposium on User Interface Software and Technology (Montreux, Switzerland, October 16-18, 2006), UIST '06, ACM Press, New York, NY, 319-322.
[2] Beaudouin-Lafon, M. Instrumental interaction: an interaction model for designing post-WIMP user interfaces. In Proc. CHI 2000, ACM Press (2000).
[3] Bertelsen, O. W. and Bødker, S. HCI Models, Theories, and Frameworks: Toward an Interdisciplinary Science, chapter Activity Theory. Morgan Kaufmann Publishers (2003).
[4] Brodersen, C., Bødker, S. and Klokmose, C. N. Quality of Learning in Ubiquitous Interaction. In Proc. European Conference on Cognitive Ergonomics (ECCE 2007), Covent Garden, London, UK, August 28-31, 2007.
[5] Gibson, J. J. The Ecological Approach to Visual Perception. Lawrence Erlbaum Associates, New Jersey, USA (1979).
[6] Jacob, R. J. A specification language for direct-manipulation user interfaces. ACM Trans. Graph. 5, 4 (Oct. 1986), 283-317.
!
!"#$%&'()*+(,+-)./+&+"#)%0&(12+34)056(
73%&'(8"0-%$%)9(#&:(;7$#&(8+"<+5)%0&
!
!
/0%1.2*3-&-$2%"456*
V
*
H#,.A!&4!7&+,-./'0!T*/#'*#!
<.$1-=%*7$&)#%*
!"#$%&'(#)%"*
H/,A!2/!8'4&$+(./*(!
"#$%&'()!*&+,-./'01!('2!.3#$#4&$#!5-+('67&+,-.#$!
8'.#$(*./&'!9578:1!/%!;#*&+/'0!(!%#(+)#%%)<!/'.#0$(.#2!
,($.!&4!#=#$<2(<!(*./=/.<!2&>'!.&!.3#!,&/'.!>3#$#!
?*&+,-./'0@!/%!/'%#,($(;)#!4$&+!?(*./=/.<@A!B!+&2#))/'0!
,$&;)#+!&**-$%!/'!.3#%#!#+#$0/'0!+&;/)#!('2!
-;/C-/.&-%!*&+,-./'0!%/.-(./&'%!;#*(-%#!/.!/%!3($2!.&!
2#.#$+/'#!.3#!%,(./()!('2!&,#$(./&'()!)/+/.%!&4!('!
&'0&/'0!(*./=/.<1!4&$!.3#!3-+('!,#$4&$+/'0!.3#!(*./=/.<1!
4&$!.3#!*&+,-.#$!%<%.#+!+&'/.&$/'0!('2D&$!%-,,&$./'0!
/.1!(%!>#))!(%!4&$!.3#!+&2#))#$!&;%#$=/'0!/.A!B)%&1!/.!/%!('!
&,#'!/%%-#!3&>!;#%.!.&!+&2#)!.3#!*(-%()!$#)(./&'%!
;#.>##'!,3<%/*()!9$#()!>&$)2:!('2!=/$.-()!92/0/.()!>&$)2:!
,3#'&+#'(!.3(.!.3#%#!?/'.#))/0#'.!#'=/$&'+#'.%@!*('!
;#!,$&0$(++#2!.&!+(/'.(/'1!>3#.3#$!2#4/'#2!;<!
%&4.>($#!#'0/'##$%!&$!.3#!#'26-%#$%!.3#+%#)=#%A!E#!
,$&,&%#!(!+&2#)/'0!4$(+#>&$F!.3(.!(2$#%%#%!.3#!(;&=#!
+#'./&'#2!/%%-#%!('2!,$#%#'.!&-$!/'/./()!(..#+,.%!.&!
*$#(.#!(!G%#$!8'.#$4(*#!H#%*$/,./&'!I('0-(0#!9G8HI:!
;(%#2!&'!.3#!4$(+#>&$FA!
G+#X!G'/=#$%/.<!
G'/=#$%/.[!2#0)/!T.-2/!2/!\($/!!
TKYNVPO!G+#X!
ONVM]!\($/!
T>#2#'!
8.()<!
.&,Z*%A-+-A%#!
($2/.&Z2/A-'/;(A/.!
M
!
H/,A!2/!8'4&$+(./*(!
G'/=#$%/.[!2#0)/!T.-2/!2/!\($/!!
>)(0%=.2*<.9%$'22%*
ONVM]!\($/!
H/,A!2/!8'4&$+(./*(!
8.()<!
G'/=#$%/.[!2#0)/!T.-2/!2/!\($/!!
,#2#$%&'Z2/A-'/;(A/.!
ONVM]!\($/!
!
8.()<!
7"#%")%*3)(()""%*
*(,&$-%%&Z2/A-'/;(A/.!
H/,A!2/!8'4&$+(./*(!
!
G'/=#$%/.[!2#0)/!T.-2/!2/!\($/!!
?.$2@+$):*A."=-$#**
ONVM]!\($/!
H#,.A!&4!7&+,-./'0!T*/#'*#!
8.()<!
G+#X!G'/=#$%/.<!
,/**/''&Z2/A-'/;(A/.!
TKYNVPO!G+#X!
!
T>#2#'!
8)9.:*;'$)-*
*
)#LZ*%A-+-A%#!
H#,.A!&4!7&+,-./'0!T*/#'*#!
G+#X!G'/=#$%/.<!
+,%(-"#$)(*!"#-$.(#)%"*
TKYNVPO!G+#X!
J3#!K0&*#'.$/*!8'.#$(*./&'!,#$%,#*./=#!&'!578!3(%!/.%!
;#0/''/'0%!/'!&;%#$=/'0!.3#!/'.#$(*./&'!,&%%/;/)/./#%!('2!
)/+/.(./&'%!,$#%#'.!($&-'2!(!%,#*/4/*!3-+('!;&2<!('2!
+/'2!/++#$%#2!/'!(!>&$)2!4-))!&4!,3<%/*()!9$#()6>&$)2:!
('2!=/$.-()!9*&+,-.(./&'():!&;L#*.%!&4!,&.#'./()!
7&,<$/03.!/%!3#)2!;<!.3#!(-.3&$D&>'#$9%:A!
T>#2#'!
758!MNNO1!B,$/)!MP!Q!R(<!S1!MNNO1!T('!U&%#1!GTB!
2/,(FZ*%A-+-A%#!
B7R!V6WWWWWWWWWWWWWWWWWWA
!
!
!
/'.#$#%.A!8',-.!('2!&-.,-.!2#=/*#%!($#1!L-%.!)/F#!.3#!
*&+,-.#$!3($2>($#!/.%#)41!*&+,)#.#)<!/0'&$#2!/'!.3/%!
*&'*#,.-()!4$(+#>&$F!%/+/)($!.&!3&>!.3#!*3(/$!-%#2!;<!
(!.<,/*()!2#%F.&,!"7!-%#$!/%!/0'&$#2!/'!*)(%%/*()!578!
+&2#)%A!J3/%!2#)/;#$(.#!2/%$#0($2!4&$!/'.#$(*./&'!
/'4$(%.$-*.-$#!())&>%!.3#!+&2#)/'0!&4!,3<%/*()!('2!
=/$.-()!&;L#*.%!(%!/4!.3#<!*&6#W/%.#2!/'!.3#!%(+#!
K-*)/2#('!%,(*#A!E#!>&-)2!($0-#1!>/.3&-.!,$&&41!.3(.!
.3/%!+/03.!;#!=#$<!*)&%#!.&!3&>!2&+(/'!#W,#$.%!%##!
.3#/$!*-$$#'.!#'=/$&'+#'.!/'!.3#!%/.-(./&'!>3#'!'&!
-'#W,#*.#2!;$#(F2&>'%!9.#*3'&)&0/*()!&$!&.3#$:!&**-$!
^_`A!E3/)#!$#%#($*3!/'!/+,$&=/'0!('2!#=&)=/'0!*)(%%/*()!
/',-.!('2!&-.,-.!/'4$(%.$-*.-$#!.&!4/.!.3#!'#>!*&+,-./'0!
(,,)/*(./&'%!%-$#)<!3(%!/.%!,)(*#1!>#!;#)/#=#!.3(.!
0$#(.#$!(2=('*#!*('!;#!+(2#!;<!$#.3/'F/'0!.3#!$&)#!&4!
,#$%&'()!*&+,-./'0!/.%#)4A!
!"#$%&#'()*+,$-*.$/#0)12#3#.4)5*3$6'#.$73)#.-48#'9$
J3#!%/.-(./=#!%,(*#!+&2#)!9B),'$-*4:!/%!;(%#2!&'!.3#!
,3<%/*()6=/$.-()!2#%/0'!,#$%,#*./=#!;$/#4)<!&-.)/'#2!
(;&=#1!+#('/'0!.3(.!,3<%/*()!('2!=/$.-()!2&+(/'!
&;L#*.%!($#!.$#(.#2!(%!;#/'0!)&*(.#2!/'!.3#!%(+#!%,(*#A!
J3#!+&2#)!/%!4&$!.3#!#+#$0/'0!K0&*#'.$/*!8'.#$(*./&'!
,($(2/0+!>3(.!.3#!=/$.-()!2#%F.&,!/%!4&$!.3#!"7DE8R"!
/'.#$(*./&'!,($(2/0+a!+&$#!&$!)#%%!#=#$<.3/'0!&4!
/'.#$#%.!.&!(!%,#*/4/*!3-+('!(*.&$!/%!(%%-+#2!.&1!('2!
%-,,&%#2!.&1!3(,,#'!3#$#A!
B),'$-*4C!B!%/.-(./=#!%,(*#!
+&2#)!^M`A!
:3$;04<+=#$*-$:++=>53?$)"#$@*A#=$)*$43$:8)*.B'$
C5)D4)5*3E$
84!(!0)(%%!&4!L-/*#!/%!/'!.3#!$/03.!3('2!&4!(!%,#*/4/*!
3-+('!(*.&$1!('2!('!#+(/)!2/%,)(<#2!&'!(!*#))-)($!
,3&'#!/'!.3#!)#4.!3('2!9$#(2<!.&!=/#>:1!;&.3!&;L#*.%!
>&-)2!;#!*&'%/2#$#2!.&!$#%/2#!/'!.3#!&;L#*.!
+('/,-)(./&'!%,(*#!/'!B),'$-*4A!B!,(,#$!'#>%,(,#$!&'!
.3#!.(;)#!L-%.!/'!4$&'.1!('2!.3#!F#<%!/'!.3#!%(+#!
M!
,#$%&'%!,&*F#.!>&-)2!;#!+&2#))#2!(%!/'%/2#!.3#!
+('/,-)(;)#!%,(*#!;-.!&-.%/2#!.3#!&;L#*.!+('/,-)(./&'!
%,(*#A!B!,(/'./'0!&'!.3#!&,,&%/.#!%/2#!&4!.3#!.(;)#!9;-.!
'&.!.3#!&'#!;#3/'2!.3#!(*.&$b%!;(*F:!>&-)2!;#!/'!.3#!
&;%#$=(;)#!%,(*#A!c/'())<1!())!/'!,$/'*/,)#!,#$*#/=(;)#!
&;L#*.%!/'!.3#!,3<%/*()6=/$.-()!>&$)2!>3/*3!(.!)#(%.!4&$!
.3#!+&+#'.!2&!'&.!3(,,#'!.&!;#!,#$*#/=(;)#!;<!.3#!
%,#*/4/*!3-+('!(*.&$!($#!$#0($2#2!(%!%/.-(.#2!/'!.3#!
>&$)2!%,(*#1!#'*&+,(%%/'0!.3#!%,(*#%!+#'./&'#2!
#($)/#$A!
8'!.3#!,3<%/*()!>&$)21!.3#!&-.#$!;&$2#$!&4!.3#!
+('/,-)(;)#!%,(*#!*('!;#!(,,$&W/+(.#2!('2!2#%*$/;#2!
/'!K-*)/2#('!.#$+%a!+('/,-)(;)#!.3/'0%!($#!.<,/*())<!
*)&%#$!.3('!.3/'0%!.3(.!($#!&;%#$=(;)#!;-.!'&.!
(+#'(;)#!.&!+('/,-)(./&'A!J3/%!%,(./()!$#)(./&'%3/,!/%!
$#4)#*.#2!/'!B),'$-*4A!H#.#$+/'/'0!(!*&$$#%,&'2/'0!
;&$2#$!/'!.3#!=/$.-()!>&$)2!/%!%&+#>3(.!+&$#!*&+,)#W!
('2!2#,#'2%!&'!%/.-(./=#!(**#%%!.&!/',-.!('2!&-.,-.!
2#=/*#%A!H-#!.&!.3#!'(.-$#!&4!.3#!(,,)/*(./&'!($#(!
.&>($2%!>3/*3!&-$!*-$$#'.!%<%.#+!2#=#)&,+#'.!#44&$.%!
($#!.($0#.#21!>#!3(=#!*3&%#'!.&!.#+,&$($/)<!%-%,#'2!
.3#!>&$F!&'!/'=#%./0(./'0!3&>!&;L#*.!+('/,-)(./&'!('2!
'(=/0(./&'!%3&-)2!;#!;#%.!+&2#))#2!/'!=/$.-()!
#'=/$&'+#'.%!9#A0A!&4!.3#!E8R"!F/'2:!.&!4/.!.3#!
%/.-(./=#!%,(*#!+&2#)A!5&>#=#$1!#W,#$/#'*#%!4$&+!(!
4/$%.!(..#+,.!^V`!3(%!*&'=/'*#2!-%!.3(.!/.!%3&-)2!;#!
,&%%/;)#A!
6'53?$)"#$@*A#=$-*.$2D5A53?$6F5GD5)*D'$H*<+D)53?$
:++=584)5*3$&#'5?3E$
E#!*&'%/2#$!.3#!;&$2#$%!&4!.3#!&;%#$=(;)#!%,(*#!.&!
2#4/'#!.3#!%#.!&4!&;L#*.%!.3(.!*('!,&%%/;)<!;#!,($.!&4!(!
-;/C-/.&-%!*&+,-./'0!?(,,)/*(./&'@!(.!('<!0/=#'!./+#!
4&$!(!%,#*/4/*!3-+('!(*.&$A!84!(!*&+,-./'0!%<%.#+!
2/%,)(<%!/'4&$+(./&'!&-.%/2#!.3#!&;%#$=(;)#!%,(*#1!/.!
S!
!
>/))!'&.!;#!'&./*#2A!I/F#>/%#1!/4!(**#%%!.&!(!2#%/$#2!
=/$.-()!&;L#*.!/%!,$&=/2#2!.3$&-03!('!/',-.!2#=/*#!
*-$$#'.)<!&-.%/2#!&4!.3#!+('/,-)(;)#!%,(*#1!.3#!3-+('!
(*.&$!/%!4&$*#2!.&!*3('0#!,3<%/*()!)&*(./&'A!B%!/.!
3(,,#'%1!.3/%!=/#>!()/0'%!>#))!>/.3!.3#!E8R"D2/$#*.!
+('/,-)(./&'!,($(2/0+!4&$!=/$.-()6>&$)2!/'.#$(*./&'!
>3#$#!%-**#%%4-)!(,,)/*(./&'!2#%/0'!(%!>#))!(%!-%#!=#$<!
+-*3!2#,#'2%!&'!F##,/'0!.3#!$/03.!&;L#*.%!?&'!%*$##'@!
(.!.3#!$/03.!./+#!^S`A!
B),'$-*6a!B'!(*./=/.<6(>($#!>#($(;)#!
*&+,-./'0!($*3/.#*.-$#!/'*)-2/'0!('!
K0&*#'.$/*!8'.#$(*./&'!R('(0#$!^M`A!
B),'$-*Da!B!"3<%/*()6d/$.-()!B$.#4(*.!
*&'%/%./'0!&4!4&-$!,3<%/*()!+('/4#%.(./&'%!
%3&>'!.&!.3#!)#4.!9#A0A!4&-$!,$/'.&-.%!&4!
.3/%!2&*-+#'.:!('2!4&-$!=/$.-()!
+('/4#%.(./&'%!%3&>'!.&!.3#!$/03.!9#A0A!
2/0/.()!=#$%/&'%!&4!.3/%!2&*-+#'.!/'!
2/44#$#'.!4&$+(.%!('2D&$!$#%/2/'0!(.!
2/44#$#'.!,)(*#%!/'!7<;#$%,(*#:!^V`A!
!"#$;?*8#3).58$73)#.48)5*3$@434?#.$
E/.3/'!.3#!*&'.#W.!&4!2#=#)&,/'0!('!(*./=/.<6(>($#!
0#'#$()6,-$,&%#!>#($(;)#!*&+,-./'0!($*3/.#*.-$#!
9B),'$-*6:1!>#!3(=#!%.($.#2!.&!2#=#)&,!(!*&+,-./'0!
*&+,&'#'.!>3/*3!>/))!3(=#!(%!(!.(%F!.&!F##,!'#*#%%($<!
/'.#$(*./&'!$#%&-$*#%!>/.3/'!.3#!+('/,-)(;)#!%,(*#!9/'!
.3#!*(%#!&4!/',-.:!('2!>/.3/'!.3#!&;%#$=(;)#!%,(*#!9/'!
.3#!*(%#!&4!=/%-()!&-.,-.:A!\#*(-%#!&4!)/+/.#2!(*.-(./&'!
,&%%/;/)/./#%!/'!.3#!,3<%/*()!>&$)21!.3#!*&+,&'#'.!>/))!(.!
./+#%!$#)<!&'!(%%/%.('*#!4$&+!.3#!3-+('!(*.&$1!%-*3!(%!
.&!%3/4.!,&%/./&'!.&!4(*#!(!=/%-()!2/%,)(<1!&$!.&!,/*F!-,!(!
2/%,)(<6#C-/,,#2!2#=/*#!4$&+!.3#!,&*F#.1!/'!.3#!*(%#!
>3#'!/+,&$.('.!=/%-()!/'4&$+(./&'!3(%!.&!;#!
*&++-'/*(.#2!.&!.3#!(*.&$A!
I">'584=1J5.)D4=$:.)#-48)'$
8'!.3#!G;/C-/.&-%!7&+,-./'0!)/.#$(.-$#1!/.!/%!&4.#'!
,$#2/*.#2!.3(.!&-$!#=#$<2(<!#'=/$&'+#'.%!>/))!;#*&+#!
+&$#!?/'.#))/0#'.@!/'!.3#!%#'%#!.3(.!('!/'*$#(%/'0!
(+&-'.!&4!(*./&'%!;#)&'0/'0!.&!%,#*/4/*!3/03#$6)#=#)!
,#$%&'()!3-+('!(*./=/./#%!>/))!;#!(-.&+(.#2!;<!
*&+,-.#$!%<%.#+%!.3(.!+&$#!&$!)#%%!4$#C-#'.)<!('2!
(-.&'&+&-%)<!/'.#$=#'#!.&!.3#!;#'#4/.!&4!.3#!3-+('!
,#$4&$+/'0!.3#!(*./=/.<A!c&$!.3#!,-$,&%#!&4!+&2#)/'0!
,$#6,$&0$(++#2!&$!#'26-%#$!,$&0$(++#2!*(-%()!
$#)(./&'%!;#.>##'!,3<%/*()!('2!=/$.-()!&;L#*.%!9%-*3!(%!
.3#!#44#*.!&4!+&=/'0!('!($$&>!&'!(!=/$.-()!2#%F.&,!(%!(!
*&'%#C-#'*#!&4!+&=/'0!('!#$0&'&+/*())<!%3(,#2!&;L#*.!
&'!(!,3<%/*()!2#%F.&,1!/A#A!.3#!#%%#'./()!4-'*./&'()/.<!&4!
(!*&+,-.#$!+&-%#:1!>#!3(=#!/'.$&2-*#2!.3#!*&'*#,.!&4!
"3<%/*()6d/$.-()!B$.#4(*.!9"dB:a!
:$+">'584=1K5.)D4=$4.)#-48)$5'$43$4F').48)$4.)#-48)$)"4)$
LMN$5'$<435-#')#A$53$F*)"$)"#$+">'584=$43A$)"#$K5.)D4=$
#3K5.*3<#3)O$P"#.#$LQN$)"#'#$<435-#')4)5*3'$)*$4$=4.?#$
#0)#3)$D)5=5'#$)"#$D35GD#$4--*.A438#'$43A$8*3').453)'$
L/*.<43O$MRSSN$)"4)$)"#$)P*$A5--#.#3)$#3K5.*3<#3)'$
-485=5)4)#O$43A$-534==>$LTN$P"#.#$*3#$<435-#')4)5*3$*-$4$
'+#85-58$+">'584=1K5.)D4=$4.)#-48)$5'$#4'5=>$5A#3)5-5#A$5-$4$
8*..#'+*3A53?$<435-#')4)5*3$53$)"#$*)"#.$#3K5.*3<#3)$
5'$(3*P3E$UMV$
J3/%!2#4/'/./&'!+(<!;#!$#)(W#2!(!;/.!.&!/'*)-2#!()%&!
($.#4(*.%!.3(.!($#!+('/4#%.#2!/'!L-%.!&'#!&4!.3#!.>&!
>&$)2%!9#A0A!(!3(++#$!/'!.3#!,3<%/*()!>&$)21!&$!(!>#;!
,(0#!/'!.3#!=/$.-()!>&$)2:1!(%!>#))!(%!($.#4(*.%!.3(.!3(=#!
+&$#!.3('!&'#!+('/4#%.(./&'!/'!&'#!>&$)2!9(%!
/))-%.$(.#2!/'!B),'$-*D:A!J3#!$#%-).1!>#!;#)/#=#1!/%!(!
0#'#$()/%(./&'!&4!>3(.!/%!.3#!'&./&'!&4!('!&;L#*.!&4!
/'.#$#%.!4&$!(!%,#*/4/*!3-+('!(*.&$!9%&+#./+#%!()%&!
$#4#$$#2!.&!(%!?2&+(/'!&;L#*.@:!.3(.!)#'2%!/.%#)4!>#))!.&!
+&2#))/'0!.3#!+(L&$/.<!&4!&;L#*.%!.3(.!(,,#($!/'!.3#!
%/.-(./=#!%,(*#!+&2#)!2-$/'0!.3#!*&-$%#!&4!('<!%,#*/4/*!
(*./=/.<A!
/%E.$&2*.*F2-$*!"#-$B.(-*8-2($)9#)%"*
?.",'.,-*
8'!.3#!)(%.!<#($%1!+('<!#44&$.%!3(=#!;##'!+(2#!.&!
2#=#)&,!eRI!)('0-(0#%!#W,$#%%/'0!/+,)#+#'.(./&'%!&4!
-%#$!/'.#$4(*#%!9G8%:A!E3/)#!%&+#!&4!.3#%#!3(=#!(!ES7!
&$/0/'!9#A0A!5JRIf!e5JRIf!Tdg1!#.*A:1!&.3#$%!($#!
,$&,&%#2!;<!=($/&-%!$#%#($*3!0$&-,%!.&!;#..#$!3('2)#!
_!
!
%/.-(./&'%!>3#'!.3#!.($0#.!2#=/*#%!($#!=#$<!
3#.#$&0#'#&-%1!*&'%/%./'0!&4!(!=($/#.<!&4!2#=/*#%!%-*3!
(%!2#%F.&,!*&+,-.#$%1!"HB%1!('2!*#))-)($!,3&'#%A!
h#=#$.3#)#%%1!+&%.!&4!.3#%#!G8HI%!($#!(;)#!.&!2#%*$/;#!
E8R"6)/F#!G8%!&')<A!
!!"#$%&'()*"#
###$$#
###!+,-*#.(,$*&'()*/#
######0123#4*5()4678696:;<=>?@66
666666666A(B*696:C(.(65"#68DBBE*-@6
6666666664F'*696:,""G@/6
#########!1HF-E)($I(AE5*-4(4E"A"#
6666666666660B(4*#E($/6'('*#60JB(4*#E($/6
6666666666660)".*#KF'*/6H(#%60J)".*#KF'*/6
6666666666660B(AE'D$(4E"AKF'*/6)(A6,*6"'*A*%6
6666666666666660JB(AE'D$(4E"AKF'*/6
############$$$#
#########!%1HF-E)($I(AE5*-4(4E"A"#
#########!2E#4D($I(AE5*-4(4E"A"#
66666666666605E$*KF'*/618L60J5E$*KF'*/6
66666666666605E$*M(B*/6
6666666666666663'"N*"6C(.(6;""GO'%56
6666666666660J5E$*M(B*/6
6666666666660(%%#*--/6PQR3A4"AE"R60J(%%#*--/6
############$$$#
#########!%2E#4D($I(AE5*-4(4E"A"#
######0J123#4*5()4/6
######$$#
######!I(AE'D$(,$*&'()*"#
#########$$#
#########0+,S*)4I(AE'D$(4E"A&'()*"#
############0123#4*5()4678696:3TUV?@66
666666666666666A(B*696:IF6+55E)*6I"AE4"#@6
6666666666666664F'*696:.E-D($6%E-'$(F@/6
###############!1HF-E)($I(AE5*-4(4E"A"#
6666666666666666660,#(A%/6PWK60J,#(A%/6
6666666666666666660%E-'$(FKF'*/6PXK60J%E-'$(F4F'*/6
6666666666666666660B(YX*-"$D4E"A/66
666666666666666666666TZU?Y<[\66
6666666666666666660JB(YX*-"$D4E"A/6
666666666666666666666]]]6
###############!%1HF-E)($I(AE5*-4(4E"A"#
############0J123#4*5()4/6
############0123#4*5()4678696:!>?V[@66
666666666666666A(B*696:^""N$*6I('-@6
6666666666666664F'*696:_*,6'(N*@/6
###############!2E#4D($I(AE5*-4(4E"A"#
6666666666666666660`Xa/66
666666666666666666666H44'QJJB('-ON""N$*OE4J66
6666666666666666660J`Xa/6
##################$$$#
###############!%2E#4D($I(AE5*-4(4E"A"#
############0J123#4*5()4/6
############$$#
#########0J+,S*)4I(AE'D$(4E"A&'()*"#
######!%I(AE'D$(,$*&'()*"#
###!+,-*#.($&'()*/#
!%!"#$%&'()*"#
\<!*&+;/'/'0!.3#!*&'*#,.!&4!"dB!>/.3!.3#!%/.-(./=#!
%,(*#!+&2#)1!('2!*$#(./'0!(!%-/.(;)#!4&$+()!)('0-(0#1!
>#!%3&-)2!;#!(;)#!.&!2#%*$/;#!.3#!,3<%/*()!('2!=/$.-()!
$#%&-$*#%!(*.-())<!-%#2!>/.3/'!.3#!*&-$%#!&4!('!(*./=/.<!
9;<!&;%#$=/'0!>3(.!($.#4(*.!+('/4#%.(./&'%!.3(.!;#*&+#!
+('/,-)(.#2!('2!&;%#$=#2:V1!(%!>#))!(%!.&!%,#*/4<!.3#!
$#C-/$#+#'.%!4&$!'#>!?-%#$!/'.#$4(*#%@!/'.#'2#2!.&!
%-,,&$.!#'=/%/&'#2!&$!#W/%./'0!,3<%/*()6=/$.-()!
(*./=/./#%1!/A#A!(*./=/./#%!$#C-/$/'0!4$#C-#'.!%>/.*3/'0!
;#.>##'!(*./&'%!/'!.3#!,3<%/*()!('2!/'!.3#!=/$.-()!
>&$)2%A!B!0&&2!$#,$#%#'.(./&'!&4!(=(/)(;)#!/'.#$(*./&'!
$#%&-$*#%!>&-)2!()%&!;#!.3#!;(%/%!4&$!.3#!&,#$(./&'!&4!
.3#!,$#=/&-%)<!+#'./&'#2!K0&*#'.$/*!8'.#$(*./&'!
R('(0#$A!B!4/$%.!%F#.*3!&4!%-*3!(!+($F-,!)('0-(0#!
$#,$#%#'.(./&'!/%!%3&>'!/'!B),'$-*GA!
E#!($#!*-$$#'.)<!>&$F/'0!&'!.3#!2#4/'/./&'!&4!.3#!
,$&,#$./#%!4&$!#(*3!"dB$.#4(*.!('2!&'!.3#!eRI!T*3#+(!
*&'.(/'/'0!.3#!$-)#%!.&!>3/*3!#(*3!($.#4(*.!.<,#!3(%!.&!
*&+,)<!.&A!h&.#!.3(.!.3#$#!/%!'&.3/'0!,$#=#'./'0!(!
"dB$.#4(*.!4$&+!3(=#!%#=#$()!,3<%/*()!('2D&$!=/$.-()!
+('/4#%.(./&'%!9(%!,/*.-$#2!/'!B),'$-*D:!/'!.3/%!
$#,$#%#'.(./&'A!
B),'$-*Ga!B!G8HI!#W(+,)#!.$('%*$/,.!
2#%*$/;/'0!(!%/.-(./&'!>3#$#!(!;&&Fb%!
($#!,$#%#'.!/'!.3#!&;%#$=(;)#!%,(*#1!
/%!/'%/2#!.3#!+('/,-)(;)#!%,(*#A!!!
G,!-'./)!'&>1!&')<!.3#!=/%-()!,$#%#'*#!&4!&;L#*.%!3(=#!
;##'!*&'%/2#$#2!/'!.3#!+('/,-)(;)#!('2!&;%#$=(;)#!
%,(*#%1!>3/)#!&;L#*.!+('/,-)(./&'!3(%!;##'!)/+/.#2!.&!
&;L#*.!0$(;;#2D'&.!0$(;;#2A!B,($.!4$&+!2#=#)&,/'0!.3#!
G8HI!)('0-(0#1!>#!($#!/'.#$#%.#2!/'!/'*$#(%/'0!.3#!
+&2#)/'0!,&>#$!/'!.3#!&;L#*.!+('/,-)(./&'!?%,(*#@!;<!
(22/'0!%&+#!'&./&'!&4!0#%.-$#%!>/.3!9('2!>/.3&-.j:!
0$(;;#2!&;L#*.%A!E#!($#!()%&!/'.#$#%.#2!/'!/'*$#(%/'0!
.3#!'-+;#$!&4!/'.#$(*./&'!+&2()/./#%!*&=#$#2!;<!.3#!
+&2#)!;<!%.($./'0!.&!*&'%/2#$!%&-'2!(%!(!*&+,)#+#'.!
.&!=/%/&'!4&$!2#.#$+/'/'0!.3#!;&$2#$%!&4!%,(*#%A!8'/./()!
.3#&$#./*()!/'=#%./0(./&'%!%3&>!.3(.!%&-'2!+&2#))/'0!
*&-)2!;#!2&'#!;<!(22/'0!(!*&+,)#+#'.($<!&;%#$=(;)#!
9&$!$(.3#$1!,#$*#/=(;)#:!%,(*#!('2!('!(22/./&'()!%&-'26
$#)(.#2!+('/,-)(;)#!%,(*#!.&!.3#!+&2#)A!!
M-B-$-"(-2*
!"#$ "#2#$%&'1!JA!W.*<$H*38#+)D4=$X53('$)*$H4D'4=$
Y#=4)5*3'$Z$I">'584=1J5.)D4=$:.)#-48)'$53$@50#A1Y#4=5)>$
C+48#A!"3H!.3#%/%1!G+#X!-'/=#$%/.<1!T>#2#'1!$#,&$.!
GR8hc6NSAV_1!8TTh!NS_P6N]_M1!8T\h!YV6OSN]6]]k6]A!
9MNNS:A!
!%#$ "#2#$%&'1!JA!l!T-$/#1!HA!J&>($2%!('!(*./=/.<6(>($#!
>#($(;)#!*&+,-./'0!,)(.4&$+!;(%#2!&'!('!#0&*#'.$/*!
/'.#$(*./&'!+&2#)A!8'!I.*8E$6HC$Q[[\1!Ih7T1!T,$/'0#$!
9MNNO:A!
!&#$ T3'#/2#$+('1!\A!J3#!4-.-$#!&4!/'.#$(*./=#!%<%.#+%!
('2!.3#!#+#$0#'*#!&4!2/$#*.!+('/,-)(./&'A!]#"4K5*D.$
43A$73-*.<4)5*3$!#8"3*=*?>1!V1!MSO6M]k!9VYPM:!
!'#$ T-*3+('1!IA!BA!I=43'$43A$C5)D4)#A$:8)5*3'A!
7(+;$/20#!G'/=#$%/.<!"$#%%!9VYPO:A!
;&.3!,3<%/*()!('2!=/$.-()!+('/4#%.(./&'%!
>3/)#!(!*&+,-.#$!2/%,)(<!('2!(!>#;!,(0#!
H'#'$-*I%$:C*+J#-"&)",*#0-*K%&-=*E)#0*
;%'"&*."&*L-2#'$-2*
V
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
! E#! 3(=#! /'! 4(*.! $#*#/=#2! ,$&+/%/'0! $#%-).%! ;<! -%/'0! .3#!
%/.-(./=#!%,(*#!+&2#)!4&$!.3#!,-$,&%#!&4!(*./=/.<!$#*&0'/./&'!/'!
(!F/.*3#'!#'=/$&'+#'.!%/+-)(.#2!/'!d/$.-()!i#()/.<!^]`A!
[5] Surie, D., Pederson, T., Lagriffoul, F., Janlert, L., Sjölie, D. Activity Recognition using an Egocentric Perspective of Everyday Objects. In Proc. UIC 2007, Springer (2007).
A Model of Interaction
Erik Stolterman
School of Informatics, Indiana University
901 E. 10th St., Bloomington, IN 47401 USA
estolter@indiana.edu

Youn-kyung Lim
Department of Industrial Design, Korea Advanced Institute of Science and Technology
KAIST, Gusung-dong, Yusung-gu, Daejeon, South Korea
younlim@gmail.com

Abstract
In this paper, we make the case that definitions and descriptions of interactions, in most cases, are not based on a detailed enough model of the primary entities of human-computer interaction. Instead of viewing interaction as simply the relationship between a person and an artifact, we suggest an expanded model. We argue that such a model is necessary if our attempt is to create an interaction description language that can cope with the full complexity of interaction without making it too simplistic.

Keywords
Interaction, user, interface, experience, model
ACM Classification Keywords
H.5.2 [User Interfaces]: Interaction styles, Theory and
methods.
Introduction
In the quest to define user interface description
languages (UIDL), we need to answer the basic
question of how to think about “interaction” as such. In
this position paper we will present a model of
interaction that builds on some of our earlier work [4].
The most frequent words used when describing
interaction are user, artifact, interface, and interaction.
It is clear that we are dealing with the fundamental
relationship between a person and an artifact (or a
group of people and a system of artifacts). This
fundamental relationship has been defined and
described in many different ways by researchers and
practitioners in the field.
In this paper we will make the case that these
definitions are, in most cases, not based on a careful
analysis of the meanings of the primary entities of
human-computer interaction. Instead of viewing
interaction as the direct relationship between a person
and an artifact, we suggest a model with more detail,
hopefully without leading to disturbing complexity.
A model of interaction
Before we introduce our model, let’s look at the model
that is pervasively recognized and accepted. In this
simple model there are basically only two parts, the user and the artifact: interaction is understood as the way a user interacts
with an artifact. In many cases this model is sufficient
and it gives us a simple and good understanding of
what is going on and how to think about users and
artifacts. But, with new innovative and advanced
interactive technologies, and with the growing
pervasiveness of digital technology in our
environments, this may not always be a sufficient
model.
In our earlier proposed model, we separate 'interaction' from the 'artifact' part and the 'user' part, and put it as a separate entity in-between those two entities.
In earlier papers, we have argued for an understanding
of interaction as its own distinct entity emerging
between a user and an interactive artifact [4]. We have
also argued that it is useful to understand any
interaction as a composition of qualities that creates a
unified whole, greater than the sum of its parts. We
have made the case that thinking about interaction in
this way invites designers to more concretely and
explicitly explore the possible interaction design space.
However, we believe that in order to support the notion
of creating a description language for interaction, this
model is still not detailed enough. The motivation of our
proposal in this paper is based on the fact that when
people try to describe interaction, it is not uncommon
that they end up mixing words and concepts that
describe distinctly different parts of the model. For
instance, it is very different to describe interaction as a
user experience or as a technical quality of the artifact.
Both descriptions are correct and both belong in an
overall interaction description language, but we argue
that it is crucial that they are related and valued in
regard to their respective part of the interaction model.
So, in order to make the model more useful we have expanded it. Its 'left' side, the user, divides into user experience and user behavior.
Such a division creates a difference between what a
user is experiencing as a result of the interaction and
what the user is actually doing. User experience as an
aspect of interaction has recently been examined and
described by many researchers [2, 5]. Less emphasis
has been devoted to the notion of user behavior,
except in relation to usability studies. The user
behavior is determined by the space of possible actions
defined by the artifact itself. The user experience of an
interaction is influenced by the behavior the artifact
requires or inspires, but is not determined by it. Two
users can behave in exactly the same way while having
distinctly different experiences.
In a similar way it is possible to expand the right side of our initial model, dividing the artifact into the artifact interface and the artifact function.
This division distinguishes the two different aspects of
the artifact, which create the space of possible actions
for interaction. It is clear that many of the traditional
styles of interaction are about the part we label the
artifact interface. The artifact function is about purpose,
function, and performance, all influencing the design of
an interaction. However they do not influence the
interaction in the same direct way as the artifact
interface. In interaction design, it is for instance not
unusual to make prototypes that have partial
functionality, poor performance, while the interface is
fully developed since it is the interface that is being
tested.
Using the model
The purpose of our model is to serve as a foundation for different attempts aimed at developing description languages for human-computer interaction.
Even though there is a need for an overall description
language of interaction, each part of the model could
be the focus of a more specific description language. In our earlier work [4] we have, for instance, developed a "language" that describes the shape of an interaction, which is the concept in the middle of the model (that we have not discussed here).
In this earlier work [4], we have developed a number of attributes that are unique to the shape of an interaction and that distinguish it from the other parts of the model. For instance, with interaction attributes such as state (fixed-to-changing), continuity (discrete-to-continuous), connectivity (independent-to-networked), and directness (direct-to-indirect), it is possible to describe the shape of an interaction even though it is a quite abstract entity.
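As a rough illustration of how such attributes could be written down in an XML-based description language (our own sketch, not the notation of [4]; all element and attribute names are hypothetical):

    <interaction id="map-panning">
      <!-- Hypothetical encoding of the interaction-shape attributes from [4];
           each value is a position on the named continuum. -->
      <shape>
        <state value="changing"/>         <!-- fixed-to-changing -->
        <continuity value="continuous"/>  <!-- discrete-to-continuous -->
        <connectivity value="networked"/> <!-- independent-to-networked -->
        <directness value="direct"/>      <!-- direct-to-indirect -->
      </shape>
    </interaction>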
We believe that many attempts in HCI have been devoted to developing languages for different parts of the
model. For instance, McCarthy & Wright made a serious
attempt to formulate aspects of the user experience
[5]. Forlizzi & Battarbee also proposed a framework for
defining experience [2].
In a previous workshop to this one (at CHI 2006 [3]) several attempts were made that can be seen as
examples of a language describing the artifact
interface, such as virtual and augmented reality,
ubiquitous, pervasive, and handheld interaction,
tangible user interfaces, lightweight, tacit, passive, or
non-command interaction, perceptual interfaces,
context-aware interfaces, ambient interfaces, embodied
interfaces, sensing interfaces, eye-movement-based
interaction, and speech and multimodal interfaces.
In usability research, user behavior has been studied
and some attempts to form a language of user behavior
exist [1, 6].
Even though all these attempts have served their purpose to some degree, we would argue that these attempts, if related to a model like ours that is inclusive of the various parts of the whole interaction, could lead to a
more comprehensive description of interaction. It would
make it possible to relate different aspects of
interaction to each other, and it could open up for new
levels of analysis and understanding of interaction.
The model also makes it possible to even further focus on the details of just one part. For instance, the artifact interface can be divided into sub-areas, and for each of those it would be possible to further develop a precise and detailed description language without the risk of being accused of losing the big picture. This means that detailed sub-languages can be developed, sensitive to fast technological developments or radical changes in use behavior. The overall model shown below can still be the same and function as a foundation and coherent core for such developments.

The overall model can also function as a bridge between different parts. The model makes it possible to analyze how changes in one part (for instance, the artifact interface) would or can affect another (such as the user behavior), or even, with more distance, how the performance of the functions in the artifact relates to the user experience of the interaction. It is, according to the model, impossible to discuss such relationships without discussing how the parts that exist in-between those two parts, namely the artifact interface, the interaction, and the user behavior, influence the relationship of those two. It is in this case clear that these "effects" have to "travel" through the artifact interface, the interaction, and the user behavior to "reach" the user experience.
Conclusion
It is clear that interaction is a complicated thing. The
overall argument in this paper is that, even though real
progress has been made in the examination and
description of interaction, we need an overall model of
interaction. We need a model that is simple, while at
the same time detailed enough to make various aspects
of the interaction between humans and artifacts visible
and distinct. We need such a model in order to further develop description languages that can support further analysis and design of interaction. We believe
our proposed model is a first step in that direction.
Acknowledgement
We thank our colleagues and students involved in
earlier phases of this work.
References
[1] Card, S. K., Moran, T. P., and Newell, A. The
Psychology of Human-Computer Interaction. Lawrence
Erlbaum Associates, Hillsdale, NJ, USA, 1983.
[2] Forlizzi, J. and Battarbee, K. Understanding
experience in interactive systems. Proc. of DIS 2004,
ACM Press (2004), 261-268.
[3] Jacob, R., Girouard, A., Hirshfield, L. M., Horn, M., Shaer, O., Solovey, E. T., and Zigelbaum, J. CHI 2006: what is the next generation of human-computer interaction? interactions 14, 3 (May 2007), 53-58.
[4] Lim, Y., Stolterman, E., Jung, H., and Donaldson, J. Interaction Gestalt and the Design of Aesthetic Interactions. Proc. of DPPI 2007, ACM Press (2007), 239-254.
[5] McCarthy, J. and Wright, P. Technology as
Experience. MIT Press, Cambridge, MA, USA, 2004.
[6] Norman, D. A. The Psychology of Everyday Things. Basic Books, New York, NY, USA, 1988.
B. Requirements and Considerations for Future UIDLs
Alexander Behring, Andreas Petter, Felix
Flentge, Max Mühlhäuser
TU Darmstadt
Towards Multi-Level Dialogue Refinement for User Interfaces
Alexandre Demeure, Gaelle Calvary
University of Grenoble
Requirements and Models for Next Generation UI Languages
Jeff Dicker, Bill Cowan
University of Waterloo
Platforms for Interface Evolution
Michael Horn
Tufts University
Passive Tangibles and Considerations for User Interface Description
Languages
Towards Multi-Level Dialogue
Refinement for User Interfaces
Alexander Behring, Andreas Petter, Felix Flentge, Max Mühlhäuser
TU Darmstadt, Informatik, FG Telecooperation
Hochschulstr. 10, 64289 Darmstadt
behring@tk.informatik.tu-darmstadt.de, a_petter@tk.informatik.tu-darmstadt.de, felix@tk.informatik.tu-darmstadt.de, max@tk.informatik.tu-darmstadt.de

Abstract
In this paper, we present our observations regarding modeling multimodal, context-sensitive user interfaces. Based on these observations, we argue to use the "Dialogue Refinement" approach to conjointly describe behavior and layout of user interfaces.

Keywords
UI Models, Refining User Interfaces, Dialogue, User Interface Description Language (UIDL)

ACM Classification Keywords
D.2.2 [Software Engineering]: Design Tools and Techniques---User interfaces; H.1.m [Models and Principles]: Miscellaneous; H.5.2 [Information Interfaces and Presentation (e.g., HCI)]: User Interfaces---Theory and methods

Introduction
With the increasing number of interaction devices and their everyday presence, research to facilitate the
development of ubiquitous computing applications is
gaining more and more interest. Support for building
User Interfaces (UIs) for such applications will have to
address the diversity of interaction modalities. This
comprises the multitude of devices that are available
for interaction, as well as their corresponding
interaction strategies and styles. As interaction devices are increasingly portable, context dependence of UIs is another important aspect.

(Interaction Modality: the interaction style, strategy and device used.)
In this paper, we focus on these two challenges for UI description languages (UIDLs): the increasing number of possible modalities and the adaptation to various contexts.
Current approaches
Tools supporting the development of UIs have been thoroughly analyzed in [1]. The authors conclude that
conventional GUI approaches are ill-suited for
upcoming UIDL challenges. They see model-based
approaches as a promising alternative.
AUI – Abstract User
Interface (abstract
presentation)
CUI – Concrete User
Interface (concrete
presentation)
Frameworks for declarative model-based UIs and
development environments exist, such as [3] with a
focus on GUIs and [4] for plastic (highly adaptive) UIs.
They share an understanding of a set of core models:
conceptual model, task or dialogue model, abstract
presentation (AUI) and concrete presentation (CUI)
model.
In more recent approaches, these models are used to
develop multimodal UIs. Teresa [2] is based on
ConcurTaskTrees (CTT), a task description language.
CTT models are first transformed into AUI models and
then further into CUI models. Platform specific UIs can
be produced by filtering a “master” task model in which
the elements are annotated with supported platforms.
Further extensions to CTT allow context-dependent sub-task models.
UsiXML [5], using an extended version of CTT, focuses
on the integration of a great variety of different target
platforms. It therefore supports a great set of interactor
types at CUI level. While the Teresa tool and models
are centered on the task model, UsiXML focuses on the
AUI and CUI models.
Observations
In the following, we describe observations made with
current UIDLs and draw conclusions from these. We
then propose an approach called “Dialogue Refinement”
to address the issues that have been identified.
Levels of Refinement
In recent approaches, commonly two levels of UI
Refinement exist: an AUI model is refined to a CUI
model. Context-dependence of these UI models is
mainly achieved via filtering annotated models or via
context-dependent task-models.
Model Driven Architecture (MDA) proposes a similar approach: to refine abstract models into more concrete ones, but with no restrictions on the number of refinement steps [6]. Using this idea, a UI for a specific situation can be refined via multiple levels. Hereby, a situation especially includes, besides other context information, the modality used and the target platform.
Allowing an arbitrary number of refinement levels helps the developer to "provide the information at the right level of abstraction", as advocated in [6]. In the light of increasingly complex challenges when modeling UIs, this becomes more and more important (e.g., as depicted in Figure 1).
→ We conclude to allow an arbitrary number of refinement levels for UI development.

Figure 1: An exemplary tree for AUI Refinement. The abstract "Root UI" is refined stepwise over multiple levels for different situations.
Conjoint Refinement of Behavior and Layout
We observed that behavioral aspects in situation
specific UIs are not formulated in a conjoint way
together with the UI layout. Task and AUI model can
convey behavior information, but behavioral and layout
information is not given at the same level of detail and
abstraction.
Dialogue Refinement
includes layout and
behaviour information.
The importance of an integrated formulation can be
illustrated by comparing a voice to a direct
manipulation (DM) graphical modality. The voice
modality heavily depends on the behavioral (temporal)
aspect, whereas the DM GUI heavily depends on the
layout. When trying to integrate the two modalities, both dimensions have to be taken into account at the same level. Otherwise, a gap between DM (layout-driven) and dialogue (behavior-driven) interfaces opens up.

→ We conclude to conjointly refine UI layout and behavior.

Flexibility of Interactors and Interaction Concepts
To hardcode interactors and interaction concepts into metamodels is common among current approaches. As the metamodel is the central interface, it therewith gets harder to extend the set of supported interactors and interaction concepts for a given approach. Furthermore, Myers et al. [1] state that the currently hard-to-extend interactor libraries of tools are limiting the developer.

→ We conclude not to hardcode interactors and interaction concepts into the metamodel.
Addressing Platform Specifics
Current approaches like UsiXML and Teresa focus on common aspects of UI descriptions for different modalities rather than addressing the details of a specific modality. As a consequence, the developer cannot control the low-level pragmatics of the interaction's look and feel. This contradicts the conclusion by Myers et al. [1] that control over such pragmatics is important for the developer.
→ We conclude to allow the developer to model platform specific aspects.
Dialogue Refinement
We propose to address the conclusions described above
by using a graph (Dialog Refinement Graph) for UI
modeling. UI models are nodes in such a graph and can
be refined to other models (nodes). There is no limit
imposed on the depth of the tree. The refinement can
be driven by situation changes, which includes changes
in platform and modality (i.e. context-dependence).
UI models contain integrated information about layout
(e.g., interactors) and behavior (e.g., state charts).
Finally, UI interactor types are not hardcoded in the
metamodel, but contained in libraries. Additionally, we
advocate accompanying type information with
ontologies, describing the relations between different
types. This can be exploited to support automatic
generation of situation specific UIs. Furthermore, we
plan to investigate how to describe small variations
atop a modeled UI (“UI Nuances”) that can be reused.
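As a loose sketch of these ideas (ours, not EMODE's concrete syntax; all names are hypothetical), a node in a Dialogue Refinement Graph could bundle layout and behavior and carry situation-labelled edges to the models that refine it:

    <!-- Hypothetical Dialogue Refinement Graph node: layout and behavior are
         described conjointly; refinement edges name the situation (modality,
         platform, other context) for which a more concrete model applies.
         Interactor types are referenced from a library, not the metamodel. -->
    <uiModel id="root-ui">
      <layout>
        <interactor type="lib:choice" id="selectRecipient"/>
      </layout>
      <behavior>
        <stateChart initial="idle">
          <state name="idle">
            <transition event="selectRecipient.chosen" target="confirm"/>
          </state>
          <state name="confirm"/>
        </stateChart>
      </behavior>
      <refinedBy model="voice-cui" situation="modality=voice"/>
      <refinedBy model="phone-gui-cui" situation="modality=graphical; platform=phone"/>
    </uiModel>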
In EMODE [7], we developed the idea of using a multi-level refinement graph, called "AUI Refinement" (cf. Figure 1), and integrated it into the UI editor. The node
at the top of the tree contains the most abstract UI
description, serving as the connection to other models
(e.g., application flow). This “Root AUI” is refined
further for different situations, as depicted in the figure.
Further, an interactor ontology and transformations
were produced, supporting the developer in refining
UIs. Currently, we are extending these results of
EMODE with the behavioral aspect, as we discovered its
necessity.
Conclusion
In this paper, we introduced the idea of Dialogue
Refinement, based on our preliminary work. The
presented approach supports refinement over an
arbitrary number of abstraction levels, while allowing
the developer to still address platform specifics. Finally,
it supports classification of interactors with a supporting
ontology and conjointly refines UI layout and behavior.
Acknowledgements
We acknowledge funding of this work by the BMBF in
context of the SoKNOS and ITEA EMODE projects and
thank our colleagues for their valuable contributions.
References
[1] Myers, B., Hudson, S.E., and Pausch, R. Past,
Present, and Future of User Interface Software Tools.
ACM Transactions on Computer-Human Interaction 7, 1
(2000), 3-28.
[2] Mori, G., Paternò, F., and Santoro, C. Design and
Development of Multidevice User Interfaces through
Multiple Logical Descriptions. IEEE Trans. Softw. Eng.
30, 8 (2004), 507-520.
[3] da Silva, P.P. User Interface Declarative Models and
Development Environments: A Survey. Lecture Notes in
Computer Science 1946 (2000), 207-226.
[4] Calvary, G., Coutaz, J., Thevenin, D., Limbourg, Q.,
Souchon, N., Bouillon, L., Florins, M., and
Vanderdonckt, J. Plasticity of User Interfaces: A
Revised Reference Framework. Proc. TAMODIA '02,
INFOREC Publishing (2002), 127-134.
[5] Limbourg, Q., Vanderdonckt, J., Michotte, B.,
Bouillon, L., and Jaquero, V.L. USIXML: A Language
Supporting Multi-path Development of User Interfaces.
EHCI/DS-VIS (2004), 200-220.
[6] Koch, T., Uhl, A., and Weise, D. Model Driven
Architecture. Interactive Objects Software (2001).
Available online at
ftp://ftp.omg.org/pub/docs/ormsc/02-09-04.pdf.
[7] Dargie, W., Strunk, A., Winkler, M., Mrohs, B., Thakar, S., and Enkelmann, W. EMODE – a Model-Based Approach to Develop Adaptive Multimodal
Interactive Systems. Proc. ICSOFT 2007, INSTICC
Press (2007).
Requirements and models for next
generation UI languages
Alexandre Demeure
University of Grenoble, LIG
385 rue de la bibliothèque, B.P. 53
38041 Grenoble Cedex 9, France
Alexandre.Demeure@imag.fr

Gaelle Calvary
University of Grenoble, LIG
385 rue de la bibliothèque, B.P. 53
38041 Grenoble Cedex 9, France
Gaelle.Calvary@imag.fr

Abstract
In this paper we explain why concrete User Interface (UI) languages are too high-level to be flexible enough for adaptation. Moreover, we note that the quality of UIs produced by human designers is far beyond the quality of automatically generated UIs. Therefore, we propose to classify human-designed UIs in a knowledge base inspired by Service Oriented Approaches. This base aims to allow designers, or even automatic UI generation algorithms, to retrieve UI descriptions both at run time and design time.
Keywords
Plastic User Interface (UI), UI Adaptation, Model Driven
Engineering, Service Oriented Architecture, UI
Description Language, Tailored UIs.
ACM Classification Keywords
H5.2. Information interfaces and presentation (e.g.,
HCI): User Interfaces – User Interface Management
Systems (UIMS).
Introduction
With the rise of ubiquitous computing, a lot of work has been done on User Interface (UI) Description Languages (UIDL). The main goal was to give designers the opportunity to describe the UI at a high level of abstraction and then automatically generate the "real" UI for different platforms. The CAMELEON framework [4] makes explicit the relevant levels of abstraction at which a UI can be described: concepts and tasks (C&T), Abstract UI (AUI), Concrete UI (CUI) and Final UI (FUI).
The problem is that generation is done using WIMP languages or WIMP toolkits (Swing, XHTML, etc.), which results in poor-quality UIs and excludes any non-WIMP interaction style. The main reason for this state of affairs is probably that WIMP toolkits are the de facto standard.
The right model at the right place
While task languages are well formalized, providing neither more nor less information than needed, the same cannot be said about current CUI languages. One of the problems with these languages is that they enumerate "classical" WIMP widgets. These widgets implicitly combine several levels of abstraction. For example, a button can be described at the CUI level (a clickable box with a text inside) as well as at the task level (it supports the "activation" task). This state of affairs dates from the time when no task models were used to model the UI. This is no longer the case, and it leads to two main drawbacks: each widget is associated with a particular task and, symmetrically, each task only has a finite and frozen set of possible presentations. This association is the result of years of practice. It is valuable within the WIMP assumption (one user, one screen, one keyboard, one mouse, etc.) but may be inadequate in other contexts. The same remark can be made about how a widget is rendered and how it is possible to interact with it. The way to interact with a widget is most of the time hard-coded in the widget itself, disabling the possibility for the user to use other interaction paradigms such as gesture or vocal recognition.
MDE approaches [3] show us that each level of abstraction should be described using an appropriate language. These languages should only describe information related to their abstraction level (C&T, AUI, CUI, FUI) and should not introduce artificial dependencies with other abstraction levels. Instead, mappings should be used to express relationships between descriptions at different levels of abstraction. In practice, these remarks apply to CUI languages and toolkits. However, some works separate rendering from interaction [1],[2] and tend to overcome classical widgets [4]. The trend seems to be to propose, at least for GUIs, sets of drawing primitives organized in a scene graph; "widgets" are then reconstructed as assemblies of primitive nodes [7],[4]. These approaches allow designers to produce nice-looking UIs more easily and avoid closing off the possible presentations of a given task. Other approaches [8] tend to propose arbitrary abstract widgets that can be mapped on the fly to concrete and possibly non-standard ones, thus enhancing the diversity of presentations.
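As a minimal sketch of the scene-graph idea (our illustration, loosely in the spirit of toolkits such as Ubit [7]; the element names are invented), a "button" stops being an opaque widget and becomes an assembly of primitive nodes that a designer can re-assemble differently for another presentation of the same task:

    <!-- Hypothetical scene graph: a "button" reconstructed from drawing
         and behavior primitives instead of being a fixed widget. -->
    <scene>
      <group id="playButton">
        <box width="80" height="24" corner="rounded" fill="#cccccc"/>
        <text x="12" y="16">Play</text>
        <behavior on="press" task="activation(player.play)"/>
      </group>
    </scene>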
Generated versus tailored UIs
If we consider quality of use, tailored UIs (UIs designed by a human) are far ahead of what a program can automatically generate (Figure 1). This can be explained by the fact that, for now, computers know almost nothing about notions such as "beauty". Figure 1-B shows some WinAMP skins; we can see that, compared to Figure 1-A, many graphical artefacts are added that do not correspond to any functional requirement of an audio player (light effects, rounded forms, button layouts, etc.). These artefacts are produced by human designers to improve the quality of use and are, for now, impossible to generate automatically.
This leads us to the following conclusion: "if we cannot do it by ourselves and there is no hope to do it in the near future, why not use human pre-built, tailored UIs?". In SOA, a service can be achieved by a combination of smaller services provided by other people. By symmetry, the UI of a "big" system could be composed of smaller UIs tailored by designers for a particular context. Integrating these tailored UIs automatically would provide great benefits, provided that a processable description of them is available.
Figure 1. A) UI of an audio player generated with the SUPPLE system [6]. B) Two UIs of the WinAMP audio player.
The problem is now to know where these tailored UIs
can be found.
Capitalizing knowledge
Capitalizing and giving access to services is a key point of SOA. To achieve it, SOA uses service brokers (Figure 2). In the same way, capitalizing and giving access to UI descriptions or implementations should be a key point for UI generation (automatic, semi-automatic or manual).

Figure 2. Global view of SOA: a Service Provider 1) registers a service with the service broker; a Client 2) asks for a service by giving a description of it, 3) obtains access to the service, and 4) uses the service.
The difference is that service descriptions are mainly functional, while we need UI descriptions. Some works, like [5], explore the possibility of using a semantic network to classify and retrieve UI models or implementations both at design time and run time. Each node corresponds to the model of an interactive system. This model can be given at any level of abstraction (C&T, AUI, CUI, FUI). Each edge of the network corresponds to a relationship that can be established between two models. Classical relationships are inheritance, specialisation, extension, restriction, composition, etc. The structure of this semantic network provides support to solve some "plasticity questions" such as: "Is there a UI pre-computed for this task that is tailored for this platform or, more generally speaking, for this context of use?". The question can be translated in terms of a logical path in the graph starting from the node describing the task.
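As a rough sketch of what such a network could look like once serialized (our illustration, not the actual format of [5]; all names are hypothetical):

    <!-- Hypothetical semantic network of UI models: nodes are models at some
         level of abstraction; edges are typed relationships between models. -->
    <uiKnowledgeBase>
      <node id="select-track" level="C&amp;T"/>
      <node id="select-track-list" level="CUI" platform="desktop"/>
      <node id="select-track-dial" level="CUI" platform="phone"/>
      <edge type="extension" from="select-track-list" to="select-track"/>
      <edge type="specialisation" from="select-track-dial" to="select-track-list"/>
    </uiKnowledgeBase>

Under these assumptions, the plasticity question above amounts to asking whether a path exists from the node describing the task to a CUI node whose platform attribute matches the current context of use.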
Conclusion
MDE for HCI clearly separates the UI models depending on their level of abstraction. While task languages are well formalized and general enough, CUI languages are mostly WIMP-oriented. We need CUI languages adapted to each modality. At the least, I think it is possible to define a GUI language that enables post-WIMP interaction. Such a language should describe things like geometry, colors, textures, behavior, rich inputs, etc. Generating such descriptions automatically may be difficult. Moreover, automatically generated UIs are of poor quality in general. That is why we should take tailored UIs into account by capitalizing them in a structure (e.g. a semantic network) that allows retrieving them both at design time and at runtime.
Bibliography
[1] Appert, C. and Beaudouin-Lafon, M. 2006.
SwingStates: adding state machines to the swing
toolkit. In Proceedings of the 19th Annual ACM
Symposium on User interface Software and Technology.
UIST '06. ACM Press, New York, NY, 319-322.
[2] Blanch, R. and Beaudouin-Lafon, M. 2006.
Programming rich interactions using the hierarchical
state machine toolkit. In Proceedings of the Working
Conference on Advanced Visual interfaces. AVI '06.
ACM Press, New York, NY, 51-58.
[3] Calvary G., Coutaz J., Thevenin D., Limbourg Q.,
Bouillon L., Vanderdonckt J., A Unifying Reference
Framework for Multi-Target User Interfaces, Interacting
With Computers, Vol. 15/3, pp 289-308, 2003.
[4] Chatty, S., Sire, S., Vinot, J., Lecoanet, P., Lemort,
A., and Mertz, C. 2004. Revisiting visual interface
programming: creating GUI tools for designers and
programmers. In Proceedings of the 17th Annual ACM
Symposium on User interface Software and Technology.
UIST '04. ACM Press, New York, NY, 267-276.
[5] Demeure A., Calvary G., Coutaz J., Vanderdonckt
J., The COMETs Inspector: Towards Run Time Plasticity
Control Based on a Semantic Network, TAMODIA'2006.
[6] Gajos, K. and Weld, D. S. 2005. Preference
elicitation for interface optimization. In Proceedings of
the 18th Annual ACM Symposium on User interface
Software and Technology. UIST '05. ACM Press, New
York, NY, 173-182.
[7] Lecolinet E., A Brick Construction Game Model for
Creating Graphical User Interfaces: The Ubit Toolkit. In
Proc. INTERACT'99, 1999
[8] UIML. http://www.uiml.org.
Platforms for Interface Evolution
Jeff Dicker
David Cheriton School of Computer Science, University of Waterloo
200 University Avenue West, Waterloo, ON, Canada, N2L 3G1
jadicker@cgl.uwaterloo.ca

Bill Cowan
David Cheriton School of Computer Science, University of Waterloo
200 University Avenue West, Waterloo, ON, Canada, N2L 3G1
wmcowan@cgl.uwaterloo.ca

Abstract
User interface design languages need to support interface evolution: for designers who must provide updated versions, to be sure, and for users who want to adapt interfaces to their work habits. At present, despite several decades of advance in HCI, support for user-controlled interface evolution remains rudimentary, little more than cosmetic. Why? We argue that the answer lies not with design languages themselves, but that they are severely limited in their expressivity by the event-driven architectures they assume. Events must be replaced by a more richly expressive abstraction, a prototype of which is described in this position paper.

Keywords
UIMS, interface evolution, interface platform, adaptation, customization

ACM Classification Keywords
H.5.2. User interface management systems (UIMS).

Introduction
Recent HCI research has produced modern systems with many
novel interface techniques, including: pen and sketch-based
interfaces, efficient input on handheld devices, multi-touch
displays, and gestured input. However, despite recent research
that could support customization and adaptation, modern
systems make very little and very poor use of these techniques.
The facilitation of these two goals will create a paradigm of interface evolution: UIs will be shipped to fit all, but will be tailored by the user to suit the user's individual preferences and habits. Why, then, do modern interfaces only scrape the
surface of features that support evolving interfaces?
Where the Current Paradigm Fails
If current implementations are not the state of the art with
respect to HCI research, why are they the state of the art in
modern systems?
Allowing a user to customize their desktop environment is
something that should make them more comfortable with their
computer and more efficient at using it. So, naturally, desktop
customizations have existed since the beginning of the desktop
paradigm. For example, a user can customize the desktop
background, transpose interface elements (including moving
menu items to a toolbar), and toggle the availability of features.
Adaptation has a shorter history, but it has been quite well studied in HCI research. Despite this, it is still applied very poorly (if at all!) in modern systems. One example
is in Microsoft's Office XP: menu items that are not frequently
used are hidden from the user, causing frustration and less
efficient usage [2]. As far back as 1995, Negroponte proposed
a “digital butler” as an interface[1]. The digital butler in Office
XP manifested itself as a paper clip that asks if you would like
tips on writing letters. Why can't the paper clip tell you, “Keep
working, I'm pretty certain your 3 new emails are all spam.”?
These modern evolution features are merely cosmetic, utilizing
little of the power of interface evolution. Imagine being given
two common tools in a word processor: cut and paste. A user
may find that a common pattern is to cut, then paste
immediately to replace the cut text, then paste elsewhere. This
is usually known as a copy command, but assume, for the
purposes of this example, that no copy command exists in the
shipped interface. A truly powerful paradigm of interface
evolution should allow “cut, then paste” to be formed into a new
copy command. Furthermore, the user should be able to
replace the cut button that is in a toolbar in the unmodified version of the word processor with a new button linked to the copy command, and also define a hotkey. This new, evolved version of the word processor's interface now has copy, cut, and paste, instead of just cut and paste.
Why the Current Paradigm Fails
The best modern systems use event-driven GUI toolkits to satisfy the model-view-controller architecture. This platform is not adequate. Using event-driven widget toolkits was a good
starting point for separating model from view, but fails to provide
an adequate platform for the next generation of user interfaces,
because it does not afford enough power to the view layer. The
glue between model and view in most modern toolkits is an
event system, and event systems are simply callbacks. The
reason that it is so difficult to incorporate useful interface
evolution in current GUIs can be attributed to this: the
functionality of an application has traditionally been exposed
only through allowing a “stupid” view layer to make a specified
function call per each widget event.
Tying a view directly into an application through callbacks
removes the separation between view and model and places
the real interface onus entirely onto the application via event
handlers. Because of this, the only way that a view can truly
evolve is through evolution in the model. This practice of giving
the user a view layer with no knowledge of the application
leaves the user with no tools to evolve a view. Thus, the poor
evolution that does exist in modern systems exists through
functionality that has already been finalized when the
application was programmed.
Facilitating Interface Evolution
The solution to the problem of the “stupid” view layer is to have
a third party manage the calls between view and model. This
third party is called the Interface Manager. If an Interface
Manager is given a set of commands that it can perform on an
application and allows the view layer to make calls to those
commands, the problems associated with implementing
interface evolution techniques go away.
The Interface Manager presents an intelligent view layer to the
user. This view layer does not understand anything about an
underlying application, thus keeping model separate from view,
but it does understand how to call commands in an application.
Past software packages have attempted to create similar
models, but they failed to keep the layer between application
and Interface Manager thick enough. The application will not be
allowed to have event handlers, and neither will the Interface
Manager. The event handlers will be commands sent to
applications in accordance with default viewing specifications
that the user can modify. Conceptually, the Interface Manager
would allow a view to evolve separately from any application.
With the power of this paradigm, a user can evolve an interface
in a way that is entirely independent of applications. This is the
advancement that is required to facilitate interface evolution.
Imagine again the cut and paste scenario. The Interface Manager can support the user in specifying a new command that is a combination of two commands the Interface Manager already knows about, because they are in APIs exported by the word processor. Combining the commands and changing menu items now becomes a visual programming problem, not an architecture problem.
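As a loose sketch of what such a user specification might look like in declarative form (entirely our illustration; the paper does not commit to a concrete syntax, and all names are hypothetical):

    <!-- Hypothetical declaration given to the Interface Manager: a new "copy"
         command composed from two commands exported by the application,
         plus the view changes that expose it. -->
    <command id="copy">
      <sequence>
        <invoke command="wordProcessor.cut"/>
        <invoke command="wordProcessor.paste"/> <!-- paste back in place -->
      </sequence>
    </command>
    <view>
      <toolbarButton command="copy" replaces="wordProcessor.cut"/>
      <hotkey command="copy" keys="Ctrl+C"/>
    </view>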
For the Interface Manager to fully succeed, it needs to be
system-wide. For customization, this would allow a user to call,
combine and compose commands from any applications on the
system. For adaptation, this would allow the Interface Manager
to spread an adaptation in one application to others in the
interface.
Back to the Future
Readers who have enough grey in their hair will recognize in the
Interface Manager an old concept, the User Interface
Management System (UIMS). UIMSs died of success, their best features appropriated by UI toolkits, from Visual Basic to Qt [4]. But, in addition to its widget set, a UIMS also provided a
higher level glue layer than simple events and callbacks. Our
proposal enriches the functionality of that layer, giving it
abstractions that make it programmable even by the end user.
In addition, because a UIMS exports the same concepts to all
applications and their interfaces, adaptations can spread from
one interface/application to another, even, if desired, without
user intervention.
References
[1] Negroponte, N. Being Digital. Random House (1995), 149–159.
[2] McGrenere, J., Baecker, R. M., and Booth, K. S. An evaluation of a multiple interface design solution for bloated software. In Proc. CHI '02, ACM Press (2002), 164–170.
[3] Gajos, K. Z., Wobbrock, J. O., and Weld, D. S. Automatically generating user interfaces adapted to users' motor and vision capabilities. In Proc. UIST '07, ACM Press (2007), 231-240.
[4] Hudson, S. Personal communication at Graphics Interface 2007.
Passive Tangibles and Considerations for User Interface Description Languages

Michael S. Horn
Tufts University
Department of Computer Science
161 College Ave., Medford, MA 02155, USA
michael.horn@tufts.edu

Abstract
This position paper outlines some challenges facing the development of user interface description languages (UIDL) for emerging post-WIMP interaction styles. In particular, it proposes the term passive tangible interfaces for a new subset of tangible interfaces that feature passive physical tokens and a non-continuous link to a digital system. It then describes difficulties in modeling passive tangibles using common UIDL techniques.

Keywords
TUIs, Passive Tangible Interfaces, User Interface Description Languages

ACM Classification Keywords
H5.m Information interfaces and presentation (e.g., HCI): User Interfaces.

Introduction
There is little that is standard about emerging post-WIMP interaction styles. Designers and researchers, in an effort to create the previously unimagined, have struck out in a thousand different directions and truly pioneered a new generation of human-computer interaction. This creativity is both exhilarating and important for the HCI community, but it may be many years before post-WIMP technology and interaction techniques settle into standardized systems that are widely used beyond research laboratories.

As post-WIMP interfaces mature, standardization will become increasingly important. This includes standardization of technology, of interaction convention, and of the tools available to interface designers. In particular, design tools such as user interface description languages (UIDL) will play a critical role, in part because the task of designing interaction will necessarily include the collaboration of many disciplines such as computer science, graphical design, industrial design, and electrical engineering. However, despite the need for post-WIMP UIDLs, it is important to remember that we are still in the early, creative phases of this next generation of interaction. As such, it would be unproductive and restricting to attempt to overly standardize the conventions and languages of next-generation interaction. This is not to say that we shouldn't begin to make an effort. Rather, we should remember to remain flexible and to put more emphasis on describing the tasks, goals, and actions of the user than on any particular emerging interaction technology.

A case in point is tangible interaction. One of the major revelations of the First International Conference on Tangible and Embedded Interaction is that we lack a suitable definition for tangible interaction. Ten years ago, Ishii and Ullmer proposed that tangible interfaces "will augment the real physical world by coupling digital information to everyday physical objects and environments" [2]. However, this definition does not encompass all the features of tangible interfaces developed since that time. For example, BodyBug [3] is an interactive, motion-sensitive object inspired by full-body movements of modern dance. In this system there is no coupling of digital information in the sense that Ishii and Ullmer imply, yet the interface has clear tangible attributes. This is just one example of the way in which tangible interaction is continually evolving, ten years after its conception. As we attempt to formalize our descriptions of post-WIMP interaction, I believe it is important not to inadvertently exclude this sort of creative work.

This position paper will further illustrate this point by introducing the concept of passive tangible interfaces and highlighting some of the challenges and requirements for a UIDL in this space.
Passive Tangible Interfaces
I propose the term passive tangible interface to describe a collection of passive physical components with a non-continuous link to an online system. With such systems, users work in offline settings to create physical models that represent such things as computer programs. Often the physical components are inexpensive to produce and make use of passive sensor technology such as computer vision fiducials or RFID tags. Passive tangibles seem best suited for certain kinds of iterative tasks that involve cycles of design, testing, and revision. End user computer programming is an obvious example of this sort of task. Other application domains might include creating models of workflow, process control, or simulations that don't require continuous interaction.

Passive tangibles are distinct from interfaces such as Illuminating Light [5], which feature inexpensive physical components used to interact with real-time computer simulations. Passive tangibles are also distinct from interfaces such as McNerney's Tangible Programming Bricks [4], in which active electronic components are embedded in the physical elements of the interface. While passive tangible interfaces sacrifice some of the real-time interactivity of online (or active) tangible systems, they also offer a number of appealing advantages. Foremost, passive tangible systems represent an affordable, robust, and portable alternative to active tangible systems. This makes them ideal for use in educational settings, where cost is always a factor and technology that is not dependable tends to gather dust in the corner. Passive tangible systems may also give interaction designers greater freedom to choose materials and forms that make sense for an application rather than for the technology used to implement it.

Figure 1. Tern Tangible Programming Console at the Museum of Science, Boston

One example of a passive tangible system is Tern [1], a tangible programming language we are developing for a robotics exhibit at the Museum of Science in Boston. In this case, the physical components of the interface are nothing more than wooden blocks shaped like jigsaw puzzle pieces with circular barcode-like symbols printed on them. Visitors to the exhibit create programs to control a robot by connecting the blocks together and pressing a "Run My Program" button. The exhibit uses a digital camera to capture high-resolution still images of visitors' programs, which it then converts into digital instructions for the robot. Visitors might create and test one or two programs per minute. Otherwise, the interface is offline.
UIDL Challenges
There are several challenges involved in adapting UIDLs for use with passive tangible systems. One problem is the use of state-based or event-response description formats. For example, with the Tern interface there are only two states: the authoring state (where the user constructs programs) and the observing state (where the user watches the robot act out a program). However, these two states can be active simultaneously, and this description does little to elucidate the system. Likewise, the user event/system response model is largely inapplicable. From the event/response model, Tern is very simple: the user presses a compile button, and the system responds by transmitting the user's program to a robot. The problem is that almost all of the meaningful interaction (groups of users learning and leveraging the physical syntax to construct programs, and revising algorithms after observing the robot's actions) is lost in this description. A final problem is that much of the user interaction is unpredictable and idiosyncratic because of the richness of interacting with the physical world. For example, a user might choose to "comment" a physical computer program by sticking post-it notes next to meaningful blocks.
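To make the poverty of these descriptions concrete, here is a sketch (ours, in a generic state/event notation) of Tern reduced to states and event responses. It is formally complete, yet none of the interaction described above survives in it:

    <!-- Hypothetical state/event description of Tern: technically accurate,
         but the offline, physical construction of programs is invisible. -->
    <interface name="Tern">
      <state name="authoring"/> <!-- users construct programs offline -->
      <state name="observing"/> <!-- users watch the robot act out a program -->
      <event name="compilePressed">
        <response>captureImage; decodeBlocks; transmitProgramToRobot</response>
      </event>
    </interface>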
Conclusion
The task of formally describing post-WIMP interaction is daunting but worthwhile. In the process of developing description languages, it is important to focus on user tasks, goals, and actions, rather than on specific technologies or interaction paradigms. This is both because the extent of post-WIMP interaction is still actively being defined and because new approaches to interaction will introduce surprising new challenges for formal descriptions.

REFERENCES
1. Horn, M.S. and Jacob, R.J.K. Tangible Programming in the Classroom with Tern. In Proc. CHI 2007, ACM Press (2007).
2. Ishii, H. and Ullmer, B. Tangible bits: towards seamless interfaces between people, bits, and atoms. In Proc. CHI 1997, ACM Press (1997), 234-241.
3. Moen, J. From hand-held to body-worn: embodied experiences of the design and use of a wearable movement-based interaction concept. In Proc. TEI 2007, ACM Press (2007), 251-258.
4. McNerney, T.S. From turtles to tangible programming bricks: explorations in physical language design. In Personal and Ubiquitous Computing, 8:326-337, 2004.
5. Underkoffler, J. and Ishii, H. Illuminating light: a casual optics workbench. In Proc. CHI 1999 extended abstracts, ACM Press (1999), 5-6.
C. UIDLs for Multimodal and Ubiquitous Computing
Mir Ali, Dale Russell, Kibum Kim, Zhuli Xie
Motorola Labs
Dynamic User Interface Creation based on Device Descriptions
Cristian Bogdan†‡, Hermann Kaindl‡, Jürgen
Falb‡
†Royal Institute of Technology (KTH), ‡Vienna
University of Technology
Discourse-based Interaction Design for Multi-modal User Interfaces
Ladry Jean-Francois, Philippe Palanque, Sandra Basnyat, Eric Barboni, David Navarre
IRIT University Paul Sabatier
Dealing with Reliability and Evolvability in Description Techniques
for Next Generation User Interfaces
Bruno Dumas, Denis Lalanne, Rolf Ingold
University of Fribourg
Prototyping Multimodal Interfaces with the SMUIML Modeling
Language
Jair Leite, Antonio Cosme
UFRN
XSED: notations to describe status-event ubiquitous computing systems
Fabio Paterno, Carmen Santoro
ISTI-CNR
UIDLs for Ubiquitous Environments
Dynamic User Interface Creation based on Device Descriptions

Mir Farooq Ali, Dale Russell, Kibum Kim, Zhuli Xie
Human Interaction Research, Motorola Labs
1295 E. Algonquin Rd., Schaumburg, IL 60196, USA
farooq.ali@motorola.com, dale.russell@motorola.com, kibum.kim@motorola.com, zhuli.xie@motorola.com

Abstract
In the past few years, the number of consumer devices and appliances possessing networking capabilities has increased dramatically. On the one hand, we have non-mobile appliances like photocopiers, printers, media storage devices, sound systems and televisions, which are extremely powerful, with a large number of features. Many of these devices are capable of advertising their features and functionality in some sort of declarative specification. On the other hand, we have cell phones, which are also increasingly powerful and have become pervasive and ubiquitous with their networking capabilities. In this position paper, we present our view of dynamically generating user interfaces (UIs) on the cell phone for newly discovered devices, based on their declarative device descriptions. The dynamically created UI can then be used to control and operate those devices.

Keywords
Dynamic UI creation, declarative UI specification, device description.

ACM Classification Keywords
H5.m. Information interfaces and presentation (e.g., HCI).
Introduction
In recent years, there has been an ongoing effort to standardize access and control of appliances. UPnP [13], HAVi [6], ZigBee [5], Bluetooth [2], etc. are some of the standards that provide guidelines for networked access and control of appliances and devices. For example, Bluetooth's service discovery protocol (SDP) provides an API for applications to discover the set of services that are available based on RF proximity of devices. DLNA [1] is a UPnP-based interoperability framework that seeks to tie together three different domains (PC/Internet, mobile devices, and consumer electronics) to enhance the sharing and distribution of digital media in a home context. The framework consists of guidelines for interoperability and standards for various media formats, media management, device discovery and control, etc. With the availability of these standards, and devices supporting them, it is possible for various appliances and devices to communicate with each other in a distributed fashion.

Another concurrent trend is the ubiquitous availability and usage of cell phones. Cell phone usage has rapidly increased, and it is estimated that there will be more than 2.5 billion cell phones worldwide by 2006 [10]. For most users, the cell phone has become an indispensable and ubiquitous accessory. Many of these phones come equipped with data-networking capabilities along with the voice channel. As people move into new environments containing networked devices that advertise their capabilities, it is reasonable to think of using cell phones to control and operate these devices, thereby treating cell phones as remote controls. As the functionality of consumer products expands, they are becoming more complicated, so that a user may be overwhelmed by a new device. A UI that maps the user's goals to the functions of the device [8] can greatly simplify the learning experience. Using a multimodal front-end to handle the user's commands via speech, touch, and/or keyboard, and a reasoning system on the backend to determine which functions/services the user is trying to invoke, will make the UI more effective and user-friendly. In this paper, we present one view of dynamically creating UIs on the cell phone based on the discovery of new devices, allowing the cell phone to be treated as a universal remote control.
The following scenario demonstrates the motivation for our approach:

Mary is attending a technical talk in an unfamiliar convention center. She is interested in the talk and wants to get the presentation slides. She takes out her cell phone and, knowing that the slides probably reside on some nearby media server, does a search for discoverable media servers. After discovering a media server, she is able to browse the contents and find the relevant slides based on the title of the talk. She then searches for discoverable printers in her vicinity and is able to print the slides on a printer outside the conference room.

In this scenario, there is not necessarily a built-in UI for operating the media server and the printer on Mary's cell phone. The UI for operating these devices is dynamically created based on the device type and description that is extracted from each of the devices after their discovery. To facilitate the process of creating the UI, we use a declarative frame-based UI representation language that allows Natural Language interaction [10]. The details of the UI description language and the transformation process that utilizes the device descriptions to create the UI are presented below.
Related work
Work has been done under the Pebbles project (http://www.pebbles.hcii.cmu.edu/) on automatically generating UIs on handheld devices to control household appliances [12]. However, the UIs that are generated are primarily graphical. Rather than just GUIs, our work focuses on creating multimodal UIs, including speech I/O. The SUPPLE system automatically generates consistent UIs using optimization techniques [7]. The Roadie system is a goal-oriented system that presents a UI to the user in terms of the user's goals, which can be different from the device's functionality and the actual controls it provides [9]. UIML [1], UsiXML [8] and Teresa [3] are well-known XML-based UIDLs. However, given the objectives of our project, described below, our frame-based UIDL was a more reasonable choice.
Research Issues/Parameters
Some of the desired features of the dynamically generated UIs are:

1. Multimodality - One of the desired capabilities is the generation of a multimodal UI. For example, the end user might want to control the newly discovered device using both speech and a graphical user interface. The generated UI could also support natural language processing (NLP) to allow simpler and more intuitive speech interaction between the user and the device.

2. Personalization - If information is available about the user's usage of other devices and personal preferences, this could potentially be used to personalize the generated UI.

3. Context-sensitivity - It might be useful and/or necessary to incorporate the notion of "context" into the generated UI to account for the context-sensitive nature of certain device capabilities. We are assuming that it is possible to glean such information from the device upon its discovery.

4. Generality and specificity - If a user can use multiple devices simultaneously, it may be desirable to present the user with a common UI which does not distinguish which functions/actions are performed by which devices. Alternatively, a user may specifically request a certain action to be performed on a specific device. The dynamically generated UI should accommodate both general and device-specific user commands.

Some of the research issues associated with this endeavor are discussed below:

1. Differing form factors and/or computing power - The devices could have widely differing capabilities, computing power and/or features. Devices may range in complexity from a light switch, with the sole capability of switching the light on or off, to a multi-functional device such as an all-in-one fax machine, copier, scanner, and printer. Obviously, the UI for the latter will be much more complicated than for the former.

2. Multiple standards - In a heterogeneous environment it is necessary to accommodate multiple standards of device discovery and description, where different devices describe and advertise their capabilities using different standards. In some cases, the devices might have incomplete or non-compliant device descriptions. The UI generator must then be robust, with enough intelligence to account for such shortcomings and still generate a reasonable UI.

The above list enumerates a few of the research issues and parameters that affect the generation of the UI for newly-discovered devices.
Dynamic UI Creation Process
As an initial starting point, we are limiting ourselves to
UPnP-enabled devices. The UPnP forum
(http://upnp.org/) provides guidelines for various
devices. Each UPnP-conformant device must adhere to
these guidelines, which include the device description.
This is an XML-based representation that indicates the
manufacturer, device ID, and some other metadata
about the device. Part of the information it contains is
the service description. The service description provides
information about what actions the device can execute,
and what arguments are required for each action. The
sequence of steps leading up to the UI creation is
shown in Figure 1.
Each UPnP device periodically broadcasts its presence.
When the device is discovered, we extract the device
and service descriptions from the device and use these
to dynamically generate a UI, which can then be used
to control the device. This is shown in Figure 2.
[Figure 2: Automatic UI generation process - the UPnP device model
(device description and service description), together with other
models, feeds an XSLT transformation that produces the declarative
UI representation.]
As mentioned in the introduction, we use a frame-based
declarative UI representation for the generated
UI, similar in structure to VoiceXML. We use the XML
transformation language (XSLT) to dynamically create
the UI, based on the UPnP service description. Other
information used to enhance the generated UI is
extracted from various models.
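To make this concrete, here is a sketch of the transformation
input and output (ours, for illustration only: the input follows
the general shape of UPnP service descriptions, while the frame
syntax is invented, since the paper does not reproduce its UIDL):

<!-- Input: fragment of a hypothetical UPnP service description
     for a dimmable light. -->
<action>
  <name>SetLoadLevelTarget</name>
  <argumentList>
    <argument>
      <name>newLoadLevelTarget</name>
      <direction>in</direction>
      <relatedStateVariable>LoadLevelTarget</relatedStateVariable>
    </argument>
  </argumentList>
</action>

<!-- Possible output of an XSLT rule: a slot-filling frame,
     VoiceXML-like in structure, usable by a speech/GUI front-end. -->
<frame name="SetLoadLevelTarget" device="DimmableLight">
  <slot name="newLoadLevelTarget" type="ui1"
        prompt="Set the light to what level?"/>
  <grammar>set | dim | brighten the light to (level)</grammar>
</frame>

The service description alone yields only the action and argument
names; the prompt and grammar stand for the kind of supplementary
model information that, as the next section reports, is needed to
make the generated UIs usable.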
Initial lessons learned
[Figure 1: Sequence of steps for UI generation - 1. Device Discovery;
2. Get Device and Service Description; 3. Dynamic UI Generation;
4. Device Control.]
Our initial experience in working with UPnP-enabled
devices has been that it is possible to create UIs based
solely on the device and service descriptions that are
retrieved from the devices, but the UIs are not very
usable. The device descriptions need to be
supplemented by other models and information for the
UI generation process so that the generated UIs are
more usable, and also achieve the goals that we
outlined earlier for the generated UIs.
Conclusion
In this paper, we have presented a vision for the
automatic creation of UIs for different devices in an
intelligent fashion, allowing those devices to be
controlled through a cell phone. We have discussed a
number of research issues and challenges associated
with this vision. Based on some preliminary work, we
have found that more information is needed besides the
device descriptions to create usable UIs, but the use of
declarative UI representations helps us overcome some
of the challenges discussed above.
REFERENCES
[1] Ali, M.F., Perez-Quinones, M., and Abrams, M., Building Multi-Platform User Interfaces with UIML. In Seffah, A., Javahery, H. (Eds.), Multiple User Interfaces: Cross-Platform Applications and Context-Aware Interfaces. John Wiley & Sons, Ltd, Ch. 6, 2004, pp. 95-118.
[2] Bluetooth, http://www.bluetooth.com/bluetooth/.
[3] Berti, S., Correani, F., Paternò, F., and Santoro, C., The TERESA XML Language for the Description of Interactive Systems at Multiple Abstraction Levels. In Proceedings of the ACM AVI'2004 Workshop "Developing User Interfaces with XML: Advances on User Interface Description Languages", 2004, pp. 103-110.
[4] Digital Living Network Alliance, http://www.dlna.org.
[5] Egan, D. The emergence of ZigBee in building automation and industrial control. Computing & Control Engineering Journal, 16:14-19, 2005.
[6] Gajos, K., Wu, A., and Weld, D. Cross Device Consistency in Automatically Generated User Interfaces. Proceedings of the 2nd Workshop on Multi-User and Ubiquitous User Interfaces, 2005, pp. 7-8.
[7] HAVi, the A/V digital network revolution, http://www.havi.org/pdf/white.pdf.
[8] Limbourg, Q., Vanderdonckt, J., Michotte, B., Bouillon, L., Florins, M., Trevisan, D., UsiXML: A User Interface Description Language for Context-Sensitive User Interfaces. In Proc. of the ACM AVI'2004 Workshop "Developing User Interfaces with XML: Advances on User Interface Description Languages", 2004, pp. 55-62.
[9] Lieberman, H. and Espinosa, J. H., A goal-oriented interface to consumer electronics using planning and commonsense reasoning. In Proc. IUI 2006, pp. 226-233.
[10] Thompson, W., and Bliss, H., A Declarative Framework for Building Compositional Dialog Modules. In Proc. ICSLP 2000.
[11] Towards the next billion subscribers: Motorola delivers on seamless mobility vision. http://www.motorola.com/mediacenter/news/detail.jsp?globalObjectId=6405_6355_23/.
[12] Nichols, J., Myers, B., and Rothrock, B., UNIFORM: Automatically Generating Consistent Remote Control User Interfaces. Proc. CHI 2006.
[13] Universal Plug and Play, http://www.upnp.org.
Discourse-based Interaction Design for
Multi-modal User Interfaces
Cristian Bogdan
School of Computer Science and Communication
Royal Institute of Technology
10044 Stockholm, Sweden
cristi@csc.kth.se
and
Institute of Computer Technology
Vienna University of Technology
A-1040 Vienna, Austria
bogdan@ict.tuwien.ac.at

Hermann Kaindl
Institute of Computer Technology
Vienna University of Technology
A-1040 Vienna, Austria
kaindl@ict.tuwien.ac.at

Jürgen Falb
Institute of Computer Technology
Vienna University of Technology
A-1040 Vienna, Austria
falb@ict.tuwien.ac.at
Abstract
Current user interfaces do not sufficiently utilize
multiple modalities. We developed a new approach to
modeling discourse-based interaction design inspired by
theories of human communication. From such an
interaction design, we envisage generating a multimodal user interface. This paper presents our approach
in the context of mixed-initiative interactions with a
(semi-)autonomous robot.
Introduction
In previous work [2] we studied several theories of
human communication from various fields to develop
an approach for specifying discourse-based interaction
design models. These design models are more
understandable and possibly easier to build for humans
with less technical background than user-interface
models. Based on such an approach, we showed in [1]
how graphical user interfaces can be rendered from
high-level models.
Since the concepts of human communication are
applicable to different modalities, we strive for
rendering multi-modal interfaces that support mixed initiative. As a benefit, modelers do not need to care
about modality while specifying the interaction design.
During rendering the system will suggest one or more
modalities that a particular part of an interaction should
be performed in. The modeler is still able to influence
this decision making. This process should ease the
development of multi-modal mixed-initiative interfaces
for modelers, since they only have to specify one
discourse-based interaction for all modalities.
Copyright is held by the author/owner(s).
CHI 2008 Workshop: User Interface Description Languages for Next
Generation User Interfaces, April 5 – 10, 2008, Florence, Italy
ACM 1-xxxxxxxxxxxxxxxxxx.

figure 1. The discourse model

Approach description
Our approach to multimodal communication consists of
two distinct stages: the creation of the interaction
model, which is modality-neutral, and the rendering
where the modeller and possibly other people can
assist the system in improving the interface by placement (spatial or temporal) of components within the
constraints of the interaction model, choice of modality,
etc. First we focus on the modality-neutral interaction
design stage.
We describe our approach to model multimodal communication of humans with (semi-)autonomous robots
through an example of a shopping trolley robot that
helps the customer to process a predefined shopping
list and to find items in a supermarket environment.
Through the explanation we emphasize the concepts of
human communication by which our approach is
inspired. We have modelled (part of) an example
interaction in figure 1 according to our discourse
modelling approach.
A typical scenario covered by the illustrated discourse
model goes as follows: First, either the trolley asks the
customer to select a product from the shopping list to
go to next, or the customer directly requests the trolley
to go to yet another product in the supermarket. After
specifying the next product, the robot shopping trolley
starts moving to the indicated destination together with
its assigned customer. When they get to the requested
product, the trolley informs the customer about the
arrival and removes the product from the shopping list.
Our models describe classes of dialogues or scenarios,
respectively, in a primarily declarative way. So, this
model also includes e.g., that the customer can redirect
the shopping trolley at any time, by requesting a
new product as the current destination.
In the first step of the above scenario, the specification
of a product can be accomplished in two different ways,
either the trolley asks where to go, or the user requests
to go somewhere. The two alternatives are an example
of how our modelling framework can accommodate
mixed-initiative interaction. We model these alternatives as two adjacency pairs (inspired by Conversation Analysis; details on the human communication
concepts and the modelling language can be found in
[1, 2]). These adjacency pairs are grouped together
with a rhetorical relation (inspired by Rhetorical
Structure Theory (RST)). All our models are, in fact,
trees with adjacency pairs as leaves and rhetorical
relations as the other nodes. In this case, since the two
alternatives are of equal “weight”, the adjacency pairs
are grouped with an “Otherwise” RST relation, which is
meant for such cases.
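As an illustration, this mixed-initiative fragment could be
serialized roughly as follows; this is a sketch under our own
element names, since the concrete syntax of the modelling
language is defined in [1, 2]:

<rst relation="Otherwise">
  <!-- Machine initiative: the trolley asks where to go next. -->
  <adjacencyPair>
    <communicativeAct type="ClosedQuestion" actor="trolley"
                      content="openDestinations"/>
    <communicativeAct type="Answer" actor="customer"
                      content="destination"/>
  </adjacencyPair>
  <!-- User initiative: the customer requests a destination directly. -->
  <adjacencyPair>
    <communicativeAct type="Request" actor="customer"
                      content="destination"/>
    <communicativeAct type="Accept" actor="trolley"/>
  </adjacencyPair>
</rst>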
Indicating a destination on the part of the user is
modelled at the bottom-centre, in the form of a
“Request” communicative act inspired by Speech Act
Theory. Communicative acts offer us an abstraction
that is graphical-toolkit-neutral and also modality-neutral. Adjacent to the request, there is an “Accept”
communicative act with which the machine confirms
the new destination. The left side of the model offers
the collected destinations for the user to choose from.
This is modelled as a “Closed Question” communicative
act, to which the user can respond, by way of the
adjacent “Answer” communicative act, to choose from a
defined list of possibilities. This list of possibilities is
called propositional content in Speech Act Theory, and
in our approach it is provided and refreshed by the
application logic of the robot trolley. The “Closed
Question” also helps the user stay updated on which
shopping items have already been added to the
shopping list but not yet processed.
If there is no further user interaction and the robot
reaches the destination currently specified, it informs
the user about the status via the “Informing” communicative act at the right of our model, and the destination
is removed from the shopping list by the robot’s application logic. Since this is the main result of the interaction, the “Informing” is linked to the remainder of the
dialogue model through a “Result” RST relation.
Multimodal Communication with a Robot
according to this Model
Now let us focus on the rendering stage where the
communication platform software will have to deal with
modalities for expressing and receiving communicative
acts. It is designed to do so based on heuristics, but
the modeller and possibly other people may assist in
choosing one or multiple modalities for improving the
interface. Our robot trolley is designed to support three
communication modalities and their combination:
graphical interaction through a touch screen, speech
input/output and movement.
Since the “Request goto product” communicative act is
modelled to give the application logic data of a certain
type (let’s call it destination) and a speech input of type
destination is available from the speech recognition, the
render engine will recognize that the “Request” can be
rendered in speech input. While assisting the rendering
process, the modeller can decide that the request can
also be done via the touch screen, in which case e.g., a
widget providing alphabetical search for destinations
can be rendered. Furthermore, our communication
platform software can decide at runtime to fall back to
the graphical input in a very noisy environment. In the
case of accepting a “Goto Request”, the trolley will
utter the Accept communicative act in e.g., speech,
since using the same modality for the adjacent
communicative act improves clarity and answers the
user's expectation.
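A pre-rendering annotation recording these decisions might look
as follows; all names here are invented, as the paper does not
define a concrete syntax for rendering assistance:

<rendering act="RequestGotoProduct">
  <!-- Inferred by the render engine from the typed data. -->
  <modality name="speech" input="destination"/>
  <!-- Added by the modeller during rendering assistance. -->
  <modality name="touchscreen" widget="alphabeticalSearch"/>
  <!-- Runtime fallback for very noisy environments. -->
  <fallback condition="highAmbientNoise" use="touchscreen"/>
</rendering>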
The render engine will, in principle, render the “Closed
Question” with the shopping list items only on the
touch screen, as the speech medium is an expensive
resource for a list. However, if desired at the rendering
stage, speech could also be used in this case. This
could be based, e.g., on the level of ambient sound, i.e.,
if the user appears to be alone in the shop, there is
more "rendering space". And, maybe after periods of no
communication with the user although sensed to be in
the robot’s proximity, the “Closed Question” can be
uttered in speech as a suggestion. The user can
interrupt the utterance via a predetermined speech
utterance, to indicate that she chose the last destination uttered by the robot speech synthesis. In previous
work, we have used the rendering space as a constraint
for model rendering in GUI interfaces, but as exemplified here, a similar temporal constraint can be used for
the speech modality.
Discussion and Conclusion
Our approach can be regarded as a very-high-level user
interface definition language, or more precisely an
interaction design language. We envisage that from the
communicative acts, rhetorical relations and
conversation analysis patterns it employs, a decent
multi-modal interface can be generated. If a pre-
rendering stage is added, our render engine will get
even more guidelines for its runtime heuristics,
resulting in higher-quality interfaces.
We have also shown that although our interaction
models are modality-neutral, the modality can be
inferred from the model in multiple ways: from data
types involved, from the availability of widgets for the
respective modality, from quasi-spatial constraints in
the modality, and not the least from the “importance”
of a certain communicative act as conveyed by the
rhetorical structure of the discourse. If such inferences
do not suffice, based on the interaction model, our
system will be able to guide the modelers and
designers to specify the modality of certain
communicative acts.
Acknowledgements
We would like to thank Helge Hüttenrauch and
Alexander Szep for their reviews and valuable
comments.
References
[1] Bogdan, C., Falb, J., Kaindl H., Kavaldjian S., Popp
R., Horacek H., Arnautovic E. and Szep A. Generating
an Abstract User Interface from a Discourse Model
Inspired by Human Communication. In Proc. 41st
Annual Hawaii Int’l Conference on System Sciences
(HICSS-41), IEEE Computer Society Press (2008).
[2] Falb J., Kaindl H., Horacek H., Bogdan C., Popp R.,
and Arnautovic E. A discourse model for interaction
design based on theories of human communication. In
Ext. Abstracts CHI 2006, ACM Press (2006), 754-759.
Dealing with Reliability and Evolvability in
Description Techniques for Next Generation
User Interfaces
Jean-François Ladry
LIIHS-IRIT, University Paul
Sabatier, 31062 Toulouse Cedex
ladry@irit.fr
Philippe Palanque
LIIHS-IRIT, University Paul
Sabatier, 31062 Toulouse Cedex
palanque@irit.fr
Sandra Basnyat
LIIHS-IRIT, University Paul
Sabatier, 31062 Toulouse Cedex
basnyat@irit.fr
Abstract
This position paper advocates that reliability and evolvability of
interactive systems are critical properties as far as safety
critical systems are concerned. We present a model-based
environment called PetShop for the edition, simulation,
validation and verification of interactive systems targeting
reliable and evolvable systems. The use of the description
technique (the ICO formalism) supported by PetShop is
presented on a multimodal ground segment application for
satellite control.
Keywords
Multimodal interfaces, formal description techniques, reliability,
evolvability
Eric Barboni
LIIHS-IRIT, University Paul
Sabatier, 31062 Toulouse Cedex

David Navarre
LIIHS-IRIT, University Paul
Sabatier, 31062 Toulouse Cedex
navarre@irit.fr

Introduction
The design of description techniques for next generation user
interfaces faces a recurrent dilemma. Should the description
technique:
• focus on new interaction techniques, new interaction
languages and new input devices (as in recent work such
as [16] for tangible interfaces)
• or should the description technique go beyond proof-of-concept
aspects and also support the construction of "real"
industry-related systems?
Copyright is held by the author/owner(s).
CHI 2008, April 5 – 10, 2008, Florence, Italy
ACM 1-xxxxxxxxxxxxxxxxxx.
Many description techniques presented at the CHI or UIST
conferences focussed on the first option. The work presented in
the rest of this paper focuses on the second part of the
alternative, i.e. description techniques supporting the
development of next generation interfaces in the safety-critical
domain.
An Architecture for Dealing with Multimodal
Interfaces
To deal with WIMP and post-WIMP interaction techniques,
several notations have been proposed from Data-flow-based
notations such as Whizz'Ed [5], ICON [4], NiMMiT [9] or InTml [10]
to event-based notations such as Marigold [8], Hynets [7] or
ICO [6]. Hybrid models integrating both event-based and data-flow-based notations have also been presented in [1] and in
[15]. This position paper extends the work presented in [15] by
removing the data-flow model dealing with input devices
configuration and proposes a single event-based notation
described in the next section.
A Petri net based Description Language
ICOs (Interactive Cooperative Objects) are a formal description
technique dedicated to the modeling and construction of
interactive applications. ICOs are able to describe:
• The behaviour of the system with a set of CO-Classes
• A graphical view of the system
• The link between the system-behaviour aspect and the
graphical aspect with:
o the rendering function, which maintains the
consistency between the state of the CO-Class
and its appearance
o the activation function, which shows how the
system reacts to inputs of the user.
ICOs can be used to describe the behaviour of interactions from
the input device event (such as a button pressed event) to a
high level event in the dialog part of a system (a call to a high
level function). From this simple event to the highest, we can
represent the increase of the information obtained through the
computerization of sequences of events (for example, a
sequence of a “button pressed” and “button released” creates
“button click”, a click on a graphical object and finally, the call
of a function).
Figure 1. Arch representation of our models.
Figure 1 presents the proposed multimodality models on the
Arch [3] model for two devices. The various layers handle
events at a different level of abstraction from lower-level
events (directly produced by input devices) to the higher-level
of the dialogue model (representing the states of the system
and the impact of events on the state changes). The first level
(physical interactions) contains ICO models handling the basic
input events from the input devices. The second level contains
models which manage higher level events (such as clicks, …).
The third level handles the connection between the events and
the graphical windows (producing real graphical coordinates).
The higher level is responsible for the connection between
events and the graphical objects. This architecture is generic
and has been used for the formal description of many
interaction techniques including gesture, voice and two-handed
interaction. The next section presents an instantiation of this
generic architecture for a two-handed interaction.
Application of the Architecture for Two-handed
interactions
This section refines the previous architecture for two-mice
interaction (see Figure 2). The four levels introduced are
refined to present in detail how they handle events and how the
description techniques support the modelling activity. However,
due to space constraints, we do not include here all the models.
Instead, we will focus on the models at the second level:
MouseClick & MouseDragging.

Figure 2. Representation of event exchange

At the first level (bottom of Figure 1) are the MouseDrivers,
which capture the simple events from the mouse and send
them to the next layer. At the second level (see Figure 1)
there are two models (dragging and click) for each mouse. The
first model (representing the click), called MouseClick, raises
click events from release and pressed events from the mouse.
This model is shown in Figure 3. The second model
(representing dragging), called MouseDragging, creates
events with information relating to motion. This model is shown
in Figure 4. There are two events in the MouseDragging model:
one for a simple mouse move when there is no button pressed
(MouseMove); the other represents a mouse move when a
button is pressed (MouseMovePressed). The third level (see
Figure 1) is the management of the absolute coordinate
(AbsoluteCoord). This model (not presented in this paper)
computes the different events of movements and adds the
information of the absolute coordinate to all the events for the
next level. At the last level (see Figure 1) is the object
management (ObjectPicking), which compares the position
and the event from a mouse with the list of the objects from
the interface (not presented in this paper). This level is linked
to the interface and calls functions.

When the model MouseClick (Figure 3) receives a
mousePressed event, a token is set into the place
buttonPressed and the transition pressed is fired. A token
corresponding to the reference of the mouse and the button
pressed is then set into the down place. Upon the reception of a
mouseReleased event a token is set into the place
buttonReleased. As there is a token corresponding to the same
mouse in place down and in place buttonReleased, the transition
click is fired. This firing raises a click event to the upper layer
and sets a token in the initial place Init.

In the initial state (a token in place Idle, see Figure 4), when the
model MouseDragging receives an event mouseMove, it sends
the event to the upper level model. If the model receives a
buttonPressed event, a token is set in the place down and the
next mouseMove event will move that token to the place
Dragging. For each of these mouseMove events performed after
the occurrence of a buttonPressed event, a mouseMovePressed
event is sent to the upper level (1). This model also shows the
possibility of dragging with one or more buttons pressed. The
drag ends when all the buttons are released: the AllReleased
transition is fired and the token is deposited in the initial state
Idle.

(1) For readability purposes, the production of events is not presented in
the models (this would appear as text of the transitions). However, the
set of events produced by each model is represented in Figure 2.
As was the case for traditional WIMP interaction
techniques (see [2]), all of these models are created and
executed in PetShop. PetShop features an ICO editor. At run
time, the designer can modify models to change the behaviour
of the application.
A Safety Critical Command and Control Multimodal
Application
“AGENDA” is an interactive application which allows ground
segment operators to manage a satellite. It allows receiving
telemeasures (TM) from the satellite (typically the information
that has been gathered by the satellite) and sending
telecommands (TC) (typically a set of actions the satellite will
have to perform for a given period of time).
The main focus of this project is the integration of new
technologies to increase efficiency and comfort of operators and
also to reduce errors that can have a catastrophic impact on
the satellite. For instance, not providing the right TC on time
might result in a change of orientation of the satellite and, if the
solar panels are no longer in line with the sun, the satellite may
move into a survival mode and thus not be able to perform its
tasks anymore.
Figure 3. The ICO model receiving low-level events (mousePressed and mouseReleased) and producing MouseClicks
Figure 4. The ICO model receiving low-level events (mousePressed, mouseReleased and mouseMove) and handling mouseDragging
The use of multimodality in this application allows mastering
the increasing complexity of such systems as, for instance, new
systems such as Galileo will be composed of a constellation of
satellites requiring more input from the operators within shorter
periods of time. Some TCs are only triggered if the output of a
previous TC is satisfactory. Such TCs are called conditional TCs.
Figure 5 represents the adding of a conditional TC to another
TC. This adding can be performed using two mice and the
interaction technique called "click-through". The window for
adding a procedure has a transparent border allowing selection
of a procedure through the windows with a mouse while the
other mouse moves the window. Such interaction techniques
can be modelled using ICOs and will impact models at the
ObjectPicking level of the architecture (see Figure 1).
Figure 5. Two-handed ClickThrough in AGENDA
Benefit of the Approach
The reason for focusing on the use and the deployment of
formal description techniques lies in the fact that they are the
only means to address both the modeling, in a precise and
unambiguous way, of all the components of an interactive
application (presentation, dialogue and functional core) and the
provision of techniques for reasoning about (and also verifying)
the models.
Applying formal description techniques can be beneficial during
the various phases of the development process, from the early
phases (requirements analysis and elicitation) to the later ones
including evaluation (testing).
Verification
Verification techniques aim at providing ways for ensuring
system reliability prior to implementation. Description
techniques that provide such capabilities would empower
developers by offering means for reasoning about their systems
at a higher level of abstraction. For instance, verification
techniques over a formal description technique make it possible
to assess properties such as: whatever state the system is in,
at least one interactive object is enabled, mutual exclusion of
actions, reachability of specific states [11].
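The first of these properties can be phrased, for instance, as a
temporal-logic invariant over the reachable markings of the
underlying Petri net (a sketch in CTL-style notation of our own,
not the ICO notation itself):

$$\mathbf{AG}\,\Big(\bigvee_{o \in \mathit{InteractiveObjects}} \mathit{enabled}(o)\Big)$$

that is, in every reachable state at least one interactive object is
enabled; mutual exclusion and reachability properties can be
phrased in a similar style.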
Prototyping
Prototyping is now recognized as a cornerstone of the
successful construction of interactive systems as it allows
locating users at the centre of the development process.
However, iterative or incremental prototyping tends to produce
low quality software as no specification or global design and
understanding is undertaken. Description techniques should be
able to support this incremental and iterative prototyping
activity by making it as simple as possible for developers to
modify their prototypes. One issue there is to support the
compatibility requirement ensuring that a refinement of a
previous prototype is still compliant with the new version of the
prototype.
Testing
Model-based approaches [2] featuring formal description
techniques can provide support to developers in the later
phases of the development process, where the intrinsic nature
of interactive systems makes them very hard to address
properly otherwise. For instance, the event-based nature of
interactive systems makes them impossible to test without tool
support for the generation of the possible test cases. Work in
this field is still preliminary, but contributions are available for
WIMP interaction techniques providing means for regression
testing [13] and coverage criteria [14].

Certification
Certification is a phase of the development process specific to
safety critical systems. Developers of such systems are in
charge of demonstrating that the system is ‘acceptably safe’
before certification authorities grant regulatory approval.
Dealing with certification calls for description techniques able to
provide support for ensuring the reliability of the interactive
application.
Conclusion
This position paper presents the use of the ICO notation for the
description of multimodal interaction. This work was carried out
within an industrial project dealing with satellite ground
segments (i.e. interactive applications deployed in satellite
control rooms). The paper has superficially introduced the
description technique ICOs and its tool-support environment
PetShop. During the presentation a demonstration of the tool
and the application will be performed to present in detail how
the models can be interactively modified at run time to support
the evolution of interaction techniques.
References
[1] Jacob, R. "A Software Model and Specification Language for Non-WIMP User Interfaces." ACM Transactions on Computer-Human Interaction 6, no. 1, 1-46 (1999).
[2] Navarre, D., Palanque, P., Bastide, R. A Tool-Supported Design Framework for Safety Critical Interactive Systems. Interacting with Computers, Elsevier, Vol. 15/3, pp. 309-328 (2003).
[3] Bass, L., Pellegrino, R., Reed, S., Seacord, R., Sheppard, R., and Szezur, M. R. The Arch Model: Seeheim Revisited. Proceedings of the User Interface Developers' Workshop, 1991.
[4] Dragicevic, P. & Fekete, J.-D. (2001) Input Device Selection and Interaction Configuration with ICON. Proceedings of IHM-HCI 2001, People and Computers XV - Interaction without Frontiers, Springer Verlag, pp. 543-448.
[5] Esteban, O., Chatty, S., and Palanque, P. Whizz'Ed: a visual environment for building highly interactive interfaces. Proceedings of the Interact'95 conference, pp. 121-126, 1995.
[6] Palanque, P. & Schyn, A. A Model-Based Approach for Engineering Multimodal Interactive Systems. INTERACT 2003, IFIP TC 13 Conference on Human Computer Interaction.
[7] Wieting, R. Hybrid High-Level Nets. Pp. 848-855, Proc. of the 1996 Winter Simulation Conference. ACM Press, 1996.
[8] Willans, J. S. & Harrison, M. D. Prototyping pre-implementation designs of virtual environment behaviour. 8th IFIP Working Conference on Engineering for Human Computer Interaction (EHCI'01), 2001. LNCS, Springer Verlag.
[9] Vanacken, D., De Boeck, J., Raymaekers, C., Coninx, K. "NiMMiT: a Notation for Modelling Multimodal Interaction Techniques." International Conference on Computer Graphics Theory and Applications, Portugal, 2006.
[10] Figueroa, P., Green, M., and Hoover, H. J. InTml: A Description Language for VR Applications. Proceedings of Web3D'02 (Arizona, USA), 2002, pp. 53-58.
[11] Palanque, P. & Bastide, R. Verification of an Interactive Software by Analysis of its Formal Specification. Proceedings of the IFIP Human-Computer Interaction Conference (Interact'95), Norway, 27-29 June 1995, pp. 191-197.
[12] Bodart, F., Hennebert, A.-M., Leheureux, J.-M. & Vanderdonckt, J. "Encapsulating Knowledge for Intelligent Automatic Interaction Objects Selection." Human Factors in Computing Systems INTERCHI'93, pp. 424-429, 1993.
[13] Memon, A. M. & Soffa, M. L. Regression testing of GUIs. In Proceedings of the 9th European Software Engineering Conference held jointly with the 10th ACM SIGSOFT International Symposium on Foundations of Software Engineering, pp. 118-127, 2003.
[14] Memon, A. M., Soffa, M. L., Pollack, M. E. Coverage criteria for GUI testing. In Proceedings of the 8th European Software Engineering Conference held jointly with the 9th ACM SIGSOFT International Symposium on Foundations of Software Engineering, pp. 256-267, 2001.
[15] Navarre, D., Palanque, P., Dragicevic, P. & Bastide, R. An Approach Integrating two Complementary Model-based Environments for the Construction of Multimodal Interactive Applications. Interacting with Computers, vol. 18, no. 5, pp. 910-941 (2006).
[16] Girouard, A., Solovey, E. T., Hirshfield, L., Ecott, S., Shaer, O., and Jacob, R. J. K. "Smart Blocks: A Tangible Mathematical Manipulative." Proc. TEI 2007, First International Conference on Tangible and Embedded Interaction, 2007.
Prototyping Multimodal Interfaces with
the SMUIML Modeling Language
Bruno Dumas
University of Fribourg
Boulevard de Pérolles 90
1700 Fribourg, Switzerland
bruno.dumas@unifr.ch

Denis Lalanne
University of Fribourg
Boulevard de Pérolles 90
1700 Fribourg, Switzerland
denis.lalanne@unifr.ch
!"#$%&'$(
"#!$%&'!()'&$&)#!(*(+,-!.+!(,+'+#$!*#!*((,)*/%!0+*,+1!
$).*,1!,*(&1!(,)$)$2(&#0!)3!456$&4)1*6!&#$+,3*/+'!.&$%!
789"8:!;72#/%,)#&<+1!856$&4)1*6!9'+,!"#$+,3*/+'!
8*,=5(!:*#05*0+>!*#1!?+(%*&'@A!$))6=&$B!@%+!0)*6!)3!
789"8:!*#1!?+(%*&'@A!&'!$)!)33+,!1+C+6)(+,'!.&$%D!)#!
)#+!'&1+-!*!6*#05*0+!*66).&#0!/6+*,!1+'/,&($&)#!)3!
%54*#E4*/%&#+!456$&4)1*6!1&*6)0!*#1!/)#$,)6!)C+,!$%+!
.*2!456$&(6+!&#(5$!4)1*6&$&+'!%*C+!$)!F+!35'+1G!*#1!)#!
$%+!)$%+,!'&1+-!*#!+H$+#'&F6+!$))6!&4(6+4+#$&#0!*#1!
4*#*0&#0!$%&'!1&*6)0B!
Rolf Ingold
University of Fribourg
Boulevard de Pérolles 90
1700 Fribourg, Switzerland
rolf.ingold@unifr.ch

Keywords
Multimodal interfaces, multimodal dialog, user-machine
dialog description.
ACM Classification Keywords
D.2.2. Design Tools and Techniques: User Interfaces.
Copyright is held by the author/owner(s).
CHI 2007, April 28 – May 3, 2007, San Jose, USA
ACM 1-xxxxxxxxxxxxxxxxxx.

SMUIML: human-machine dialog and
modalities synchronization
789"8:!%*'!F++#!1+'&0#+1!3,)4!$%+!0,)5#1!5(!*'!*!
L8:!6*#05*0+!*F6+!$)!1+'/,&F+!,&/%!&#$+,*/$&)#'!
F+$.++#!%54*#!*#1!/)4(5$+,B!"#!(*,$&/56*,-!&$!
1+'/,&F+'!+C+,2!&#$+,*/$&)#!F+$.++#!$%+!5'+,!*#1!$%+!
4*/%&#+!*$!$%,++!1&33+,+#$!6+C+6'-!*'!'%).#!&#!M&05,+!ND!
$%+!6).+,!6+C+6!+#*F6+'!$%+!1+C+6)(+,!5'&#0!789"8:!$)!
J!
!
1+'/,&F+!$%+!1&33+,+#$!4)1*6&$&+'!*#1!*'')/&*$+1!
,+/)0#&<+,'-!*'!.+66!*'!1+'/,&($&)#!)3!&#/)4&#0!
C*,&*F6+'B!@%+!4&116+!6+C+6!)3!789"8:!1+'/,&F+'!
&#/)4&#0!+C+#$!$,&00+,'!*#1!)5$0)&#0!*/$&)#'-!(+,!
4)1*6&$2B!M&#*662!$%+!%&0%+,!6+C+6!)3!789"8:!1+'/,&F+'!
$%+!*/$5*6!1&*6)0!F+$.++#!$%+!5'+,!*#1!$%+!4*/%&#+!F2!
4+*#'!)3!*!3&#&$+!'$*$+!4*/%&#+B!72#/%,)#&<*$&)#!)3!
/)#$+#$!&'!*/%&+C+1!&#!*66).&#0!1+C+6)(+,'!$)!+H(,+''!
&#(5$!4)1*6&$2!,+6*$&)#'%&('!.&$%!$%+!1&33+,+#$!OS]^!
(,)(+,$&+'!;/)4(6+4+#$*,&$2-!*''&0#4+#$-!,+15#1*#/2-!
+K5&C*6+#/+>!(,+'+#$+1!&#!_`aB!
<triggers>
<trigger name="operation">
<source modality="speech" value="erase shape
| rotate shape | move shape"/>
</trigger>
</triggers>
<actions>
<action name="draw_action">
<target name="xpaint_client" message="draw
$oper $shape $posx $posy"/>
</action>
</actions>
<dialog>
<context name="modification">
<transition name="modif_clause">
<par_and>
<trigger name="operation"/>
<trigger name="selected shape"/>
<trigger name="position"/>
</par_and>
<result action=" draw_bidule"/>
</transition>
<transition>
<trigger name="return"/>
<result context="start"/>
</transition>
</context>
</dialog>
</integration_description>
</muiml>
figure 1. The three levels of SMUIML.
A small instance of a SMUIML script is shown below;
this instance is extracted from a use case describing a
drawing table, with tangible and speech input.
<?xml version="1.0" encoding="UTF-8"?>
<muiml>
<integration_description client="xpaint_client">
<recognizers>
<recognizer name="reactivision">
<variable name="posx" value="x" type="int"/>
<variable name="posy" value="y" type="int"/>
</recognizer>
</recognizers>
789"8:!'++='!$)!F5&61!)#!$%+!=#).6+10+!)3!(,+C&)5'!
*$$+4($'!*$!/,+*$&#0!456$&4)1*6!&#$+,*/$&)#!1+'/,&($&)#!
6*#05*0+'-!.%&6+!'$*2&#0!'&4(6+!*#1!+H(,+''&C+B!8)'$!)3!
$%+!*((,)*/%+'!(,+'+#$+1!F+6).!,+C)6C+!*,)5#1!$%+!
/)#/+($!)3!*!b456$&4)1*6!.+Fc-!+#3),/+1!F2!$%+!d),61!
d&1+!d+F!O)#'),$&54!;dVO>!856$&4)1*6!"#$+,*/$&)#!
S/$&C&$2!*#1!$%+!(,)()'+1!456$&4)1*6!*,/%&$+/$5,+B!@%+!
.),=!)3!$%+!dVO!&#'(&,+1!A*$'5,*1*!+$!*6B!3),!$%+&,!.),=!
)#!$%+!L"7:!L8:!6*#05*0+!_eaB!L"7:!3)/5'+'!)#!
'2#/%,)#&<*$&)#!)3!456$&4)1*6!&#(5$!*#1!)5$(5$-!*'!.+66!
*'!1&*6)0!36).!*#1!$,*#'&$&)#B!S'!'5/%-!89"8:!*#1!L"7:!
3)66).!*!/)44)#!0)*6-!F5$!789"8:!$+#1'!$).*,1!*!
'$,)#0+,!C+,'*$&6&$2!*#1!,+*1*F&6&$2B!S#)$%+,!*((,)*/%!&'!
$%+!)#+!)3!S,*=&!+$!*6B!_Na-!.%)!(,)()'+!8"8:!
(Multimodal Interaction Markup Language). One of the
key characteristics of this language is its three-layered
description of interaction, focusing on interaction, tasks
and platform. A similar three-layered approach has
been followed for SMUIML, but with a stronger accent on
modality relationships and synchronization. Finally,
Stanciulescu et al. [8] followed a transformational
approach for developing multimodal web user interfaces
based on UsiXML, also in the steps of the W3C. Four
steps are achieved to go from a generic model to the
final user interface. Thus, one of the main features of
their work is a strong independence from the actual
available input and output channels. But this versatility
comes at the cost of heavy preprocessing.
HephaisTK: a toolkit using SMUIML
HephaisTK, a toolkit using the SMUIML modelling language,
is intended to be a toolkit allowing rapid creation of
multimodal interfaces, offering a predefined set of
recognizers as well as the possibility to plug into the
toolkit any other modality recognizer, as long as it
complies with a given set of conditions, e.g.
communication with the toolkit by means of the W3C
EMMA language. In the future, HephaisTK will also offer
different fusion mechanisms to allow meaning from
incoming recognizers to be extracted and passed to
potential client applications.
"#!&$'!/5,,+#$!'$*$+-!?+(%*&'@A!&'!F5&6$!5()#!*!')3$.*,+!
*0+#$!'2'$+4B!^*/%!$&4+!*!#+.!,+/)0#&<+,!),!
'2#$%+'&<+,!&'!(6500+1!&#$)!$%+!$))6=&$-!*#!*0+#$!&'!
1&'(*$/%+1!$)!4)#&$),!&$B!?+(%*&'@A!5'+'!*!/+#$,*6!
F6*/=F)*,1!*,/%&$+/$5,+!;'++!M&0B!J>B!S!b()'$4*#c!
/+#$,*6&<+'!+*/%!4+''*0+!/)4&#0!3,)4!$%+!1&33+,+#$!
&#(5$!,+/)0#&<+,'!*#1!'$),+'!&$!&#$)!*!1*$*F*'+B!S0+#$'!
&#$+,+'$+1!&#!*!'(+/&3&/!$2(+!)3!4+''*0+!/*#!'5F'/,&F+!
!
32>;%*(ID!?+(%*&'@A!$))6=&$!*,/%&$+/$5,+B!
Other researchers investigated ways to create
multimodal toolkits. Bourguet [4] endeavoured in the
creation of a multimodal toolkit in which multimodal
scenarios could be modelled using finite state
machines. A similar approach was taken with the
modelization via SMUIML, but in a more modular way.
Bouchet et al. [3] proposed a component-based
approach called ICARE, thoroughly based on the
CARE [5] design space. These components cover
elementary tasks, modality-dependent tasks or generic
tasks like fusion. This component-based approach has
been used to create a comprehensive open-source
toolkit called OpenInterface [2], in which components
are configured via CIDL XML files and a graphical
editor.
Conclusion
Of the toolkits presented above, only OpenInterface is
widely available as open source software.
OpenInterface has been designed as an integrated
environment, in which every part of a multimodal
application has to be designed, from the input to the
output. The approach taken by HephaisTK and SMUIML
is radically different, in that the toolkit acts more as a
"plugin" to a given multimodal application, and only
manages the multimodal input part and its fusion.
The idea behind this is to let developers create their
application with their usual tools, easing the
development of the multimodal input part with easy-to-
program mechanisms, as opposed to an integrated
approach in which a developer would have to learn a
completely new environment from the ground up. Such
a modular approach allows easy prototyping of
multimodal applications; it is however to be noted that
creation of a full-fledged application could require a
more advanced approach such as the one of
OpenInterface.

To conclude, we believe in an approach with two tightly
linked components: on one side, human-machine dialog
description by means of an XML file, and on the other
side, a tool implementing this dialog by means of
multimodal fusion mechanisms. Finally, HephaisTK and
SMUIML are still a work in progress, and will benefit in
the near future from extensive validation. This
validation will take place in the context of the
MeModules (http://www.memodules.ch) project and
the Swiss NCCR IM2 project (http://www.im2.ch) [7].
References
!"#$ S,*=&-!8B-!@*/%&F*#*-!AB!856$&4)1*6!I&*6)0!
I+'/,&($&)#!:*#05*0+!3),!]*(&1!72'$+4!I+C+6)(4+#$B!
!"#$%&'(&)*+&,)*&-./0123&4#"56*#7&#8&916$#:"6+&280&
9123#;:+-!721#+2-!S5'$,*6&*-!W562!JQQeB!
!%#$ X+#)&$-!SB-!X)##*51-!:B-!O*(6&+,-!:B-!I*4)5'&'-!"B-!
@<)C*,*'-!IB-!W)5,1+-!MB-!i&0*2-!:B-!7+,,*#)-!8B-!
:*.')#-!WBEjB!856$&4)1*6!7&0#*6!Y,)/+''&#0!*#1!
"#$+,*/$&)#!3),!*!I,&C&#0!7&456*$),D!O)4()#+#$EF*'+1!
S,/%&$+/$5,+B!"#!W89"-!k)6!N-!i)!N!;JQQR>B!
!&#$ X)5/%+$-!WB-!i&0*2-!:B-!*#1!l*#&66+-!@B!"OS]^!
7)3$.*,+!O)4()#+#$'!3),!]*(&162!I+C+6)(&#0!
856$&4)1*6!"#$+,3*/+'B!"#!!"#$%&#(&.<=.>?@@AB!7$*$+!
O)66+0+-!Y+##'26C*#&*-!97S-!f/$)F+,!JQQgB!
!'#$ X)5,05+$-!8B!:B!S!@))6=&$!3),!O,+*$&#0!*#1!@+'$&#0!
856$&4)1*6!"#$+,3*/+!I+'&0#'B!"#!("#$%&#(&C.-D>@?B&
Y*,&'-!f/$B!JQQJB!
!(#$ O)5$*<-!WB-!i&0*2-!:B-!7*6F+,-!IB-!X6*#13),1-!SB-!
8*2-!WB!*#1!j)5#0-!]B!M)5,!^*'2!Y&+/+'!3),!S''+''&#0!
$%+!9'*F&6&$2!)3!856$&4)1*6!"#$+,*/$&)#D!@%+!OS]^!
(,)(+,$&+'B!"#!!"#$%&#(&.EDFGH<D>IJB&:&66+%*44+,-!
i),.*2-!W5#+!N[[`B!
!)#$ A*$'5,*1*-!AB-!i*=*45,*-!jB-!j*4*1*-!?B-!*#1!
i&$$*-!@B!JQQVB!L"7:D!*!6*#05*0+!3),!1+'/,&F&#0!
456$&4)1*6!&#$+,*/$&)#!'/+#*,&)'B!"#!!"#$%&#(&.<=.&
?@@KB!k*#/)5C+,-!X,&$&'%!O)654F&*-!O*#*1*-!i)CB!JQQVB!
!*#$ :*6*##+-!IB-!^C+K5)<-!MB-!]&0*4)#$&-!8B-!I54*'-!
XB-!"#0)61-!]B!SS#!+0)E/+#$,&/!*#1!$*#0&F6+!*((,)*/%!$)!
4++$&#0!&#1+H&#0!*#1!F,).'&#0B!"#!!"#$%&#(&=L=.M@,-!
X,#)-!O<+/%!]+(5F6&/-!W562!JQQRB!!
!+#$ 7$*#/&56+'/5-!SB-!:&4F)5,0-!mB-!k*#1+,1)#/=$-!WB-!
8&/%)$$+-!XB-!*#1!8)#$+,)-!MB!JQQ`B!S!$,*#'3),4*$&)#*6!
*((,)*/%!3),!456$&4)1*6!.+F!5'+,!&#$+,3*/+'!F*'+1!)#!
9'&L8:B!"#!!"#$%&#(&.<=.M@JB!@),+#$)-!"$*62-!f/$B!JQQ`B!
XSED: notations to describe status-event ubiquitous computing systems
Jair C Leite
Universidade Federal do Rio
Grande do Norte
Av. Sen. Salgado Filho, 3000
Natal, RN 59072-970 Brazil
jair@dimap.ufrn.br
Antonio Cosme
Universidade Federal do Rio
Grande do Norte
Av. Sen. Salgado Filho, 3000
Natal, RN 59072-970 Brazil
Abstract
Ubiquitous systems require a flexible and scalable
developing environment. Since they are spread in the
environment, they should deal with both status and
event phenomena. The XML Status-Event Description
(XSED) is a notation to describe configurable systems,
which can be developed by linking status-event
components. We have developed a software tool to
generate a configuration of SE components from XSED.
The configuration is itself a SE component wrapped as
a Java Bean that can be plugged in a middleware to
ubiquitous system.
antoniocosme@hotmail.com
Keywords
Status-event, XML, software languages, ubiquitous
computing
ACM Classification Keywords
D3.2. Language Classification: Very-high level
languages.
Introduction
Copyright is held by the author/owner(s).
CHI 2007, April 28 – May 3, 2007, San Jose, USA
ACM 1-xxxxxxxxxxxxxxxxxx.
Computing systems are becoming embedded in the
environment. They provide a natural and transparent
interaction where sensors provide input data by
monitoring users' activities and other environmental
variables. These data can also be used to control
actuators in the environment. The environment can be
a local site or a wide geographical area.
These ubiquitous systems need to be flexible and
scalable. They should be flexible to allow easy
modification of components, configuration, and control
and temporal information. They should be scalable to
allow new input and output components to be plugged
or unplugged.
In ubiquitous or pervasive computing environments,
sensors could monitor different kinds of phenomena.
Status phenomena such as temperature, pressure or sound are
those that are more permanent in time. Event
phenomena such as people entering a place or an
alarm ringing are those that happen at a particular
moment. Of course, at computing level, these sensor
values are translated into discrete data samples at
particular moments. However, at design time, it is
more appropriate to deal with those important
conceptual differences.
We are motivated by the idea that to deal with both
status and event information, ubiquitous systems
should be implemented as a collection of status-event
(SE) components. The SE components are linked
together following a configuration. It is possible to have
different configurations for the same collection of SE
components, defining different systems. New
components can be added to a configuration. The
properties of a link between two components are
described using annotations. Annotations are
descriptions of the rate and the initiative of data
sampling.
The objective of our project is to develop ubiquitous
systems based on configuration of SE components. In
order to achieve that goal, we have developed a
collection of notations using XML. The notations are
collectively called XSED (pron. exceed) – XML Status–
Event Description.
The goal of this paper is to present briefly the XSED
notations and a tool that we use to develop prototypes
using them. The tool parses the XSED files and
generates Java implementation of a configuration of
linked SE components. The implementation is by itself a
SE component wrapped as a Java Bean.
The XSED notations
A simple scenario illustrates an application of status-event systems. Consider a typical office or room where
temperature sensors provide information to control the
air-conditioner. The room has also sensors that inform
if someone enters or leaves the room. If the ambient
temperature is above a set value and there is someone
in the room, the system should turn the air-conditioner
on. The system recognizes registered persons when the
sensor triggers events. The system can also poll the
temperature sensor to get its status.
Using XSED, we describe the system using a
configuration of components. A configuration is a
collection of named component slots with typed
input/output nodes and linkage between them. In the
XSED configuration file, each component lists its inputs
and outputs each with a status/event tag and type (see
figure 1 for an example). A <links> section specifies
the connections between the components.
A small fragment of this configuration is described
using XSED in figure 1. The component named ‘world’
is the connection of the system with the environment.
Its input and output are the ones that link the system
with the environment.
The “person” component monitors whether someone enters or
leaves the room. There should be a component for each
person. It receives the events “person enters” and
“person leaves” that allow it to report the status of
the person. The component also fires the “movement”
output event to another component whenever someone
enters or leaves.
There are two more SE components in our example,
but they are not shown in figure 1. The “room-status”
component receives the status of all person
components and outputs a status informing whether there is
somebody in the room. The “turn-on” component is
triggered by movement event, which is the output
event of the person component. When an event
happens, this component checks the room-status and
the room-temperature. If there is someone in the room
and the temperature is above the set temperature this
component fires an event to turn the air-conditioner on.
Each component is described in a particular file using
XSED notation for components. A single SE component
has initial input and output declarations each of which
may be a status or event. This may be followed by
default status–status mappings giving output status in
terms of input status. The states also contain status–
status mappings. Finally the transitions describe the
effects of events in each state. Each transition has a
single input event that triggers the transition and a
condition.
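The component notation itself is not reproduced in this paper,
but by analogy with the configuration notation of figure 1, a
component file for “person” might plausibly look like this (the
elements for mappings, states and transitions are our guesses,
not the actual XSED schema):

<comp name="person">
  <input>
    <event name="person_enters" type="void" />
    <event name="person_leaves" type="void" />
  </input>
  <output>
    <event name="movement" type="void" />
    <status name="person_in" type="bool" />
  </output>
  <!-- Default status-status mapping. -->
  <map status="person_in" value="false" />
  <state name="out">
    <transition event="person_enters" target="in">
      <fire event="movement" />
    </transition>
  </state>
  <state name="in">
    <map status="person_in" value="true" />
    <transition event="person_leaves" target="out">
      <fire event="movement" />
    </transition>
  </state>
</comp>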
The components are dynamically linked. For instance,
when a component requests an input status, it
dynamically asks the configuration component for its
linked component. A detailed description is in [2].
<config>
<export name="world" />
<components>
<comp name="world">
<input>
<event name="person_enters" type="void" />
<event name="person_leaves" type="void" />
<status name="room-temp" type="float" />
<status name="set-temp" type="float" />
</input>
<output>
<event name="turn-on" type="int" />
</output>
</comp>
<comp name="person">
<input>
<event name="person_enters" type="void" />
<event name="person_leaves" type="void" />
</input>
<output>
<event name="movement" type="void" />
<status name="person_in" type="bool" />
</output>
</comp>
... more components ...
</components>
<links>
<status id="3562">
<from name="person" port="person_in" />
<to name="room-status" port="person_in" />
</status>
... more links ...
<event id="32">
<from name="person" port="movement" />
<to name="turn-on" port="movement" />
</event>
... more links ...
</links>
</config>
figure 1. XSED description of a configuration.
In the annotations description (not shown in this paper
due to limited space), unique link identifiers refer to
links in a configuration file and specify properties of
those links that can then be used to optimize the
runtime behavior of the system. The annotations
include: initiative, which defines whether status links
should be demand driven or data driven; time, which
defines the timeliness of a link; last-or-all, which states,
when a sequence of events is fired, whether all of them are
important or just the last; and synchronization, which
acknowledges that the timeliness annotations can mean that
events and status changes are not passed on in the
same temporal order as they are produced.
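A hypothetical annotations file for the two links of figure 1 might
then read as follows (attribute names are illustrative only; the
actual notation is defined in [2]):

<annotations>
  <!-- Status link from "person" to "room-status": demand driven,
       and no older than one second when read. -->
  <link ref="3562" initiative="demand-driven" time="1s" />
  <!-- Event link from "person" to "turn-on": data driven; if events
       queue up, only the last one matters, and ordering may be relaxed. -->
  <link ref="32" initiative="data-driven" last-or-all="last"
        synchronization="relaxed" />
</annotations>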
It is very easy to add new components. For example,
we can add new "person" components to monitor
different people. The room-status component can be
easily modified to inform the number of people in the
room (another status output).

Generating SE Components for the ECT Platform
The Equator Component Toolkit (ECT) [3] is a collection
of JavaBeans 'components' that provides a library of
tools for constructing interactive ubiquitous computing
applications. We have developed a software tool that
generates a Java Bean version of the SE Component to
run in the ECT platform [2]. SE components are
transformed into Beans using a wrapper class
generated from the schema. It will enable us to
experiment with the prototype using sensors, actuators
and other components that have existing
drivers/wrappers for ECT. For each input and output,
both status and event, a Java slot is provided, but these
are expected to be used differently depending on the
type of node.

Conclusion
XSED allows descriptions of systems that include both
status and event phenomena to be written naturally
and without having to prematurely transform the status
into discrete events. The notation separates
components, configuration, linking and annotation,
allowing reuse, scalability and flexibility. It also allows
global analysis (by hand as now, or in future
automated) to feed into optimization of the execution
over the infrastructure. We have been working on a
software tool that generates a core component to
process both status and events in ubiquitous systems.

Acknowledgements
The authors would like to thank Alan Dix and Adrian
Friday for previous support on the XSED definition. Jair
would like to thank the CAPES Brazilian Foundation for
financial support.
References
[1] Dix, A.: Status and events: static and dynamic
properties of interactive systems. in Proc. of the
Eurographics Seminar: Formal Methods in Computer
Graphics. Marina di Carrara, Italy. (1991).
[2] Dix, A., Leite, J., and Friday, A. (2007). XSED – XML-based Description of Status–Event Components and
Systems. In Proceedings of Engineering Interactive
Systems 2007, IFIP WG2.7/13.4 Conference,
Salamanca, March 22-24, 2007, LNCS (to appear).
[3] Greenhalgh, C., Izadi, S., Mathrick, J., Humble, J.
and Taylor, I. A Toolkit to Support Rapid Construction
of Ubicomp Environments. Proc. of UbiSys 2004.
Nottingham, UK. Available online at
http://ubisys.cs.uiuc.edu/2004/program.html.
UIDLs for Ubiquitous Environments
Fabio Paternò & Carmen Santoro
ISTI-CNR
Via Moruzzi 1
Pisa, 56124 Italy
{fabio.paterno, carmen.santoro}@isti.cnr.it
Abstract
In this paper we discuss some issues that seem
particularly relevant for the next generation of UIDLs
for ubiquitous environments. In particular, starting with
an analysis of the state of art in the area of XML
languages for multi-device environments we indicate
areas in which further research work is necessary in
order to obtain more general solutions able to address
the complexity underlying ubiquitous environments.
Keywords
Ubiquitous environments, Usability and Accessibility,
Guidelines, End user development.

ACM Classification Keywords
H5.m. Information interfaces and presentation

Copyright is held by the author/owner(s).
CHI 2007, April 28 – May 3, 2007, San Jose, USA
ACM 1-xxxxxxxxxxxxxxxxxx.
Introduction
Recent years have seen the introduction of many types
of interactive devices (e.g. cellphones, PDAs, WebTV,
etc.) and the availability of such a wide range of
devices has become a fundamental challenge for
designers of interactive software systems. Users wish
to be able to seamlessly access information and
services regardless of the device they are using, even
when the system or the environment changes
dynamically.
To address such issues a number of XML-based
languages (such as TERESA XML, USIXML, XIML) have
been proposed to represent user interfaces at different
abstraction levels (through model-based approaches)
and then enable transformations to generate
corresponding user interface implementations for
different languages and adapted to the resources
available in the target devices. Such concepts have
started to be adopted even in international standards,
such as the W3C XForms. In some cases even reverse
engineering transformations have been defined, which
are able to build logical descriptions starting with
existing implementations. This allows the reuse of
content already available for the design of new user
interface versions. Such transformations can be
incorporated in authoring tools (such as TERESA [7]) or
can be part of software infrastructures able to
dynamically generate user interfaces adapted to the
device at hand during the user session. However, many
issues have still to be solved in order to obtain general
solutions for user interfaces in ubiquitous
environments. Some of them are discussed in this
position paper.
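To make the idea concrete, the following sketch shows an abstract "selection" interactor and two possible concrete refinements that a transformation might produce. The syntax is invented for illustration and is not the actual TERESA XML, USIXML or XIML vocabulary:

<!-- Abstract level: a platform-independent interactor (hypothetical syntax) -->
<interactor id="destination" type="single-selection">
  <label>Choose a destination</label>
  <choices source="cities"/>
</interactor>

<!-- Concrete level: refinements a transformation might generate -->
<listbox ref="destination" visibleItems="10"/>  <!-- desktop: screen space available -->
<dropdown ref="destination"/>                   <!-- mobile phone: compact widget -->

The same abstract description is kept stable while the concrete refinements are regenerated for each target device.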
End User Development
One fundamental challenge for the coming years is to
develop environments that allow people without
particular background in programming to develop their
own applications. Indeed, the increasing interactive
capabilities of new devices have created the potential to
overcome the traditional separation between end users
and software developers. End User Development (EUD)
[4] is a set of methods, tools, and techniques that allow
people, who are non-professional developers, at some
point to create or modify a software artefact. In this
perspective, tasks that have traditionally been
performed by professional software developers are
transferred to the users, who need to be specifically
supported in performing these tasks. New
environments can be designed that move seamlessly
between using and programming (or customizing). In
the case of user interfaces obtained through
transformations starting from logical descriptions, the
issue is how to allow end users to customize the
transformation rules. This implies that such rules
must be declaratively specified externally to the software
performing the transformations, and that they be
represented and manipulated through representations
that are understandable for end users. In addition,
users should also be able to easily understand the
impact that modifications to the transformation rule
specifications have on the resulting user interfaces.
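As an illustration, an externally specified transformation rule might look like the following sketch; the vocabulary is hypothetical. Because the rule lives outside the transformation engine, an end user could change the widget choice or the threshold without touching any code:

<!-- Hypothetical, user-editable transformation rules -->
<rule name="short-selection">
  <when interactor="single-selection" maxChoices="5"/>
  <produce widget="radio-buttons"/>
</rule>
<rule name="long-selection">
  <when interactor="single-selection" minChoices="6"/>
  <produce widget="dropdown"/>
</rule>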
The overall goal is to reach natural development, which
implies that people should be able to work through
familiar and immediately understandable
representations that allow them to easily express and
manipulate relevant concepts, and thereby create or
modify interactive software artefacts. On the other
hand, since a software artefact needs to be precisely
specified in order to be implemented, there will still be
the need for environments supporting transformations
from intuitive and familiar representations into more
precise, but more difficult to develop, descriptions. In
this context Programming by Example (PBE) techniques
can be useful. They have existed since 1975; the basic
idea is to teach the computer new behaviour by
demonstrating actions on concrete examples. An
example of how to combine PBE and multi-device Web
interface design is discussed in [5].
Generating User Interfaces for All
The research work in model-based design and in the
accessibility area has some potential synergy, which
has not yet been investigated. On the one hand, model-based
approaches aim to provide user interface
descriptions where the logical function of each element
or construct is highlighted, removing useless
implementation details; on the other hand, disabled
users often need support in order to orient
themselves. For example, when blind users access
Web pages through screen readers they can scan
sequentially all the interface controls, or all the links, and
so on. However, they often need to first have a
general overview of the page content, which means a
logical description indicating where the heading, the
navigation bar, the search interface, the content part
and so on are located. Accessibility aims to increase the
number of users who can access a system; this means
removing any potential technical barrier that prevents the
user from accessing the information. Usability aims to make
the user interaction more effective, efficient and
satisfactory. Guidelines able to integrate accessibility
and usability for specific classes of users, such as
vision-impaired users [3], have been developed and
specified in XML-based languages. They have been
developed in such a way that they indicate the
user interface features that allow the information and
the associated services to be easily accessed by users
with special needs. In the accessibility area some tools
(for example MAGENTA [2]) able to check guidelines
specified in XML externally to the tool implementation
have been developed as well.
At this point, it would be interesting to integrate
model-based transformations with
accessibility and usability guidelines in such a way that
the transformation rules can dynamically generate
user interfaces depending on the target
users, by applying the set of guidelines that are most
relevant for the current users and devices. Such
transformations could be dynamically enriched by
adding new guidelines expressed through the same
guideline specification language, without requiring
changes in the implementation of the software tool
supporting the transformations that generate the user
interfaces.
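A guideline externalized in this way might be sketched as follows; the format is hypothetical and only loosely inspired by the kind of XML-specified checks that tools such as MAGENTA perform:

<!-- Hypothetical guideline specification, checkable by an external tool -->
<guideline id="page-overview" userGroup="blind-users">
  <description>Every page must expose a logical structure with
    heading, navigation bar, search and content parts.</description>
  <check element="page" requiresChildren="heading navigation search content"/>
</guideline>

New guidelines of this form could be added to the transformation's rule set without modifying the generator itself.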
Generating User Interfaces for Dynamically
Composed Web Services
Service-oriented architecture (SOA) is an architectural
style whose goal is to achieve loose coupling among
interacting services. SOA enables dynamic, flexible
applications which can change rapidly by
integrating new services as well as old legacy systems.
Web service technology makes services available
independently from the implementation of a particular
application. There is a need for improvements in web
service technologies regarding the semantic
description of service functionality, service operations
and operation parameters, extending the descriptions
in existing standards, such as the Web Service Description
Language (WSDL), in order to better support user
interface generation.
One interesting research area is how to generate user
interfaces for dynamically composed web services.
Generating user interface elements for services is the
topic of several research efforts but there are still many
open issues. For example, WSGUI deals with GUI
inference from Web service descriptions, manually
adding GUI hints to overcome the limited self-description
of services [8]. There is a need for more general
solutions able to take Web service descriptions, which
can be dynamically composed, and generate the
corresponding user interface adapted to the context of
use. This can be useful for example in domotic
applications, in which middleware for domotic
interoperability (such as Domonet [6]) can abstract a
wide variety of domotic appliances (including media
centers, security systems, light and temperature
controls and so on), which may use various
communication systems (Konnex, UPnP, BTicino, …),
and make their functionalities and states available
through Web services. Such services dynamically
depend on the domestic appliances available and
enabled. At this point, a run-time environment should
be able to dynamically generate user interfaces for
various possible interaction devices allowing users to
control the appliances populating their home anywhere.
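The idea of enriching service descriptions for UI generation can be sketched as follows; the guiHints element and its contents are invented for illustration and do not reproduce the actual WSGUI vocabulary:

<!-- Fragment of a WSDL-like operation enriched with hypothetical GUI hints -->
<operation name="setTemperature">
  <input message="tns:setTemperatureRequest"/>
  <guiHints>
    <param name="room" widget="single-selection" label="Room"/>
    <param name="degrees" widget="slider" min="15" max="30" unit="C"/>
  </guiHints>
</operation>

A run-time generator could combine such hints with the context of use to render, for example, a slider on a PDA or a dial on a wall panel.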
Migratory User Interfaces
Migratory user interfaces allow users to dynamically
change device and continue task performance from the
point they left off on the source device. While some
research work has been dedicated to supporting migration
from one device to another [1] or to partially
migrating a user interface (for example, showing
content on a large screen while controlling interaction
through a mobile device), further
research is needed to support migration when it involves
multiple source and target devices and when it involves
multi-user applications, in which each user can change
device. In the case of multiple devices, there is a need
to support coordination among them in such a way that
the task distribution is as usable as
possible and the state can be efficiently preserved
when moving from one set of devices to another.
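One way to picture the state that has to be captured and restored during migration is a snapshot such as the following; the format is hypothetical and not taken from an existing system:

<!-- Hypothetical snapshot of interaction state for migration -->
<migration-state task="hotel-booking">
  <source device="desktop"/>
  <target devices="phone largeScreen"/> <!-- partial migration: control vs. content -->
  <field id="destination" value="Pisa"/>
  <field id="nights" value="3"/>
  <focus interactor="payment-method"/>
</migration-state>

Preserving such a snapshot across heterogeneous devices is exactly where the coordination problems mentioned above arise.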
Conclusions
While a number of XML languages for supporting multi-device
interfaces have been proposed, with associated
tools, the complexity of ubiquitous environments still
requires better solutions that address the relevant
aspects in order to obtain usable and accessible user
interfaces.
Some of the issues that need to be particularly
considered are introduced and briefly discussed in this
position paper. The overall goal is to obtain user
interface description languages for ubiquitous
environments, which allow for expressing and
modelling relevant aspects in such environments:
interdependencies and configuration options of
ubiquitous technologies, their dynamic behaviour, their
privacy/visibility effects, their reliability aspects,
together with device and environment descriptions and
user-related aspects, and their relationships.
References
[1] R.Bandelloni, G.Mori, F.Paternò, Dynamic
Generation of Migratory Interfaces, Proceedings Mobile
HCI 2005, ACM Press, pp.83-90, Salzburg, September
2005.
[2] B.Leporini, F.Paternò, A.Scorcia, Flexible Tool
Support for Accessibility Evaluation, Interacting with
Computers, Vol.18, N.5, 2006, pp.869-890, Elsevier.
[3] B.Leporini, F.Paternò, Increasing Usability when
Interacting through Screen Readers, International
Journal Universal Access in the Information Society
(UAIS), Springer Verlag, Vol.3, N.1, pp.57-70, 2004.
[4] H.Lieberman, F.Paternò, W.Wulf (eds), End-User
Development, Springer Verlag, ISBN-10 1-4020-4220-5, 2006.
[5] J. A. Macías, F. Paternò, Customization of Web
applications through an intelligent environment
exploiting logical interface descriptions, Interacting with
Computers, In Press, Corrected Proof, Available online
6 August 2007.
[6] Miori, V., Tarrini, L., Manca, M., Tolomei, G., An
open standard solution for domotic interoperability,
IEEE Transactions on Consumer Electronics, Volume
52, Issue 1, pp. 97-103.
[7] G. Mori, F. Paternò, C. Santoro, Design and
Development of Multi-Device User Interfaces through
Multiple Logical Descriptions, IEEE Transactions on
Software Engineering, August 2004, Vol.30, N.8,
pp.507-520, IEEE Press.
[8] Spillner, J.; Braun, I.; Schill, A.: Flexible Human
Service Interfaces, ICEIS 2007, accepted for publication.
D. UIDLs for Task Analysis and Virtual Environments
Gerrit Meixner, Nancy Thiels
University of Kaiserslautern
Tool Support for Task Analysis
Volker Paelke
Leibniz Universitaet Hannover, IKG
Spatial Content Models and UIDLs for Mixed Reality Systems
Chris Raymaekers, Lode Vanacken, Joan De
Boeck, Karin Coninx
Hasselt University
High-Level Descriptions for Multimodal Interaction in Virtual
Environments
Chadwick Wingrave
Virginia Tech
Chasm: A Tiered Developer-Inspired 3D Interface Representation
Tool Support for Task Analysis
Gerrit Meixner
University of Kaiserslautern
Gottlieb-Daimler-Str. 42
67663 Kaiserslautern, Germany
meixner@mv.uni-kl.de
Nancy Thiels
German Research Center for Artificial Intelligence (DFKI)
Trippstadterstr. 122
67663 Kaiserslautern, Germany
nancy.thiels@dfki.de
Abstract
In this paper our user-centered model-based user
interface development process (MBUID) will be
introduced with a clear focus on the analysis phase.
Getting task information from users is a challenging
problem, which is even more complex for user interface
designers, because they have only little knowledge
about task modeling. Well-known analysis methods will
be combined with the support of an analysis tool, which
helps entering, structuring, evaluating and exporting
user information, e.g. tasks, preferences and structural
mental models. All information is stored in a project
database where developers can access information
during every phase in the development process.
Information is classified into different groups, where
every group has different export mechanisms. Results
from the analysis phase are used in further phases,
thus there is no media break anymore. Future user
interface description languages (UIDL) then have a
clear starting point for developing user-centered user
interfaces.
Keywords
User-centered design, user-interface design, MBUID,
task models, information processing, useML
ACM Classification Keywords
H5.2. Information interfaces and presentation (e.g., HCI): User Interfaces
Copyright is held by the author/owner(s).
CHI 2007, April 28 – May 3, 2007, San Jose, USA
ACM 1-xxxxxxxxxxxxxxxxxx.
Introduction
In recent years user interface (UI) developers have
recognized that future users and their tasks are the
most important factor for the success of a UI. For
developing user-centered software, new processes had
to be developed. Normally a user-centered design
process starts with the analysis of user tasks [1].
These tasks can be modeled as a task model with the
notations of e.g. the Useware Markup Language (useML)
[2] or ConcurTaskTree (CTT) [3]. Task models are a
very useful starting point for the development of
further models in MBUID and help to guarantee a user-centered
design [4]. Different approaches exist which
generate the presentation model out of the task model
[5]. A user interface description language (UIDL) is
one method to cover the requirements of a
presentation model. Today, most UIDLs are based upon
XML and many tools exist which help developers to
support parts of MBUID, e.g. TERESA [6]. One of the
main problems in developing task models still remains
– how to get the knowledge of a domain expert into
MBUID [1]. In this paper one possible software-supported
solution is shown.
This position paper is structured as follows: the
following section gives a short introduction to the user-centered
MBUID process with a focus on the analysis of
user tasks and requirements. Then the developed
analysis tool is illustrated. Finally some directions
for further work are discussed.
Useware Engineering Process
The level of acceptance and efficiency of a modern user
interface are not least determined by the ease of use.
System development has been advanced by the
Useware engineering process [8]. This process is made
up of the following phases: analysis, structural design,
design, implementation and evaluation (see Fig. 1). The
individual phases are not to be viewed in isolation from
one another, but rather as overlapping. The
evaluation, as a continuation of the analysis, parallels
the whole process [7].
Figure 1: The Useware engineering process.
The primary considerations in this evolutionary process
are always the requirements and needs of the user for
whom the user interface is being developed. This is the
only guarantee for an efficient use of the system.
Different methods can thereby be used to gather the
relevant data: first of all, user interviews are the best
method for this purpose, as they can be designed very
flexibly and a huge amount of different information can
be obtained. In contrast to that, questionnaires are
static and require much development time beforehand;
therefore their usage in this context is rare. To survey
mental models, card sorting provides a valuable basis
for the following structuring phase. After the data
survey the evaluation takes place using three
different data categories: quantitative, qualitative and
structural. The latter results from the mental models of
the questioned users. The other two are obtained by
different questioning methods during the analysis:
qualitative data by single user statements; quantitative
data by univocal answers. Structural data and the task
models resulting from it form the basis to derive a usage
model within the next phase. This structuring phase is
characterised by useML [2], which consists of use
objects and elementary use objects, i.e. 'change',
'release', 'select', 'enter' and 'inform'. These objects
describe actions, operations and activities which were
obtained in the analysis phase. With these use objects
and five elementary use objects it is possible to develop
the whole structure for the final user interface. The
result of the structural design phase is a platform-independent
model which provides the foundation for
the following design phase. Within this phase the use
concept for the selected hardware platform is prepared
on the basis of the conducted analysis and the use
model developed. Furthermore, a global navigation
concept, a draft of the related usage structure as well
as a proposed layout is developed. The result of the
design phase is a specific layout of the user interface
system. The layout focuses on standard user tasks and
the usage structure. Simultaneous evaluation during all
of the formerly mentioned phases enables users to
track and assess the development progress at all times
on the basis of structures or prototypes. Therefore
timely response to desired changes or modifications is
possible. The evaluation includes user surveys to
determine the validity of the results of structuring and
designing.
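A task model built from these primitives might be sketched as follows; the element names approximate the useML vocabulary described above and are not guaranteed to match the real schema:

<!-- Sketch of a useML-style task model (approximated element names) -->
<useObject name="Set up machine">
  <useObject name="Configure program">
    <select name="Select program from list"/>
    <enter name="Enter batch size"/>
    <change name="Change feed rate"/>
  </useObject>
  <useObject name="Run production">
    <inform name="Check status display"/>
    <release name="Release start command"/>
  </useObject>
</useObject>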
Analysis tool support
Given the multiplicity of the information gathered in
the analysis phase (see the section on the Useware
engineering process), it is only possible to keep an
overview by using an electronic data store. The
developed analysis tool 'useDATA' (see Fig. 2) thereby
serves the creation of a central, development-specific
database which guarantees above all the complete
supply of all data produced during the development
process. The developer is able to use the tool for
preparing, admitting and passing data to further
phases of the development process.
Figure 2: Screenshot of the analysis tool 'useDATA'.
The collection and evaluation of data can be handled
in parallel with the analysis tool. At any time it is possible
to examine and concretize partial results (see Fig. 2).
The collection and evaluation of data within 'useDATA'
takes place for each data category (see Fig. 3).
'useDATA' includes export mechanisms for presentation
and documentation data as well as for task models (see
Fig. 3). The developer has the possibility to export
qualitative data, i.e. text (in Rich Text Format),
quantitative data, i.e. diagrams (as Portable Network
Graphics), and structural data, i.e. task models (useML).
Qualitative and quantitative data can be used for the
presentation and documentation of acquired
information during the analysis phase. Structural data
in terms of task models can be enriched in further
development phases.
Figure 3: Input and output of different data categories.
Conclusions and future work
In this paper the Useware engineering process with a
focus on the analysis phase was introduced, as well as
the 'useDATA' tool. 'useDATA' was tested in several
industrial projects and helped to develop task models.
It has improved the structured analysis and reduced
the amount of work for entering, structuring, evaluating
and exporting data to further phases of the MBUID
process. Especially future UIDLs may benefit from a solid
database of user information. Thus 'useDATA' bridges
a part of the gap between research in MBUID and
practical work.
At the moment an XML-based language for the storage
of information is being developed. The relational database
storage concept will be replaced by the Useware Data
Description Language (useDDL). Developers can then easily
view, edit and exchange analysis data and don't need
to extract data out of the database. Future versions of
'useDATA' should be multilingual, and another feature
could be the integration of an export mechanism from
task models to CTT.
References
[1] Tam, R.C., Maulsby, D. and Puerta, A.R. U-TEL: A
Tool for Eliciting User Task Models from Domain
Experts. In Proc. IUI 1998, pp. 77-80.
[2] Reuther, A. useML – Systematische Entwicklung
von Maschinenbediensystemen mit XML. No. 8 in
Fortschritt-Berichte pak, University of Kaiserslautern,
2003.
[3] Paternò, F. Model-Based Design and Evaluation of
Interactive Applications. Springer Verlag, 1999.
[4] Johnson, P., Johnson, H. and Wilson, S. Rapid
Prototyping of User Interfaces driven by Task Models.
In Scenario-Based Design: Envisioning Work and
Technology in System Development, John Carroll (ed.),
John Wiley & Sons, pp. 209-246, 1995.
[5] Clerckx, T. and Coninx, K. Integrating Task Models
in Automatic User Interface Generation. Technical
Report TR-LUC-EDM-0302, EDM/LUC Diepenbeek,
Belgium, 2003.
[6] Mori, G., Paternò, F. and Santoro, C. Design and
Development of Multidevice User Interfaces through
Multiple Logical Descriptions. IEEE Transactions on
Software Engineering, vol. 30, no. 8, 2004, pp. 507-520.
[7] Bödcher, A. Methodische Nutzungskontext-Analyse
als Grundlage eines strukturierten USEWARE-Engineering-Prozesses.
No. 14 in Fortschritt-Berichte pak, University of
Kaiserslautern, 2007.
[8] Zühlke, D. Model-Based Development of User
Interfaces: A New Paradigm in Useware Engineering.
In Proc. 10th IFAC/IFIP/IFORS/IEA Symposium on
Analysis, Design, and Evaluation of Human-Machine-Systems, 2007.
Spatial Content Models and UIDLs for
Mixed Reality Systems
Volker Paelke
Institute for Cartography and Geoinformatics
Leibniz Universität Hannover
Appelstr. 9a
D-30167 Hannover, Germany
Volker.paelke@ikg.uni-hannover.de
Abstract
In Mixed Reality (MR) interfaces spatial information
plays a central role. The integrated consideration of
dynamic, functional and spatial characteristics will be
essential for the creation of UIDLs for MR interfaces and
similar next-generation UIs.
Keywords
Mixed Reality, augmented reality, spatial information,
content models, UIDL
Introduction
Copyright is held by the author/owner(s).
CHI 2007, April 28 – May 3, 2007, San Jose, USA
ACM 1-xxxxxxxxxxxxxxxxxx.
Our experience with the development of Mixed-Reality
(MR) systems for a variety of domains (e.g. illustration,
entertainment, museums and navigation [2,3,4]) shows
that changing base-technologies make it hard to
maintain MR systems in a working state and prevent
reuse across applications. The description of MR
interfaces in an application and technology independent
model is a prerequisite for the development of generic
tools for content creation and management, and key to
the reuse of MR content across different hardware
platforms. One central feature that MR systems share
with many other types of novel interaction styles is the
focus on spatial content – in MR systems a spatial
model of both the virtual elements presented in the
interface as well as the real-world surroundings is
required. Similar requirements exist in tangible user
interfaces, 3D interaction and virtual reality systems
and spatial features are also a central characteristic of
the context exploited in context-aware interfaces.
In recent work [5] we have examined the adaptation of
existing spatial data models for MR applications and
have identified a number of requirements for the
description of spatial MR content. One aspect that is
largely missing from current spatial data models is
dynamic behavior, which is the defining element of
interactive user interfaces. UIDLs, on the other hand,
have addressed UI specification from a high-level
functional perspective, focusing explicitly on the
specification of dynamic behavior. Given the historic
development of UIDLs from early UIMS for desktop
environments, spatial characteristics have only been
considered to a limited extent, e.g. in the arrangement
of UI elements by automated layout planners. In MR
interfaces consideration of dynamic, functional and
spatial characteristics will be essential for the creation
of content models for future MR interfaces. For the
workshop we aim to identify a set of common
requirements for UIDLs that are relevant to a variety of
next generation user interfaces and hope that our
experience with MR and geoinformatics in general and
spatial data-models in particular will provide an
interesting supplement to current UIDL work.
Motivation
Figure 1: Exemplary MR applications using spatial contents.
Mixed Reality (MR) integrates interactive computer
graphics into real-world environments. In contrast to
traditional computer graphics applications where the
complete graphics representation is generated from an
internal model, MR applications combine computer
generated graphics, audio and other multimedia
information with the information of the real-world
surroundings of the user. While MR may seem like the
ideal solution for many user interface problems, the use
of mixed reality in real applications has been very
limited so far. In the past, technological limitations
(e.g. the lack of MR displays with adequate resolution
or of precise spatial positioning techniques) were a
major problem. However, with
maturing base technologies the question arises: why
are practical applications still largely missing?
One observation from our experience in MR projects is
that constantly changing base technologies (e.g.
graphics libraries, tracking systems and toolkits) make
the maintenance of working applications problematic
and prevent reuse of MR content. A central reason for
this is the integration of MR content with the application
implementation into a monolithic structure. While
libraries and toolkits are available to support MR
application developers there is currently no
standardized way to create, exchange and maintain MR
content. Maintaining the augmentation content in an
internal, application-specific data structure was effective
and useful in early technology demonstrators, but it
now prevents the reuse of existing content in other
applications, requires reimplementation of MR
applications to adapt to new hardware platforms, and
makes the creation of distributed multi-user
applications and application-independent tools difficult.
As in other areas it would therefore be useful to
separate MR content description from the application
implementation. The description of MR content in an
application- and technology-independent model would
allow the creation of generic tools for content creation and
management, enable the reuse of MR content across
different hardware platforms or in different
applications, and support the shared simultaneous use
of an MR application by multiple users.
Requirements of MR-UIs
All mixed reality applications require some model of the
real physical surroundings (real world model) as well as
an internal representation of the augmentation
information (augmentation model). A common
approach in many MR demonstrator applications is to
represent the augmentation information by placing the
corresponding geometry in the scene-graph
datastructure of the underlying graphics system. The
real world model is often implicitly defined in the MR
program by tying information and actions to spatial
positions provided by tracking sensors. Implicit models
that combine real world models and augmentation
model have been defined as part of MR toolkits such as
DWARF [1], Studierstube [6] and I4D [3]. A problem
with implicit models is that they do not allow explicit
interaction with the model content, are often system
specific, not documented in detail and can change
between different versions of a system. The use of
explicit models has many advantages and enables the
creation of system independent authoring tools.
Static models alone are not sufficient to describe MR
interfaces. Dynamic content and behavior are central to
interactive applications and form a central part of MR
interfaces. In MR systems these dynamics can be
caused by changes in the environment as well as
explicit user actions. Similar to static content the
dynamic and interactive behavior of MR applications is
currently often defined implicitly in the system. To
enable the effective creation of dynamic interactive MR
systems with tool support, the explicit modeling of
system dynamics is required. UIDL-based approaches
form the basis for such solutions in a number of
domains. However, a direct transfer to MR interfaces is
difficult because in many cases the interaction logic
cannot be considered separately from the real world and
augmentation models, e.g. in cases where the presence
of specific physical objects or the spatial locations of
therefore see a central challenge in the tight coupling
or direct integration of UIDL based dynamics
specifications with spatial content models for MR
systems. We believe that such an approach has
applications not only in a wide variety of MR systems
but that the integration of spatial aspects would also be
useful in related areas like tangible user interfaces and
context-aware interfaces.
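A rough sketch of what such a coupled description could look like is given below; both the spatial and the behavioral vocabulary are invented for illustration and do not correspond to any existing toolkit:

<!-- Hypothetical MR content model coupling spatial content and behavior -->
<mr-scene>
  <real-object id="machine" trackedBy="marker-7"/>              <!-- real world model -->
  <augmentation id="manual" type="overlay3d" anchor="machine"/> <!-- augmentation model -->
  <behavior>
    <trigger type="proximity" subject="user" object="machine" distance="1.5m"/>
    <action show="manual"/> <!-- UIDL-style dynamics tied to spatial context -->
  </behavior>
</mr-scene>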
References
[1] Bauer, M.; Bruegge, B.; Klinker, G.; MacWilliams,
A.; Reicher, T.; Riß, S.; Sandor, C.; Wagner, M.:
Design of a Component-Based Augmented Reality
Framework, in: Proc. ISAR 2001, October 2001.
[2] Brenner, C., Paelke, V., Haunert, J. & Ripperda, N.:
The GeoScope. In: UDMS'06, Proc. of the 25th Urban
Data Management Symposium, Aalborg, Denmark, May
2006
[3] Geiger, C.; Paelke, V.; Reimann, C.; Rosenbach,
W.: A Framework for the structured design of VR/AR
content. In: Proc. ACM VRST, 2000, Seoul, Korea.
[4] Paelke, V., Stöcklein, J., Reimann, C. & Rosenbach,
W.: Mixed Reality Authoring for Content Creators. In:
Proc. Simulation and Visualisation 2004, Magdeburg,
Germany, March 2004.
[5] Paelke, V.: Spatial Content Models for Augmented
and Mixed Reality. In: Proc. Augmented und Virtual
Reality in der Produktentstehung, Paderborn,
Germany, May 2007
[6] Schmalstieg, D.: Collaborative Augmented Reality,
Habilitation Thesis, Vienna University of Technology,
2001.
High-Level Descriptions for Multimodal
Interaction in Virtual Environments
Chris Raymaekers, Lode Vanacken, Joan De Boeck, Karin Coninx
Hasselt University and transnationale Universiteit Limburg
Expertise Centre for Digital Media
Wetenschapspark 2
3590 Diepenbeek, Belgium
{chris.raymaekers, lode.vanacken, joan.deboeck, karin.coninx}@uhasselt.be
Abstract
Designing non-traditional user interfaces is a
challenging task for designers. NiMMiT, a high-level
description for 3D multimodal interaction in virtual
environments, provides a means to design, prototype
or communicate about interaction techniques. The
focus is on making it possible for designers to create
new interaction techniques while lowering
implementation efforts.
Keywords
Multimodal, interaction techniques, high-level
descriptions
Copyright is held by the author/owner(s).
CHI 2008, April 5 – 10, 2008, Florence, Italy
ACM 1-xxxxxxxxxxxxxxxxxx.
Introduction
User interface design for non-WIMP interfaces is a real
challenge; tools for the creation of such interfaces are
rather seldom. A 3D (multimodal) user
interface for a virtual environment does not only consist of
traditional WIMP interface elements. The user also
needs to be able to interact with the virtual
environment such that he/she can navigate the virtual
world or select and manipulate virtual objects.
In order to facilitate the design of these multimodal
interaction techniques, high-level descriptions have
been developed, such as InTml, ICO and NiMMiT.
In the following sections we will discuss NiMMiT, our
own high-level description, followed by an example.
Problems and drawbacks of our notation are thereafter
elaborated on.
NiMMiT
NiMMiT, Notation for MultiModal Interaction Techniques,
is a graphical notation, inheriting the formalism of a
state-chart in order to describe multimodal interaction
within virtual environments. Furthermore, it also
supports dataflow, which is important in the user
interaction as well. A more detailed description of
NiMMiT can be found in [1]. We shortly describe the
most important primitives of NiMMiT. An example of a
NiMMiT diagram can be seen in figure 1.
NiMMiT is basically a state chart, in which a state
(represented as a circle) indicates the possible events
the user can provide and to which the application
listens.
An event is an action a user can perform, such as
moving a pointing device, speaking a command,
clicking a button, etc. When an event or a combination
of events has occurred, the associated arrow points to
a task-chain (big rectangles) that is to be executed.
A task-chain is a linear succession of tasks that are
executed one after the other. When a task-chain is
finished, a state-transition occurs (light green arrow),
bringing the interaction into a new state, responding to
another set of events.
A task (smaller rectangle in a task-chain) is a set of
actions defined to 'reach a goal'. Tasks are mostly
predefined, such as querying device positions and
calculating collisions, in order for the designer to easily
pick them from a list. For specialised actions, however,
custom tasks can be written either using LUA script or
C++ code.
NiMMiT also supports dataflow between different tasks.
Labels (high-level variables) are used to save output
from a task (output ports are depicted as small squares
at the bottom right of the task symbol), or to provide
input to a task (input ports are depicted at the top-left
of a task).
In order to support the evaluation of interaction
techniques, NiMMiT employs a mechanism called
probes (not depicted in figure 1). These allow
measuring the different states, transitions and labels at
different moments during the interaction [2]. This
information can further be filtered and logged to disk.
Recently, we also added support for contextual and
semantic information [3]. Information from an
ontology describing the virtual world can be used with
this extension of NiMMiT to define extra constraints
within an interaction technique. For example, with an
interaction for opening a door, one can specify in the
NiMMiT diagram that the object being manipulated
must be a door, without having to hard-code this
constraint. By changing the constraint, we can for
instance also use this interaction technique for opening
windows.
Example
An example of a NiMMiT diagram representing a grab
metaphor is depicted in figure 1. In this interaction
technique the user first selects the object through a
selection technique. Afterwards the object moves
according to the movements of the pointing device. If
the user is satisfied with the new position, the object
can be released with a button press.
The start state (Start) uses the Idle event, which is
fired immediately. This triggers the Selection task-chain,
which performs a selection technique, namely ray-casting,
which in itself is defined by a NiMMiT diagram.
The result is stored in the label selectedObject. After
execution of the task-chain a state transition is
performed to the manipulation state. This state listens
to two events: a button press and a move event from a
pointing device. The move event triggers the move-Object
task-chain, which moves the selected object
according to the movement of the pointer device. If
the button is pressed, the deselect task-chain is executed,
which deselects the selected object and ends the
interaction technique.
Figure 1: An example of a NiMMiT diagram, representing a
grab interaction technique. (Legend: state (start), task-chain,
hierarchical task, event, label, predefined task.)
Strengths
NiMMiT allows the designer to communicate about the
functionality of an interaction technique through an
easy-to-read diagram which can be the basis for
exploratory prototyping (with other stakeholders).
NiMMiT diagrams can be created using a tool called
CoGenIVE, which can save the diagrams in an XML
format. As this format can be interpreted by our
application framework for virtual environments, we can
support prototyping of multimodal interaction
techniques.
The combination of state charts and the dataflow
mechanism in NiMMiT works very well for designing
interaction: typically, during interaction the user reaches
different interaction states and produces data as a
result of interacting, which sometimes has to be
transferred for further usage.
Finally, hierarchical reuse of interaction techniques is
supported. An interaction technique can function as a
task as well, allowing the reuse of earlier developed
interaction techniques.
Problems and Drawbacks
Experience from using NiMMiT has brought up some
problems and drawbacks. We will discuss them in this
section.
In the example of figure 1 the user's first task is to
perform a selection of an object. If the user does not
succeed in selecting an object, a rollback mechanism is
executed to undo the steps already taken. For this
purpose, every task contains the information needed to be
able to 'unperform' itself, but unfortunately this is not
always very straightforward. For example, if the user
deleted an object during a physical simulation, the
entire simulation would have to move back in time.
Alternatively, the designer could indicate how to handle
the errors or introduce a cancellation process, but this
would increase the design complexity.
For complex interaction techniques, state-explosion, a
common problem in state diagrams, can occur. A
solution for this problem is not straightforward and we
are exploring the possibility of using preconditions at a
task-chain or event level.
Another aspect that needs more attention is the
incorporation of output modalities, such as haptics and
audio. For now, these are always added in a diagram
through the addition of a custom task which is scripted
or coded. Thus, if a designer would like to add a certain
force feedback effect during a manipulation interaction
technique, a custom task has to be designed which
creates the appropriate force feedback. A better
approach for adding such types of feedback is necessary,
as visual feedback is currently better covered.
Status and Future Work
In our current framework and model-based
development process NiMMiT is continuously being
used. The tool CoGenIVE makes it possible to design
and execute NiMMiT diagrams interactively. In the
future we would like to concentrate on the problems
and drawbacks discussed earlier, in particular how to
solve the state explosion problem elegantly and how to
handle errors intuitively. The integration of semantic
information in NiMMiT seems to be a valuable feature.
It might become even more beneficial if automatic and
dynamic coupling to predefined tasks existed, instead of
having to use predefined tasks to introduce semantics.
Acknowledgements
Part of the research at the Expertise Centre for Digital
Media is funded by the ERDF (European Regional
Development Fund), the Flemish Government and the
Flemish Interdisciplinary institute for Broadband
Technology (IBBT). The VR-DeMo project (IWT 030248)
is directly funded by the IWT, a Flemish subsidy
organization.
References
[1] De Boeck, J., Vanacken, D., Raymaekers, C. and
Coninx, K. High-Level Modeling of Multimodal
Interaction Techniques Using NiMMiT. Journal of
Virtual Reality and Broadcasting, 4(2), non-periodical,
ISSN 1860-2037.
[2] Coninx, K., Cuppens, E., De Boeck, J. and
Raymaekers, C. Integrating support for usability
evaluation into high level interaction descriptions with
NiMMiT. DSVIS 2006, LNCS, volume 4323, pp. 95-108.
[3] Vanacken, L., Raymaekers, C. and Coninx, K.
Introducing Semantic Information during Conceptual
Modelling of Interaction for Virtual Environments. MISI
2007 (WS at ICMI 2007) (in press).
Chasm: A Tiered Developer-Inspired 3D
Interface Representation
Chadwick Wingrave
Center for Human Computer Interaction
Virginia Tech
KWII Building, 2202 Kraft Drive
Blacksburg, VA 24061 USA
cwingrav@vt.edu
Abstract
3D interface implementation is complex and made more
difficult with each additional feature. After investigating
the envisioned behaviors of 3D interface developers in
language, artifacts and interviews, a tiered
representation based upon developer artifacts and
language was created. Chasm allows for development
as tiered cohesive concepts and the execution as flows
similar to developer envisioned behavior. Chasm has
been used by multiple developers in case studies.
Keywords
3D interaction, User Interface Description Language,
Model-Driven Engineering, User Interface Management
System
ACM Classification Keywords
D.2.2 [Software Engineering]: Design Tools and
Techniques – Computer-aided software engineering
(CASE), Object-oriented design methods, User
interfaces; H.5.2 [Information Interfaces and
Presentation]: User Interfaces – Graphical User
Interfaces.
Copyright is held by the author/owner(s).
CHI 2007, April 28 – May 3, 2007, San Jose, USA
ACM 1-xxxxxxxxxxxxxxxxxx.
Introduction
I am not interested in content creation, but interaction
creation. This entails behaviors in the environment, of
the user or of the interface, brought about by changes
by the user, environment, interface and time. Also, I
am not interested in rapid prototyping of a simple
interface. What I am interested in is a representation
allowing for deep access to interaction details and for
broad ideas to be representable. A representation
where new ideas can be created and expressed,
possibly even inspired, simply by working in the
representation. Interfaces have been studied and many
ideas explored in the field of 3D interaction. Despite
this, there is a lack of truly interactive experiences
where the benefit is due to the interaction and not the
visual experience [1].
While 3D interfaces have always been hard to
implement [2], this does not explain why added
features are disproportionately more complicated to
implement. However, looking more closely at how 3D
interfaces advance, we see the following: each
additional feature increases the actions and state of the
user, environment and interface to be considered during
implementation. Since each action is considered in
regard to each state, linear growth of actions and state
results in non-linear growth of implementation
complexity.
Despite this complexity growth, 3D interaction
behaviors are readily envisioned by 3D interface
developers. The problem is the difficulty developers
have implementing these ideas. Logically then, an
implementation based on the developer's envisioned
behavior of 3D interfaces would scale at least as well as
developer understanding. Through an investigation of
developer artifacts, interviews and language (collected
by developers re-representing video of 3D interaction
techniques as language), a list of five problems facing
developers was created and discussed in [4]:
- Limited Understanding
- Distance of Mapping
- Complexity
- Reimplementation over Reuse
- Hard and Broad Problems
The created development representation based upon
developer artifacts has been called Concept-Oriented
Design and implemented in a system called Chasm [4].
Chasm is named for its ability to bridge the divide
between the developer's envisioned behavior and the
machine which runs it. Concept-Oriented Design (COD)
has as its unit of implementation a concept. In Chasm,
a concept implements a single cohesive concern in four
tiers of developer understanding, shown in Figure 1.
Each successive tier represents a different type of
information. At runtime, the Chasm system interleaves
the cohesive functionality with other concepts for the
desired envisioned behavior. Chasm has been used in
multiple development case studies [4].
WIM Example
An example implementation of a moderately
complicated 3D interaction technique, the world-in-miniature
(WIM) [3], shows the benefits of a Chasm
representation. The developer's envisioned behavior of
the WIM technique might be described as: "A tracker's
movement causes the virtual hand to move. When a
button is pressed, check for selection of an object in the
WIM (a proxy object). If an object is selected, move the
full-scale object in the environment as the proxy object
moves. When the button is released, release the proxy
object from the user's hand and stop moving the full-scale
object." This description is shown as a flow at the
top of Figure 2.
Figure 1: The four tiers of Chasm represent developer
understanding as they implement a 3D interface, shown here
for the WIM [3] technique. Decomposing development in this
way creates cohesive concepts which are executed by Chasm
as flows of events, as shown in Figure 2.
Though this flow explains the WIM's functioning, it is
limited in its utility for development. First, the flow does
not separate into cohesive concerns to simplify
development. For example, the button's roles are
spread across two distant steps (2 and 6). Second,
there is no clear division between a discrete event and
a continuous relationship. For example, moving the
hand and the full-scale object (steps 1 and 5) are
continuous while checking for selection is discrete (step
3). Third, implementation of the flow requires a
complicated developer mental model in order to keep
all requirements of the hand, button, selection
technique and object-to-object relationships consistent.
Figure 2: A flow (top) of the WIM technique, useful in
understanding how the interaction progresses, involves many
different ideas which the developer has to maintain during
implementation. The processing of events (bottom) in Chasm
integrates the behaviors of multiple concepts as the flow up top
but allows development as cohesive ideas (Figure 1). The
numbers on the right connect events to the steps up top.
In contrast, a Chasm representation is based on
abstractions in conversational domain language. WIM
was described by Stoakley et al. in the following two
sentences: "A World-in-Miniature is a hand-held
miniature 3D map. When the user manipulates objects
in the map, the corresponding objects simultaneously
update in the full scale virtual reality" [3]. Notice how
these two sentences simply describe a complex
interaction technique by depending on abstractions in
domain knowledge.
The WIM implementation is spread across the four tiers of a
Chasm concept shown in Figure 1. First, the language
used to describe the WIM is placed in the Envisioned
Behavior Tier. This informs the deeper tiers with a high-level,
easily understood language description. The
language description implies that the virtual hand (VH)
and Position Relationship (PR) concepts, previously
implemented in other systems, are reusable here.
Second, in the Causality Tier the WIM technique creates
causal relationships to the VH concept's states to
connect to its functionality. Third, the Automata Tier
forms its custom state machine representation using
the actions caused by the VH concept and two created
states: manipulating (moving a proxy object) and
stopped (not moving the proxy object), that represent
the WIM's flow of behavior as described by Stoakley et
al.'s second sentence. Finally, the Code Tier implements
in C++ the behavior when these two states are
entered. The manipulating state creates the PR to
update the full-scale object, and the stopped state
deletes the PR.
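The paper does not show Chasm's concrete syntax, but the Automata Tier just described can be pictured with a sketch like the following; the notation is hypothetical, not Chasm's actual representation:

<!-- Hypothetical sketch of the WIM concept's Automata Tier -->
<automata concept="WIM">
  <state name="stopped">
    <on action="VH.buttonPressedOverProxy" goto="manipulating"/>
  </state>
  <state name="manipulating">
    <!-- entering this state creates the Position Relationship (Code Tier);
         leaving it deletes the PR -->
    <on action="VH.buttonReleased" goto="stopped"/>
  </state>
</automata>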
This implementation is executed in Chasm as shown at
the bottom of Figure 2 as cascades of events. The
implementation is cohesive as in Figure 1 and the flow
is interleaved by Chasm to match the flow at the
bottom of Figure 2. Additional features to the WIM,
such as the scaling and scrolling of the SSWIM
technique [5], have full access to the internals of the
WIM implementation by responding directly to the
states in its flow.
Chasm Non-Functional Requirements
An evaluation of Chasm along non-functional
requirements shows its utility. Comprehensibility is
increased by the tiered representation, as developers
know where to look for the understanding they need.
Simplicity is created by dividing envisioned behavior
along language abstractions and reducing the scope of
developer mental models. Reusability occurs through
cohesive concepts which are composable to form higher
levels of meaning. Evolvability of systems is achieved
by swapping the cause of a stimulus or response in the
Causality Tier. Portability to new platforms,
environments, and toolkits is less onerous, as only the
Code Tier needs reimplementation.
Concluding Remarks
A brief introduction to the development of Chasm
discussed its origins in developer representations. The
WIM technique in Chasm shows implementation as
cohesive ideas and execution as flows similar to
developer understanding. Non-functional requirements
show the utility of the approach, with [4] referenced for
a full discussion of the validation of Chasm in multiple
case studies with multiple developers.
References
[1] Brooks, F. What's Real about Virtual Reality? In
Computer Graphics and Applications 19, 6 (1999).
[2] Herndon, K. P., van Dam, A. and Gleicher, M.
Workshop on the Challenges of 3D Interaction. SIGCHI
Bulletin, 26, 4 (1994).
[3] Stoakley, R. and Pausch, R. Virtual Reality on a
WIM: Interactive Worlds in Miniature. CHI (1995).
[4] Wingrave, C. A., Bowman, D. A. Tiered Developer-Centric
Representations for 3D Interfaces: Concept-Oriented
Design as Chasm. Submitted to IEEE VR 2008.
[5] Wingrave, C., Haciahmetoglu, Y. and Bowman, D.
Overcoming World in Miniature Limitations by a Scaled
and Scrolling WIM. 3D User Interfaces (2006).