This document has been extracted (pages 13-26) from the following report:
D03 - Interaction Requirements Specification
Project Number: IST 20656 0
Project Title: PALIO
Deliverable Type: Report
Deliverable Number: D03
Contractual date of Delivery to the CEC: //
Actual date of Delivery to the CEC: //
Title: Interaction requirements specification
Work-package contributing to the deliverable: WP1
Nature of Deliverable: IST
1. Device-oriented considerations
1.1. Introduction
The main concern in this kind of interaction is functionality. It is useless to provide a wide range of
services, options and choices if the functionality of the devices used is not adequate.
1.2. Interaction through large-screen devices
This form of interaction is the most appropriate for the presentation of detailed information to users who experience no visual problems. The provision of map-based tourist information, in particular, is significantly facilitated by large screens.
Another important issue is the support of interaction through a variety of robust and ergonomically designed input and output components, e.g. keyboards, touch screens, etc.
Furthermore, disabled persons can use special I/O equipment that helps them overcome a particular handicap. The following table briefly presents item categories relevant to supporting accessible interaction for the types of disabled users under consideration.
1. Computers
1.1. Hardware
1.1.1. Central Processors
1.1.1.1. Computers for physical therapy case management
1.1.1.2. Computer with wheelchair interface
1.1.1.3. Voice output computers
1.1.2. Input
1.1.2.1. Braille input interfaces
1.1.2.1.1. Braille computer terminal
1.1.2.1.2. Braille keyboard
1.1.2.1.3. Braille keytop overlay
1.1.2.1.4. Braille note writer
1.1.2.1.5. Braille word processor
1.1.2.2. General input interfaces
1.1.2.2.1. Abbreviation expansion storage device
1.1.2.2.2. Computer interface for low vision reading systems
1.1.2.2.3. Computer terminal control interface
1.1.2.2.4. Control input interface
1.1.2.2.5. Cursor control interface
1.1.2.2.6. Eye controlled input system
1.1.2.2.7. Joystick input interface
1.1.2.2.8. Mouthstick control of cursor control device
1.1.2.2.9. Touchpad
1.1.2.2.10. Touchscreen
1.1.2.2.11. Touch switch for cursor
1.1.2.3. Keyboards
1.1.2.3.1. Audible keyboard interface
1.1.2.3.2. Auto repeat function disabler
1.1.2.3.3. Braille keyboard
1.1.2.3.4. Braille keytop overlay
1.1.2.3.5. Custom keyboard
1.1.2.3.6. Expanded keyboard
1.1.2.3.7. Expanded keyboard modification program
1.1.2.3.8. Expanded keyboard overlay
1.1.2.3.9. Expanded keypad
1.1.2.3.10. Keyboard for specific software
1.1.2.3.11. Keyboard interface
1.1.2.3.12. Keyboard moisture guard
1.1.2.3.13. Keyboard wrist support
1.1.2.3.14. Keyguard
1.1.2.3.15. Keylock
1.1.2.3.16. Keytop overlay
1.1.2.3.17. Mini keyboard
1.1.2.3.18. Modified keyboard
1.1.2.3.19. Mouse emulator
1.1.2.3.20. One hand keyboard interface
1.1.2.3.21. One hand keyboard interface modification
1.1.2.3.22. Remote keyboard
1.1.2.3.23. Scanning keyboard
1.1.2.3.24. Single switch keyboard modification
1.1.2.3.25. Single switch keyboard modification program
1.1.2.3.26. Slant board
1.1.2.3.27. Track ball
1.1.2.4. Voice input interfaces
1.1.2.4.1. Voice input control switch
1.1.2.4.2. Voice input evaluation system
1.1.2.4.3. Voice input terminal
1.1.2.4.4. Voice input terminal control interface
1.1.3. Modems
1.1.3.1. Audio output for data transmission
1.1.4. Output
1.1.4.1. Braille output
1.1.4.1.1. Braille display processor
1.1.4.1.2. Braille printer
1.1.4.1.3. Braille translator
1.1.4.1.4. Paperless Braille output module
1.1.4.2. Large print output
1.1.4.3. Printout accessories
1.1.4.4. Voice output
1.1.4.4.1. Computer headphone set
1.1.4.4.2. Keypad for speech synthesiser
1.1.4.4.3. Speech synthesiser
1.1.4.4.4. Speech synthesiser control switch
1.1.4.4.5. Voice output computer
1.2. Software
1.2.1. Computer Access interfaces
1.2.1.1. Motor impairment access
1.2.1.1.1. Keyboard controller simulation program
1.2.1.1.2. Keyboard modification program
1.2.1.1.3. Mouse emulator program
1.2.1.1.4. Single switch keyboard modification program
1.2.1.2. Sensory impairment access
1.2.1.2.1. Braille format program
1.2.1.2.2. Large print program
1.2.1.2.3. Visual beep indicator program
1.2.1.2.4. Voice input control switch
1.2.1.2.5. Voice output program
1.2.1.2.6. Voice output recognition training program
1.2.1.2.7. Voice output screen review program
1.3. Computer Accessories
1.3.1. Mouthstick control of cursor control device
2. Communication
2.1. Writing
2.1.1. Braille
2.1.2. Braille writers
2.1.3. Writing general
2.2. Mouthstick
2.2.1. Mouthstick
2.2.2. Mouthstick control of cursor control device
2.2.3. Mouthstick holder
2.2.4. Mouthstick kit
2.2.5. Optical mouthstick holder
2.2.6. Optical pointer
2.3. Headwands
2.3.1. Headwand
2.3.2. Optical pointer
3. Sensory disabilities
3.1. Blind and low vision
3.1.1. Television screen magnifier
3.1.2. Voice output computer terminal
3.1.3. Voice output module
3.1.4. Voice output spelling tutorial device
3.1.5. Voice output word processor
3.2. Deaf and hard of hearing
3.3. Deaf blind
The above table has been constructed based on experience gained in the AVANTI project.
The Nautilus Web browser1 (a follow-up to the AVANTI Web browser [Stephanidis, 2001]) is an example of a client application that could be used in PALIO by a wide range of tourists. This browser is intended for use by a diverse user population (including people with various forms and degrees of disability) in different contexts of use (e.g., on a user’s personal computer, at public information points).
The interaction metaphors used in the case of large-screen computers are mainly the following two:
 The widely used “desktop” metaphor, based on the windowing environment
 The information kiosk metaphor, based on simplified interfaces typical of multimedia information systems addressing the requirements of inexperienced users.
1 Project EPET Nautilus, funded by the Hellenic Ministry of Development.
In both cases, web browser-like interfaces will be employed, since they are familiar to the average user and capable of presenting multimedia information content. However, the user of a browser-like interface may overlook some of the provided features, assuming no more than the common functionality of a typical browser; this potential problem needs to be addressed during construction of the interface.
When we refer to interfaces, we mean the overall structure of the page that the communication layer will transmit to the connected devices. The page will carry its built-in interaction elements, i.e.:
 Buttons, links
 Menus
 List boxes, Combo boxes
 Check boxes, Option buttons
Typical web-browsing interaction tasks that have to be supported are the following:
 Open a specified location
 Go to previous/next/home page
 Reload document
 Stop loading
 Select a link from the document
 Navigate within the page
 Open/save/print file
 Review and select a page from history of visited sites
 Add/remove/review bookmarks
 Set options regarding the presentation
 Exit
 Help
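The navigation tasks listed above (previous/next/home page, history review) follow a well-known pattern. A minimal sketch in Python, under our own naming assumptions and unrelated to the actual Nautilus implementation:

```python
class BrowserHistory:
    """Minimal back/forward/home navigation model (hypothetical sketch)."""

    def __init__(self, home):
        self.home = home
        self._pages = [home]   # visited pages, oldest first
        self._pos = 0          # index of the current page

    @property
    def current(self):
        return self._pages[self._pos]

    def open(self, url):
        # Opening a new location discards any "forward" pages.
        self._pages = self._pages[: self._pos + 1] + [url]
        self._pos += 1

    def back(self):
        if self._pos > 0:
            self._pos -= 1
        return self.current

    def forward(self):
        if self._pos < len(self._pages) - 1:
            self._pos += 1
        return self.current

    def go_home(self):
        self.open(self.home)
        return self.current

    def history(self):
        # Supports the "review and select a page from history" task.
        return list(self._pages)
```

The same state machine underlies the desktop and kiosk variants; only the presentation of the controls differs.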
Nevertheless, we should not forget that these large, static devices inconvenience visitors as they move around, naturally exploring the place. However, this way of accessing information is certainly valuable and complementary to nomadic computing. Tourists will certainly opt for large-screen devices in order to plan future activities from home, to request information about certain places already visited, to access more detailed information, etc.
1.2.1. Information kiosks
The design of public access systems like information kiosks is characterised by unique features
[Rowley, 1998]. Interaction requirements are determined by the very nature of the device, for
instance:
 The task is generally focused on retrieval
 The context of use requires no intense concentration
 The interaction should not be too time-consuming
 The user group in some cases (where there is no login at the entrance) is undifferentiated. This is especially true in our case, where kiosks are supposed to help tourists. Even if the system knows the profile of the user who logged in, the interface should respond in a transparent and plain way, as if the user were an inexperienced one, because the user who logged in and entered the session will most probably not view and retrieve the information on the screen alone.
Some kiosks have keypads and/or card readers; however, the most common way of interaction is via a touch screen, and PALIO will support the latter. The content provided by the AVC centre will obviously be adapted to the requirements of this type of interaction. The information system, for example, will not require the user to enter text; instead, menus and lists of the relevant available options will be presented for selection.
1.3. Interaction through small-screen devices or interactions on the move
1.3.1. Introduction
PALIO will support wireless access to its services via WAP phones, palmtop computers and laptops. It is assumed that these devices carry a DGPS receiver. The following discussion mainly concerns WAP phones and palmtops, since these two kinds of device are light and compact enough to be carried by a traveller.
The novel technology of these devices is not free of problems regarding performance, usability, etc. As pointed out in [C. Johnson, 1998], even if it were possible to completely identify user requirements and usage contexts for mobile computing devices, it is far from clear whether appropriate devices exist to satisfy these requirements.
According to [P. Johnson 1998], there are at least four major problems to be faced in developing
HCI of mobile systems:
1. The design of tasks and contexts;
2. Accommodating the diversity and integration of devices, networks and
applications;
3. The lack of an HCI model addressing the variability of demands of mobile
systems;
4. Evaluation.
An example of the kind of problem one would face in modelling interaction on the move is the following: how much attention would a tourist pay to the mobile guide while trying to deal, at the same time, with a child he supervises, or while stepping out onto the street, or while interacting collaboratively with others?
In such a demanding context, suitable forms of input include pen and speech, and suitable forms of output include graphics, text, speech and sounds. The interaction is clearly better performed via more than one channel/modality, since environmental stimuli could engage one or more of the user’s perceptual modalities.
The interaction with the interface has to be as transparent as possible. The tip that summarises the philosophy of designing for mobile devices is “if it needs a manual then it’s too difficult to use” [Pascoe, 1998].
Experimental studies with small-screen devices revealed the following interesting observations [M. Jones, 1999], which determine interaction requirements accordingly:
 Users reading from the small displays interacted with the window to a much higher degree than those with the larger window.
 Searching for menu items on a single-line display was three times slower than on a conventional display; however, for displays with more than 12 lines, size had no significant further effect.
 80% of small-screen users began by using the search options of the site.
 Small-screen users selected search facilities twice as many times.
 Large-screen users showed a greater tendency to navigate and follow paths.
 Small-screen users carried out many more scrolling actions.
 In an exercise test, users given a large screen answered twice as many questions correctly as users given a small screen.
The above results suggest [M. Jones, 1999] appropriate design actions:
 Provision of search mechanism
 Structuring of information to provide focused navigation
 Placing navigational features (menu, bars etc.) near the top of pages
 Placing key information at the top of pages
 Reducing the amount of information on the page, making the content task-focused rather than verbose
1.3.2. Palmtops
The challenges of small portable computers focus on extending the usability of these devices using
new forms of interaction; methods to overcome screen limitations, ergonomic methods to increase
the ease of use of these devices.
Mobile systems do not respect a considerable number of assumptions implicitly made regarding the
interaction design of applications on fixed-location computers. One serious challenge is the
consideration of issues that arise due to networks delays. A second important point is the need for
new applications inspired by the special nature of the dynamic changing context.
Flip-open pocket computers equipped with miniature keyboards are not suitable for interaction on the move. Although they provide a convenient solution in static situations where the device can be placed on a desk, in-hand use of these devices entails serious usability problems.
The case we study certainly poses such a constraint: most of the time, tourists will not be able to find a desk at which to type peacefully. A solution to this requirement could be the use of a pen on a touch-sensitive pad, since pen-based interfaces on a pad-like surface provide a more ergonomic, although slower, interaction.
The TFT colour display that comes with some PDAs is not easily readable under direct sunlight. This causes a serious constraint for the interface of a tourist guide application. PALIO will overcome this problem by using black-and-white graphics or high-contrast colours.
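Whether a foreground/background colour pair qualifies as “high contrast” can be checked numerically. The sketch below uses the relative-luminance and contrast-ratio formulas later standardised by the W3C in WCAG; applying them here to sunlight readability is our own illustration:

```python
def _linear(channel):
    """Convert an 8-bit sRGB channel to linear light (per the WCAG formula)."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) colour, 0.0 (black) to 1.0 (white)."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1 (identical colours) up to 21 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)
```

Black on white yields the maximum ratio of 21:1, which is why black-and-white graphics remain the safest choice for outdoor displays.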
1.3.3. Mobile phones
Interaction problems arise due to the size of the display, which makes any information retrieval harder, and due to the 12-key keypad, which is not suitable for touch-typing. Enlarging the device improves usability with respect to these two factors (display, keypad), but reduces portability (weight, size).
Users always prefer the traditional Qwerty layout, where each character is mapped onto a key in a 1:1 fashion. Ways out of the text-entry problem are:
 Special software2 with an internal database, which scans various combinations to resolve the intended word [Dunlop, 1999].
 A “Qwerty keyboard without a board” interface, which uses electromyography sensors placed on the forearm and reads muscle motions [M. Goldstein, 1998].
 Chording gloves, such as the Finger-Joint Gesture wearable glove [M. Goldstein, 1998].
 Speech recognition. The technology is improving [Graham, 1999], but it is not appropriate when privacy in public places is a concern.
2 An example of such software can be found at www.tegic.com
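The dictionary-lookup approach of such predictive-text software can be sketched as follows; the key mapping is the standard 12-key layout, while the lexicon and function names are purely illustrative (commercial products use large dictionaries and frequency ranking):

```python
# Standard 12-key phone keypad: one digit covers several letters.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}
LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}

def digits_for(word):
    """The single-press digit sequence that types the given word."""
    return "".join(LETTER_TO_DIGIT[ch] for ch in word.lower())

def build_index(lexicon):
    """Index every word in the lexicon by its digit sequence."""
    index = {}
    for word in lexicon:
        index.setdefault(digits_for(word), []).append(word)
    return index

def candidates(index, keys):
    """Resolve one key press per letter into the matching dictionary words."""
    return index.get(keys, [])
```

Because several words share a digit sequence, the software must still let the user cycle through the candidates; that is the residual ambiguity of the approach.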
The Nokia 9000 provides a miniature version of a full Qwerty keyboard. However, usability suffers with respect to both the effectiveness and the efficiency of the device [Goldstein, 1998]: users experience difficulties touch-typing (using both hands and all fingers) on miniaturised keyboards.
The other major interaction problem we have already mentioned is the lack of screen space. Once the device is held to the user’s ear to make or receive a call, the screen is unusable, which reduces the effectiveness of user-screen interaction. The latter problem can be handled with hands-free speakers.
A way out of the limited screen size problem is the use of:
 Non-speech sound: the idea of Earcons can be used to enhance interaction, as in sonically enhanced widgets [Brewster, 1998].
 Acoustic windows created in a three-dimensional acoustic space [Walker, 1999].
Special attention will be paid in the design of PALIO to minimising the above two problems of cellular phones. The system will avoid prompting the user of WAP phones for text entry, employing this form of input only when absolutely necessary.
The WML page will carry information content adapted by the AVC centre in order to meet the display constraints of these devices. Metaphors appropriate for the kind of interactions we have to support are:
 The deck/card organisational metaphor used in WML [Herstad, 1998]
 The stick-e metaphor [Pascoe, 1997], which presents notes as being attached to a context.
Short Message Service (SMS) is a wireless service available on mobile networks that
enables the transmission of text messages between mobile phones and other systems. The
service currently handles more than two billion messages per month in the European market alone, and is considered a very friendly and easy means of indirect communication.
The PALIO system plans to provide a Short Message Service to its users. The purpose of the service is to keep tourists informed about cultural events, interesting places around them (sights, accommodation, entertainment, tourism-related SMEs), airline schedules, weather conditions, etc.
Messages will be sent to users each time they pass through a specific area (an area for which information to be delivered through SMS is available). Additionally, mass messages carrying general information can be sent.
The only requirement is that the user enables the PALIO SMS service by giving the system his or her mobile phone number. Naturally, the user’s mobile device and the mobile network have to support positioning capabilities.
The following figure represents the SMS service for PALIO:
[Figure: the Send Messages application issues HTTP POST/GET requests, adhering to the SMSc protocol, towards the SMSc of the GSM operator.]
The Send Messages application receives from the PALIO database the text message to be sent (which must not exceed 160 characters) and the phone number of the recipient. Using protocols such as HTTP, these data are transformed so as to adhere to the SMSc protocol. When the formatted message reaches the SMSc, the GSM operator undertakes to send it to the user’s mobile phone through its repeaters and base stations.
1.4. Voice-based interaction
It would be desirable to address multimodal interaction in PALIO, with one of the following speech modes supported:
 Speech recognition
 Speech synthesis
 Pre-recorded speech
Interaction requirements for the above case, in the context of voice mark-up languages, have been
specified by the W3C organisation. Some of the important requirements are briefly mentioned
below:
 Scalability across end-user devices. The mark-up language used will be
scalable across devices with a range of capabilities.
 Seamless synchronization of the various modalities
 Sequential multi-modal input. Speech and user input from other modalities are to be interpreted separately.
 Uncoordinated, simultaneous multi-modal input. Input modalities are simultaneously active and there is no requirement that the interpretation of the input modalities be coordinated (i.e. interpreted together); in a particular dialog state, only input from one of the modalities is interpreted. For example, in a tourist application accessed by phone, the browser renders the speech “Enter your name”, the user says “John Brown” or types his name, the browser renders the speech “Enter your login number”, etc.
 Support for conflicting input from different modalities.
 Time stamping. All input and output events will be time stamped.
 Sequential multimedia output
 Uncoordinated, simultaneous, Multimedia Output
 Synchronization of multimedia with voice input
 Detect that a given output is available
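As a minimal illustration of the time-stamping and “uncoordinated, simultaneous input” requirements, the following sketch (the names and structure are our own, not part of any W3C specification) interprets only the earliest event from a modality accepted in the current dialog state:

```python
import time
from dataclasses import dataclass, field

@dataclass
class InputEvent:
    modality: str    # e.g. "speech", "keypad", "gesture"
    value: str
    # Every input event is time-stamped, per the requirement above.
    timestamp: float = field(default_factory=time.time)

class DialogState:
    """Several modalities are active at once, but they are not interpreted
    together: only input from an accepted modality counts in this state."""

    def __init__(self, prompt, accepted):
        self.prompt = prompt          # e.g. rendered as synthesised speech
        self.accepted = set(accepted)

    def interpret(self, events):
        # Interpret only the earliest event from an accepted modality;
        # the other modalities' events are ignored in this state.
        for ev in sorted(events, key=lambda e: e.timestamp):
            if ev.modality in self.accepted:
                return ev.value
        return None
```

In the phone example above, a state prompting “Enter your name” would accept the speech and keypad modalities and take whichever answers first.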
1.5. Accessing a service through multiple modalities / devices
Multi-modal interaction gives the user the ability to employ different modalities such as gesture,
voice and typing for communication and interaction with the computer. Processing output and/or
input information concurrently enables parallel utilisation of the various communications channels.
In person-to-person interaction a variety of channels are used. This considerably enhances
communication and helps the evolvement of multiple cognitive levels.
The redundancy of the provided information can also improve interface robustness, since failure of one channel may be recovered by another [A. Takeuchi, 1993]. Possible perception problems (permanent or temporary) of the user regarding a specific channel can be overcome through the provision of alternative modalities accessible to the user.
The AVC system will provide HTML and WML pages with content description, in order to support
client applications with multi-modal interactions.
1.6. Device-adapted interaction
The AVC centre will adapt the structure of the document according to the type of device through which the user interacts. It was decided that adaptations regarding the general "display" characteristics of the device employed by the user should be carried out before the system response reaches the Communication Layer. For instance, adaptation logic for small-screen displays would be, for the most part, independent of whether the final document will be transformed into HTML or WML. Additionally, support of new protocols or mark-up languages at the level of the Communication Layer is greatly simplified.
In the case of a WAP-enabled cellular phone, adaptations have to be made due to:
 Low memory, low cache
 Limited processing power
 Limited browser context size. If the author of a WML page has used a large number of variables in the browser context, this will cause memory exhaustion [Herstad, 1998].
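The ordering described in this section (device-independent display adaptation first, markup-specific serialisation at the Communication Layer last) can be sketched as follows; all names and limits are illustrative, not the actual AVC design:

```python
def adapt_for_display(items, max_items, max_chars):
    """Device-independent adaptation: prune and shorten the content
    before any markup is chosen."""
    return [text[:max_chars] for text in items[:max_items]]

def to_html(items):
    """Serialisation for large-screen browsers."""
    return "<ul>" + "".join("<li>%s</li>" % t for t in items) + "</ul>"

def to_wml(items):
    """Serialisation for WAP phones: a single card keeps the browser
    context (memory) small."""
    return '<wml><card id="c1"><p>%s</p></card></wml>' % "<br/>".join(items)

def render(items, device_profile, markup):
    # Adaptation runs first, independently of the final markup; only
    # then does the Communication Layer pick HTML or WML.
    adapted = adapt_for_display(items, device_profile["max_items"],
                                device_profile["max_chars"])
    return to_wml(adapted) if markup == "wml" else to_html(adapted)
```

Because `adapt_for_display` never mentions HTML or WML, adding a new markup language only requires a new serialiser, which is exactly the simplification argued for above.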