05-830
Advanced User Interface Software
Brad Myers
Human Computer Interaction Institute
Spring, 2009
Course:

- Course web page: http://www.cs.cmu.edu/~bam/uicourse/830spring09/
- Schedule: http://www.cs.cmu.edu/~bam/uicourse/830spring09/schedule.html
  - Wednesdays and Fridays, 1:30pm to 2:50pm
  - Various conferences and other conflicts
- Room: Usually:
  - Wednesdays: NSH 3501
  - Fridays: NSH 2507
  - Except Wed, Jan 28th: NSH 1507
Instructor

- Brad Myers
  - Human Computer Interaction Institute
  - Office: Newell-Simon Hall (NSH) 3517
  - Phone: x8-5150
  - E-mail: bam@cs.cmu.edu
  - http://www.cs.cmu.edu/~bam
  - Office hours: By appointment.
- Secretary: Brandy Renduels, NSH 3526A, x8-7099
- No TA
Readings and Homeworks

- Schedule of readings: http://www.cs.cmu.edu/~bam/uicourse/830spring09/schedule.html
  - Course schedule is tentative
  - Note required readings
  - Student-presented material at end
  - CMU-only; use CMU network or VPN
- Homeworks: http://www.cs.cmu.edu/~bam/uicourse/830spring09/homeworks.html
  - No midterm or final
  - No project
  - Harder in the middle
  - Create a framework for UI software, like Amulet / Garnet / SubArctic / Flex2
What is this class about?

- “User Interface Software”
  - All the software that implements the user interface
  - “User Interface” = the part of an application that a person (user) can see or interact with (look + feel)
  - Often distinguished from the “functionality” (back-end) implementation
  - “Implements” – the course will cover how to code a design once you already have the design
- Not covering the design process or UI evaluations
  - (Except that we will cover design & prototyping tools, & evaluation of tools)
- User Interface Software Tools
  - Ways to help programmers create user interface software
Examples of UI Software Tools

- Names: Toolkits, Development Kits, SDKs, APIs, Libraries, Interface Builders, Prototypers, Frameworks, UIMS, UIDE, …
- Interactive tools
  - Visual Basic .Net
  - Adobe Flash
- APIs for UI development:
  - Microsoft Foundation Classes, .Net, wx-Python
  - Java Swing
  - Apple Cocoa, Carbon
  - Eclipse SWT
- Programming languages focused on UI development
  - JavaScript, php language, html, …
  - Adobe’s ActionScript (for Flash)
- 2-D and 3-D graphics models for UIs
- Research systems:
  - Garnet
  - Amulet
  - subArctic
  - the Context Toolkit
  - Papier Mache
- Internet UI frameworks
- Service-Oriented Architecture (SOA) and other component frameworks
What Will We Cover?

- History of User Interface Software Tools
  - What has been tried
  - What worked and didn’t
  - Where the currently-popular techniques came from
- Future of UI Software Tools
  - What is being investigated?
  - What are the current approaches
  - What are the challenges
- How to evaluate tools
  - Good or bad
Homework 1

- http://www.cs.cmu.edu/~bam/uicourse/830spring09/homework_1.html
- Assign tools to students
  - Spreadsheet
- Evaluate using HE, Cognitive Dimensions, or user testing
Lecture 1:
Evaluating Tools
Brad Myers
How Can UI Tools be Evaluated?

- Same as any other software
  - Software Engineering Quality Metrics
  - Power (expressiveness, extensibility, and evolvability), Performance (speed, memory), Robustness, Complexity, Defects (bugginess), …
- Same as other GUIs
  - Tool users (programmers) are people too
  - Effectiveness, Errors, Satisfaction, Learnability, Memorability, …
Stakeholders

- Who cares about UI Tools’ quality?
  - Tool Designers
  - Tool Users (programmers)
  - Users of Products created with the tools = consumers
API Design Decisions

(Stylos, 2007)
API Design Decisions, cont.
UI Evaluation of UI Software Tools: Some Usability Methods

- Heuristic Evaluation
- Cognitive Dimensions
- Think-aloud user studies
- Personas
- Contextual Inquiry
- Contextual Design
- Paper prototypes
- Cognitive Walkthrough
- KLM and GOMS
- Task analysis
- Questionnaires
- Surveys
- Interaction Relabeling
- Focus groups
- Video prototyping
- Wizard of Oz
- Body storming
- Affinity diagrams
- Expert interviews
- Card sorting
- Diary studies
- Improvisation
- Use cases
- Scenarios
- Log analysis
- …
Design and Development

- Use CIs and other field studies to find problems to solve
  - Ko, A.J., Myers, B.A., and Aung, H.H. “Six Learning Barriers in End-User Programming Systems,” in IEEE VL/HCC’2004. pp. 199-206.
  - Ko, A.J. and DeLine, R. “A Field Study of Information Needs in Collocated Software Development Teams,” in ICSE’2007.
  - Also surveys, etc.: Myers, B., Park, S.Y., Nakano, Y., Mueller, G., and Ko, A. “How Designers Design and Program Interactive Behaviors,” in IEEE VL/HCC’2008. pp. 185-188.
- Iterative design and usability testing of versions
  - E.g., in the development of Alice
- Summative testing at the end
Heuristic Evaluation Method

- Named by Jakob Nielsen
- Expert evaluates the user interface using guidelines
- “Discount” usability engineering method
- One case study found a factor of 48 in cost/benefit: cost of inspection $10,500, benefit $500,000 (Nielsen, 1994)
10 Basic Principles

From Nielsen’s web page: http://www.useit.com/papers/heuristic/heuristic_list.html

1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

- Slightly different from the list in Nielsen’s text
Cognitive Dimensions

- 12 different dimensions (or factors) that individually and collectively have an impact on the way that developers work with an API and on the way that developers expect the API to work. (from Clarke ’04)

1. Abstraction level. The minimum and maximum levels of abstraction exposed by the API.
2. Learning style. The learning requirements posed by the API, and the learning styles available to a targeted developer.
3. Working framework. The size of the conceptual chunk (developer working set) needed to work effectively.
4. Work-step unit. How much of a programming task must/can be completed in a single step.
5. Progressive evaluation. To what extent partially completed code can be executed to obtain feedback on code behavior.
6. Premature commitment. The number of decisions that developers have to make when writing code for a given scenario, and the consequences of those decisions.
7. Penetrability. How the API facilitates exploration, analysis, and understanding of its components, and how targeted developers go about retrieving what is needed.
8. Elaboration. The extent to which the API must be adapted to meet the needs of targeted developers.
9. Viscosity. The barriers to change inherent in the API, and how much effort a targeted developer needs to expend to make a change.
10. Consistency. How much of the rest of an API can be inferred once part of it is learned.
11. Role expressiveness. How apparent the relationship is between each component exposed by an API and the program as a whole.
12. Domain correspondence. How clearly the API components map to the domain, and any special tricks that the developer needs to be aware of to accomplish some functionality.
User Interface Testing of Tools

- Use think-aloud user studies, or similar
- A vs. B, or just UI improvements of A
- Issues:
  - Vast differences in programmer productivity
    - 10X often cited, e.g.: http://blogs.construx.com/blogs/stevemcc/archive/2008/03/27/productivity-variations-among-software-developers-and-teams-the-origin-of-quot-10x-quot.aspx
    - Sackman 1968, Curtis 1981, Mills 1983, DeMarco and Lister 1985, Curtis et al. 1986, Card 1987, Boehm and Papaccio 1988, Valett and McGarry 1989, Boehm et al. 2000
  - Difficulty of controlling for prior knowledge
  - Task design for users to do
  - Usually really care about expert performance, which is difficult to measure in a user test
Examples of UI Tests

- Many recent tool papers have user tests
  - Especially at the CHI conference
  - E.g.: Ellis, J.B., Wahid, S., Danis, C., and Kellogg, W.A. 2007. Task and social visualization in software development: evaluation of a prototype. CHI ’07. http://doi.acm.org/10.1145/1240624.1240716
    - 8 participants, 3 tasks, within subjects: Bugzilla vs. SHO, observational
- Backlash? at the UIST conference
  - Olsen, 2007: “Evaluating user interface systems research”
  - But: Hartmann, Björn, Loren Yu, Abel Allison, Yeonsoo Yang, and Scott Klemmer. “Design As Exploration: Creating Interface Alternatives through Parallel Authoring and Runtime Tuning,” UIST 2008 Full Paper – Best Student Paper Award
    - 18 participants, within subjects, full interface vs. features removed, “(one-tailed, paired Student’s t-test; p < 0.01)”
Steven Clarke’s “Personas”

- Classified types of programmers he felt were relevant to UI tests of Microsoft products (Clarke, 2004) (Stylos & Clarke 2007)
- Capture different work styles, not experience or proficiency
- Systematic – work from the top down, attempting to understand the system as a whole before focusing on an individual component. Program defensively, making few assumptions about code or APIs and mistrusting even the guarantees an API makes, preferring to do additional testing in their own environment. Prefer full control, as in C, C++.
- Opportunistic – work from the bottom up on their current task and do not want to worry about the low-level details. Want to get their code working as quickly as possible without having to understand any more of the underlying APIs than they have to. They are the most common persona and prefer simple and easy-to-use languages that offer high levels of productivity at the expense of control, such as Visual Basic.
- Pragmatic – less defensive and learn as they go, starting to work from the bottom up with a specific task. However, when this approach fails they revert to the top-down approach used by systematic programmers. Willing to trade off control for simplicity, but prefer to be aware of and in control of this trade-off. Prefer Java and C#.
Usability Testing of APIs

- PhD work of Jeff Stylos (extending Steven Clarke’s work)
- Which programming patterns are most usable?
  - Default constructors
  - Factory pattern
  - Object design
  - E-SOA APIs
- Measures: learnability, errors, preferences
- Expert and novice programmers
- Fix by:
  - Changing APIs
  - Changing documentation
  - Better tools in IDEs
    - E.g., use of code completion (“IntelliSense”) for exploration
Required Constructors

- (Stylos & Clarke 2007)
- Compared create-set-call (default constructor):

    var foo = new FooClass();
    foo.Bar = barValue;
    foo.Use();

  vs. required constructors:

    var foo = new FooClass(barValue);
    foo.Use();

- All participants assumed there would be a default constructor
- Required constructors interfered with learning
  - Want to experiment with what kind of object to use first
- Did not ensure valid objects – participants passed in null
- Preferred not to use temporary variables
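To make the trade-off concrete, here is a minimal, self-contained Java sketch (the slide’s snippet is C#-style; FooClass and its “bar” value are hypothetical stand-ins, not the actual study code). It shows the create-set-call style, the required-constructor style, and the observed workaround where a required parameter still fails to guarantee a valid object because callers can pass null:

    public class ConstructorStyles {
        // A tiny hypothetical class offering both construction styles.
        static class FooClass {
            private String bar;
            FooClass() { }                            // default constructor (create-set-call)
            FooClass(String bar) { this.bar = bar; }  // required constructor
            void setBar(String bar) { this.bar = bar; }
            void use() { System.out.println("using bar = " + bar); }
        }

        public static void main(String[] args) {
            // Create-set-call: construct, configure, then use.
            FooClass a = new FooClass();
            a.setBar("hello");
            a.use();

            // Required constructor: configuration is forced at creation time.
            FooClass b = new FooClass("hello");
            b.use();

            // Observed workaround: callers can satisfy the signature with null,
            // so the constructor alone does not ensure a valid object.
            FooClass c = new FooClass(null);
            c.use();   // prints "using bar = null"
        }
    }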
“Factory” Pattern

- (Ellis, Stylos, Myers 2007)
- Instead of “normal” creation: Widget w = new Widget();
- Objects must be created by another class:

    AbstractFactory f = AbstractFactory.getDefault();
    Widget w = f.createWidget();

- Used frequently in Java (>61), .Net (>13), and SAP
- Lab study with expert Java programmers
  - Five programming and debugging tasks
  - Within-subject and between-subject measures
- Results:
  - When asked to design on “blank paper,” no one designed a factory
  - Time to develop using factories took 2.1 to 5.3 times longer compared to regular constructors (20:05 vs. 9:31, 7:10 vs. 1:20)
  - All subjects had difficulties using factories in APIs
- Implications: avoid the factory pattern!
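A minimal runnable Java sketch of the comparison, assuming hypothetical Widget and WidgetFactory classes (illustrative only, not the study materials): it contrasts one-step direct construction with the extra indirection that slowed participants down.

    public class FactoryVsConstructor {
        static class Widget {
            Widget() { }                                   // direct construction
            void show() { System.out.println("widget shown"); }
        }

        // Factory variant: clients must go through a second class to obtain a Widget.
        static class WidgetFactory {
            static WidgetFactory getDefault() { return new WidgetFactory(); }
            Widget createWidget() { return new Widget(); }
        }

        public static void main(String[] args) {
            // "Normal" creation: one step, discoverable from the Widget class itself.
            Widget w1 = new Widget();
            w1.show();

            // Factory creation: the indirection must be discovered and navigated first.
            WidgetFactory f = WidgetFactory.getDefault();
            Widget w2 = f.createWidget();
            w2.show();
        }
    }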
Object Method Placement

- (Stylos & Myers, 2008)
- Where to put functions when doing object-oriented design of APIs:

    mail_Server.send( mail_Message )
    vs.
    mail_Message.send( mail_Server )

- When the desired method is on the class that they start with, users were between 2.4 and 11.2 times faster (p < 0.05)
- Starting class can be predicted based on the user’s tasks
[Chart: “Time to Find a Method” – time (min) for methods on expected objects vs. methods on helper objects, across the Email, Web, and Thingies tasks]
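A minimal Java sketch of the two placements, using hypothetical MailServer and MailMessage classes (illustrative only, not the study’s API): send() placed on the message, which users typically start from, versus on the server “helper” object.

    public class MethodPlacement {
        static class MailMessage {
            final String body;
            MailMessage(String body) { this.body = body; }
            // Placement on the expected starting object: the message sends itself.
            void send(MailServer server) { server.deliver(this); }
        }

        static class MailServer {
            // Placement on the helper object: the server sends a message.
            void send(MailMessage message) { deliver(message); }
            void deliver(MailMessage message) { System.out.println("delivering: " + message.body); }
        }

        public static void main(String[] args) {
            MailServer server = new MailServer();
            MailMessage message = new MailMessage("hello");
            server.send(message);    // method on the helper object
            message.send(server);    // method on the expected starting object
        }
    }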
Summary

- CIs and Iterative Design to help design and develop better tools
- User testing is still the “gold standard” for user interface tools
- HE and CD are useful for evaluations