CS 3724 Introduction to HCI
Dr. Scott McCrickard, McBryde 623, mccricks@cs.vt.edu

Who are these people?
• Dr. McCrickard – second-year assistant professor in CS
  – research interests include HCI, dual-task systems, information visualization, user interfaces
• Peter Schoenhoff – graduate student in computer science

Textbooks
• Ben Shneiderman, Designing the User Interface (DTUI)
• Don Norman, The Design of Everyday Things (DOET)

Other Resources
• Email is the best way to contact both Dr. McCrickard (mccricks@cs.vt.edu) and Pete (pschoen@cs.vt.edu)
• The newsgroup (vatech.class.cs3724) is best for questions and comments
• The Web page (courses.cs.vt.edu/~cs3724) contains lecture outlines, assignments, and related materials

Evaluation
• Group project (35%)
• Homeworks (20%)
• Midterm (15%)
• Final (20%)
• Class participation (10%)

What is HCI?
• The Human – single user, groups, I/O channels, memory, reasoning, problem solving, error, psychology
• The Computer – desktop, embedded systems, data entry devices, output devices, memory, processing
• The Interaction – direct/indirect communication, models, frameworks, styles, ergonomics

HCI at VT
• Scott McCrickard
• Doug Bowman
• Chris North
• Manuel Perez
• John Carroll
• Mary Beth Rosson
• Rex Hartson
• Others in CS, ISE, etc.

An Aside: VTURCS
• VTURCS = Virginia Tech Undergraduate Research in Computer Science
• Work with professors on ongoing research projects
• Receive travel money to attend conferences
• Present your work at the annual symposium
• Attend the Project Fair in mid-fall for details (see http://vturcs.cs.vt.edu)

History of HCI
• Vannevar Bush, 1945: "As We May Think"
• Vision of post-war activities; the Memex
• "…when one of these items is in view, the other can be instantly recalled merely by tapping a button"

History of HCI (cont'd)
• J. C. R. Licklider, 1960: "Man-Computer Symbiosis"
• Tightly coupled human brain and machine, speech recognition, time sharing, character recognition

History of HCI (cont'd)
• Douglas Engelbart, 1962: "Augmenting Human Intellect: A Conceptual Framework"
• In 1968, demonstrated a workstation with a mouse, links across documents, and a chorded keyboard

History of HCI (cont'd)
• Xerox Alto and Star
  – Windows
  – Menus
  – Scrollbars
  – Pointing
  – Consistency
• Apple Lisa and Macintosh
  – Inexpensive
  – High-quality graphics
  – Third-party applications

History (and future) of HCI
• Large displays
• Small displays
• Peripheral displays
• Alternative I/O
• Ubiquitous computing
• Virtual environments
• Implants
• Speech recognition
• Multimedia
• Video conferencing
• Artificial intelligence
• Software agents
• Recommender systems
• …

Goals of UI Design
• 5 (plus 1) measurable factors
  – Time to learn
  – Speed of performance
  – Error rate
  – Retention over time
  – Subjective satisfaction
  – Cost

Motivations
• Life-critical
  – Low error rate first and foremost
• Industrial/commercial
  – Speed
  – Error rates
• Office/home
  – Ease of learning
  – High user satisfaction
  – Low cost
• Exploratory
  – Free!

Approach
• Super-reduced scientific method
  – Understand the problem
  – Manipulate a few variables
  – Measure performance
  – Interpret results
  – …then repeat, repeat, repeat

Change Computer Consciousness
• Remove threat/fear/intelligence from the computer
• Be angry when something goes wrong
  – An ambiguous "syntax error" is received
  – Pushing a pull door
• Criticize the design, consider how to fix it

The H, The C, and The I

A Test
That Picture Game

What do humans do well?
• Sense low-level stimuli
• Pattern recognition
• Inductive reasoning
• Multiple strategies
• Adapting
• "Hard and fuzzy things", paraphrasing George Miller

Another Test
Not That Picture Game

What do computers do well?
• Counting and measuring
• Accurate storage and recall
• Rapid and consistent responses
• Data processing/calculation
• Repetitive actions
• "Simple and sharply defined things", again paraphrasing George Miller

The Interaction
• At a high level, let humans do what humans do well and let computers do what computers do well

The Interaction
• Let humans do:
  – Sensing of low-level stimuli
  – Pattern recognition
  – Inductive reasoning
  – Multiple strategies
  – Adapting
  – Creating
• Let computers do:
  – Counting and measuring
  – Accurate storage and recall
  – Rapid and consistent responses
  – Data processing
  – Calculation
  – Repetitive actions

Affordances
• "Perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used" -- DOET
• A chair affords sitting
• A glass affords seeing through (and breaking)
• Wood affords support (and carving)

Poor Affordances

Window Affordances
• Rolling down the window:
  – Up?
  – Down?
  – Automatic?

Wiper Affordances
• Turning on the windshield wipers:
  – Front wipers?
  – Rear wipers?
  – Speed controls?

Trunk Affordances
• Protective cover seems useful, usable
• Is there a downside?

Know Thy User

Affordances in User Interfaces

Principles of Design
• Provide a good conceptual model
  – How does it work?
  – What does it say to the user?
• Make things visible
  – What can the user see/feel/grab/push?
  – What does it look like it will do?

Human Limitations and Error
• Theoretical foundations of human performance
• Expectations and attitudes
• User productivity
• Slips vs. mistakes
• Gulf of execution vs. gulf of evaluation
• Read DTUI ch. 10, DOET ch. 5

Limitations of Short-Term Memory
• Miller's 7 +/- 2 magic number
  – People can recognize 7 +/- 2 chunks of information at a time and hold these chunks in memory for 15-30 seconds
• Chunking – the ability to cluster information together
  – The size of a chunk depends on knowledge, experience, and familiarity

Chunking Example 1
HEC ATR ANU PTH ETR EET

Chunking Example 2
THE CAT RAN UP THE TREE

Chunking Example (cont'd)
HEC ATR ANU PTH ETR EET
THE CAT RAN UP THE TREE

Types of Errors
• Slips
  – Subconscious actions intended to satisfy our goals get waylaid by automatic behavior
  – Proper goal, improper procedure
• Mistakes
  – Conscious deliberation leads to incorrect generalizations or classifications
  – Improperly formed goal

Slips
• Capture errors
• Description errors
• Data-driven errors
• Associative activation errors
• Loss-of-activation errors
• Mode errors

Mistakes
• Misclassifying situations
  – Ex: assuming everyone thinks like you do
• Failing to take all relevant factors into account
  – Ex: ignoring the weather
• Making poor decisions
  – All information accounted for, decision still poor

Designing for Error
• Design to minimize causes of error
• Make it possible to undo actions
• Make it difficult to perform actions that cannot be reversed
• Make it easier to discover errors
• Change the attitude toward errors

Forcing Functions
• Interlocks
  – Force operations to take place in sequence
  – Ex: microwave ovens, the safety on a gun
• Lockins
  – Keep an operation active, preventing stoppage
  – Ex: soft power switches on computers
• Lockouts
  – Prevent events from occurring
  – Ex: no simple passage to the basement
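To make the interlock idea concrete, here is a minimal sketch (a hypothetical MicrowaveOven class, not from the readings) of forcing an operation sequence in code: heating cannot start until the door-closed step has happened, and opening the door stops heating.

```python
class MicrowaveOven:
    """Toy model of an interlock: heating is only possible in sequence,
    after the door has been closed."""

    def __init__(self):
        self.door_closed = False
        self.heating = False

    def close_door(self):
        self.door_closed = True

    def open_door(self):
        self.door_closed = False
        self.heating = False  # opening the door also stops heating

    def start(self):
        # The interlock: refuse the action unless the prerequisite holds.
        if not self.door_closed:
            raise RuntimeError("Interlock: close the door before starting")
        self.heating = True


oven = MicrowaveOven()
oven.close_door()
oven.start()        # allowed: prerequisite satisfied
oven.open_door()    # opening the door forces heating to stop
```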
Conditions for Optimum Problem Solving
• Knowledge of objects and actions
• Feedback about progress to solution
• Minimal delays
• Few errors that are easy to handle
• Low anxiety

Expectations and Attitudes
• Miller's 2-second limit
• Task response times
  – typing/mouse: 50-150 milliseconds
  – simple frequent tasks: 1 second
  – common tasks: 2-4 seconds
  – complex tasks: 8-12 seconds

Response Time Variability
• People prefer shorter response times
  – But are they more productive?
• Shorter response times lead to shorter think times
  – Compile times now and then
  – Web page load times
• Response time choke

Guidelines
• Users prefer shorter response times
• Shorter response times lead to less thinking
• Longer times (>15 sec) are disruptive
• Faster times may increase productivity, but also error rates
• Users should be advised of long delays
• Modest variability in response time is acceptable
• Unexpected delays may be disruptive
• Empirical tests can help set response times

Models of Interaction

Foley & van Dam's Four-Level Approach
• Four top-down levels
  – Conceptual level
  – Semantic level
  – Syntactic level
  – Lexical level
• Top-down nature is modular, easy to explain, and matches software architecture

Card, Moran, & Newell's KLM and GOMS
• Keystroke-level model (KLM)
  – Task acquisition: the user builds a mental representation
  – Task execution: the user uses system facilities
  – Decompose the execution phase into motor and mental operators
• GOMS model
  – Goals
  – Operators
  – Methods
  – Selection rules
• CCT, NGOMSL, TAG, and others elaborated on this model

GOMS Example
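A GOMS-style example can be sketched numerically with the keystroke-level model: execution time is predicted by summing per-operator times. The operator values below are the commonly cited KLM approximations; the task breakdown itself is a hypothetical illustration, not taken from the slides.

```python
# Approximate KLM operator times (seconds), as commonly cited for
# Card, Moran, & Newell's keystroke-level model; treat as rough averages.
OPERATORS = {
    "K": 0.2,   # press a key or mouse button
    "P": 1.1,   # point with the mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence):
    """Sum operator times for a sequence such as 'H M P K K H K'."""
    return sum(OPERATORS[op] for op in sequence.split())

# Hypothetical task: delete a word by double-clicking it with the mouse,
# then pressing the Delete key.
#   H   - move hand to mouse
#   M   - decide where to point
#   P   - point at the word
#   K K - double-click
#   H   - move hand back to keyboard
#   K   - press Delete
print(round(klm_estimate("H M P K K H K"), 2))   # about 3.85 seconds
```

Comparing such estimates for alternative methods (e.g., selecting the word with keyboard shortcuts instead of the mouse) is the typical use of the model.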
Don Norman's Stages of Action Models
• Gulf of execution and gulf of evaluation
• Stages of execution and evaluation
• Seven stages of action

Gulfs of Execution and Evaluation
• Gulf of execution
  – The difference/mismatch between the user's intentions and the allowable actions
• Gulf of evaluation
  – The amount of effort required to interpret the system state and determine how well expectations and intentions have been met

Stages of Execution and Evaluation
• Execution
  – An intention to act so as to achieve a goal
  – The actual sequence of actions that we plan to do
  – The physical execution of that sequence of actions
• Evaluation
  – Perceiving the state of the world
  – Interpreting the perception according to our expectations
  – Evaluating the interpretations against what we expected to happen

Seven Stages of Action

Shneiderman's 8 Golden Rules
• Strive for consistency
• Enable frequent users to use shortcuts
• Offer informative feedback
• Design dialogs to yield closure
• Offer error prevention
• Permit easy reversal of actions
• Support internal locus of control
• Reduce short-term memory load

Introduction to Design and Usability
• Thus far, we have considered
  – Strengths and limitations of the human
  – Characteristics of the computer
  – Guidelines for interaction
• Next questions:
  – How do we design and evaluate interfaces?
  – Why are there so many poorly designed products?
  – How can we evaluate and improve products?

The Evolution of Design
• Design is evolutionary, not revolutionary
  – Few designs are right the first time
  – Test, modify, retest
• Carroll and Rosson's design characterization
  – Design is a process, not a state
  – The design process is nonhierarchical
  – The design process is radically transformational
  – Design involves discovery of new goals

Evolutionary Design Examples
• Telephone
  – Crank, handset, microphone
  – Durability
  – Button size, spacing, and feedback
  – Voice feedback
  – Are new features evolutionary?
• Typewriter
  – Circular, piano keyboards
  – Alphabetical keyboard
  – QWERTY keyboard
  – Shift/tab
  – Dvorak keyboard
  – Chord keyboard
  – Keyboards for VR, wearable computers

Evolution of the PDA
• How not to do it: Apple Newton
  – A do-it-all product that does lots of things poorly
  – Unfocused market -- who wants a $700 personal organizer?
  – Smaller than previous PDAs but still too large for a pocket
  – Poor use of undeveloped technologies: handwriting recognition, word completion

The Doonesbury Effect

A Successful Design: Palm
• Evolution through rapid prototyping
  – A developer "used" a wood block as a PDA
  – Each meeting centered around a prototype
• Well-targeted audience
  – Four basic applications
  – Inexpensive
  – Data synchronization led to multiplatform support

Why Designers Go Astray
• "It probably won a prize" -- putting aesthetics first
• Designers are not typical users
  – Self-designed Web pages
• Clients are often not users
  – Apartment managers
  – State contracts

Design Temptations
• Creeping featurism
  – Word processors
  – PDAs
  – Computers
• Worshipping of false images
  – Audio/video equipment
  – Computers

How to Do Things
• The wrong way
  – Make things invisible
  – Be arbitrary
  – Be inconsistent
  – Make operations unintelligible
  – Be impolite
  – Make operations dangerous
• The right way
  – Use knowledge in the world and knowledge in the head
  – Simplify structure
  – Make things visible (invite exploration)
  – Provide mappings
  – Exploit constraints
  – Design for error

Three Pillars of Design
• Guidelines documents
  – Words/icons, screen layout, I/O, action sequences, training
• User interface software tools
  – HyperCard, MacroMind Director
  – Visual Basic, Delphi, Java, Tcl/Tk
• Expert reviews and usability testing
  – Pilot tests, surveys, analysis, metrics

Development Methodologies
• Business-oriented approaches to software development
• Why use them?
  – Many (most?) software development projects fail to achieve their goals
  – Need to enhance the developer/user relationship
• Academics bridged the way
  – Hix and Hartson 1993; Nielsen 1993

LUCID
• Logical User-Centered Interactive Design methodology (Kreitzberg 1996)
• Six stages:
  – Develop product concept
  – Perform research and needs analysis
  – Design concepts and key-screen prototypes
  – Do iterative design and refinement
  – Implement software
  – Provide rollout support

Ethnographic Observation
• Active participation in workers' daily lives for an extended period of time
• Steps for ethnographers
  – Preparation (understand policies and culture)
  – Field study (establish rapport, observe users)
  – Analysis (compile, reduce, interpret data)
  – Reporting (prepare reports for audiences)

Participatory Design
• Pros
  – More accurate information about tasks
  – Opportunity for users to influence design decisions
  – Increased user acceptance
• Cons
  – Very costly
  – Lengthens implementation time
  – Builds antagonism with users whose ideas are rejected
  – Forces designers to compromise designs

Scenario Development
• Study the range and distribution of task frequencies and sequences
• Collect data about current performance to provide a baseline
• Generate scenarios of usage and then act them out

Legal and Ethical Issues
• Social impact statement
  – Describe the system and its benefits
  – Address concerns and barriers
  – Outline the development process
• Privacy and (vs.?) security
• Safety/reliability
• Copyrights and patents
The IRB at VT
• All human-subject experiments must be cleared by the Institutional Review Board (IRB)
• Must describe the experiment, outline potential harm to participants, and provide a consent form
• Most CS experiments are cleared quickly

Introduction to Usability
• Why do usability testing?
  – Too easy to become entranced with your own design
  – Design methods rarely account for all situations
  – Usability can help gain a competitive edge
  – Legal impacts

Metrics
• The customer or manager sets measurable goals (metrics) for performance
  – Time for users to learn specific functions
  – Speed of task performance
  – Rate of errors by users
• Acceptance tests are used to verify that the metrics have been met

Web Guidelines and Metrics
• Jennifer Fleming (Web Navigation: Designing the User Experience)
  – Be easily learned
  – Remain consistent
  – Provide feedback
  – Provide clear visual messages
  – Support users' goals and behaviors
• Jakob Nielsen (www.alertbox.com)
  – Avoid frame usage
  – Avoid long pages and scrolling
  – Use standard link colors
  – Design for scanning, not reading

HW2: Reverse Engineering Web Metrics
• Can we establish Web metrics by analyzing "good" and "bad" sites?
• Traffic-based analysis (counters) and time-based analysis (paths) are common
• What about quantitative measures for page composition, layout, size, etc.?
• These can be used as design guidelines or in evaluating new sites

Types of Tests
• Expert reviews
• Lab-based usability testing
• Surveys
• Evaluations during use (interviews, focus groups, logging, error reporting)
• Acceptance tests (metrics and guidelines)

Usability Testing
• Typical excuse: nice idea, but time and resources are limited
• Universities: Virginia Tech, CMU, UCSD, Colorado, Michigan
• Industry: IBM, Microsoft
• Consulting companies: Aaron Marcus, Nielsen Norman

VT Usability Lab
• Partitioned into work and evaluation areas
• Two-way mirrors allow observation
• Machines support logging and screen capture
• Video recording and editing equipment

Stages of Usability Testing
• Schedule the facility
• Develop a detailed test plan
  – List of tasks, questions, pilot studies
• Choose participants
  – Concerns include background, education, motivation, physical issues
• Collect and analyze data, generate reports

Observational Techniques
• Think-aloud
  – Users describe what they want to do and why
  – Simple, useful insight
  – Can affect timings
• Post-task walkthrough
  – Discuss alternatives not pursued
  – Reflect back on actions
• Protocol analysis
  – Paper and pencil
  – Audio recording
  – Video recording
  – Computer logging
  – User notebooks
  – Automatic protocol analysis tools

Variants of Usability Testing
• Nielsen's discount usability engineering
  – Lowers time and money barriers
• Field testing
  – Logging, portable usability labs, beta tests
• Can-you-break-this approach
• Competitive usability testing
  – Parallel tasks on competing interfaces

Drawbacks of Usability Testing
• Emphasizes first-time usage
  – Tests are at most 2-4 hours
  – Hard to translate into weeks or months of use
• Limited coverage of interface features
  – Difficult to adequately test all features in a short time
• Setting is often unrealistic

Questionnaires
• General
  – Establish background, gender, experience
  – Usually not opinions
• Open-ended
  – Unprompted opinion on a question
  – Answers are often too brief and hard to summarize
• Scalar
  – Judge a statement on a numeric scale
  – Granularity is usually 5 or 7 to differentiate yet maintain clarity
• Multiple-choice
  – Choose one or more options
• Ranked
  – Place items in order

QUIS
• Questionnaire for User Interface Satisfaction (QUIS)
• Covers interface details (readability of characters, meaning of icons, shortcuts)
• Used for comparing programming environments, applications, Web sites
• Available free for academia, licensed to industry; operators are standing by

HW2: Reverse Engineering Web Metrics
• Due Oct 11 (NOTE DATE CHANGE)
• Can we establish Web metrics by analyzing "good" and "bad" sites?
• What about quantitative measures for page composition, layout, size, etc.?
• These can be used as design guidelines or in evaluating new sites
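As one possible starting point for the quantitative page-composition measures HW2 asks about, the sketch below (a hypothetical helper, not part of the assignment handout) counts links, images, and words in a page's HTML using only the Python standard library.

```python
from html.parser import HTMLParser

class PageMetrics(HTMLParser):
    """Collect simple page-composition metrics: link, image, and word counts."""

    def __init__(self):
        super().__init__()
        self.links = 0
        self.images = 0
        self.words = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += 1
        elif tag == "img":
            self.images += 1

    def handle_data(self, data):
        self.words += len(data.split())

# Example usage on a small HTML snippet; a real analysis would feed in
# the fetched page source of each "good" and "bad" site being compared.
sample = '<html><body><p>Welcome to the lab.</p><a href="x">map</a><img src="y.gif"></body></html>'
m = PageMetrics()
m.feed(sample)
print(m.links, m.images, m.words)   # 1 1 5
```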
CS 3724 Group Project
• Part 0 due Oct 4 (see Web page)
• Analyze, design, and evaluate an interface
• Reuse ongoing or previous projects from other classes or jobs
• Prepare and maintain a project workbook with links to reports

Midterm
• Oct 2 in class
• Closed book, closed notes
• Some timeline and multiple-choice questions
• Mostly short answer and essay
• No class Oct 4

Early Interaction Paradigms
• Time-sharing
  – Licklider, ARPA
• Video display units
  – Sutherland, MIT -- Sketchpad, visual displays
• Programming toolkits
  – Engelbart -- carpenter metaphor
• Personal computing
  – Seymour Papert -- Logo; Alan Kay -- Smalltalk

Interaction Paradigms
• Windows and WIMPs
  – Engelbart; Xerox Alto and Star
• Metaphors
  – Increase initial familiarity (desktop, desk)
  – Inadequate in promoting or supporting full understanding (dragging a disk to the trash)
• Direct manipulation (DM)

Characteristics of DM Systems
• Replacement of common-language syntax by direct manipulation of the objects of interest
• Continuous visibility of the objects and actions of interest
• Rapid, reversible, incremental actions

Early DM Systems
• VisiCalc
  – Instantly calculating electronic spreadsheet
  – "the software tail that wags (and sells) the PC dog", Ben Rosen
• Lotus 1-2-3, Microsoft Excel
  – Graphics, 3D, multiple windows, databases

DM and Games
• PONG
• Space Invaders, Missile Command, Centipede
• Maze games: Pac-Man, Donkey Kong
• 3D games: Doom, Duke Nukem, Myst

Features of Games
• Internal locus of control
• Continuous feedback (score)
• Entertainment and the challenge of mastery
• Focus on the task, limited distractions
• New I/O devices
• User interface advances
• Advanced graphical techniques

DM and Text Editing
• Line-oriented editors
  – Work with a single line at a time
  – Example: ed
• Full-page display editors
  – View a full screen of text, edit directly
  – Examples: vi, emacs
• Point-and-click editors
  – Use the mouse for selection and moving
  – Example: xemacs

WYSIWYG
• What You See Is What You Get
• The document prints in its on-screen format
• Cursor action is visible, controls are obvious
• Labeled icons for actions
• Results of actions are displayed immediately
• Actions are easily reversible

Advantages of DM
• Easy to learn and remember
• Direct, WYSIWYG
• Flexible, easily reversible actions
• Context and visual feedback
• Exploits visual and spatial cues
• Limits the types of errors

Drawbacks of DM
• What you see is all you get
• Wastes screen space
• Difficult to convey some meanings
• Visual representations can be misleading
• The mouse can be slow
• Not self-explanatory
• Not good at
  – Repetition
  – History storage and display
  – Certain tasks with no manual equivalent
  – Macro creation

Alternatives to DM
• Hypertext and the World Wide Web
  – Vannevar Bush and the Memex
• CSCW and groupware
• Agent-based interfaces
  – Eager (the cat), Microsoft Clippit
• Ubiquitous computing
  – Mark Weiser and the dangling string
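Designing for Error, the Golden Rules, and the DM slides above all call for easily reversible actions. One common way to support reversal is to record an inverse operation for every edit; the sketch below (a hypothetical TextBuffer, not tied to any toolkit mentioned in the slides) illustrates that undo-stack idea.

```python
class TextBuffer:
    """Tiny editable buffer whose edits can be undone via a command stack."""

    def __init__(self):
        self.text = ""
        self._undo_stack = []   # each entry records how to reverse one edit

    def insert(self, pos, s):
        self.text = self.text[:pos] + s + self.text[pos:]
        self._undo_stack.append(("delete", pos, len(s)))   # inverse operation

    def delete(self, pos, length):
        removed = self.text[pos:pos + length]
        self.text = self.text[:pos] + self.text[pos + length:]
        self._undo_stack.append(("insert", pos, removed))  # inverse operation

    def undo(self):
        if not self._undo_stack:
            return
        op, pos, arg = self._undo_stack.pop()
        if op == "delete":
            self.text = self.text[:pos] + self.text[pos + arg:]
        else:  # "insert"
            self.text = self.text[:pos] + arg + self.text[pos:]


buf = TextBuffer()
buf.insert(0, "hello world")
buf.delete(5, 6)     # remove " world"
print(buf.text)      # hello
buf.undo()
print(buf.text)      # hello world
```

Because every edit pushes its own inverse, undo simply replays the most recent inverse, which is what lets an interface make even destructive actions feel safe to explore.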