Alex Odle, Rylan Hawkins, Mazen Hassan, Joe Woo
April 20, 2008
CSE 490L
swickr.blogspot.com
http://www.cs.washington.edu/orgs/acm/cse490/Swickr_LowFiDocument.doc
Swickr – Remote Ordering
I. Introduction
Standing in line and waiting for an order at a local coffee shop or quick serve restaurant adds to the
many inconveniences of daily life. A significant amount of time and money is being wasted with these
inefficient processes. With the advent of Swickr, a mobile website, we hope to streamline the ordering
process so that consumers can order and pay on their way to the store. That way,
customers’ orders will be ready by the time they arrive. The goal of this experiment is to determine
the usability of our UI and ultimately reduce the transaction costs and automate the process of paying
for and receiving food.
II. Prototype
The prototype was a mobile version of an e-commerce site, with some alterations and
simplifications to match our interface, goals, and project definition. It followed many common design
patterns of e-commerce sites, adapted to the needs of the system. It began with either a home page with
an upfront value proposition (for guests) or a personal home page with favorited orders. Both pages allowed
searching for local stores within the area.
Figure 2a. Guest Login
Figure 2b. Logged in user
From there, users could proceed through a map view to find the local store, whereupon they would begin
to navigate its menu.
Figure 2c. Map View
Figure 2d. Menu
Figure 2e. Types of Frappuccinos
After navigating the menu, or after choosing a particular favorite order, the user would be
able to view their cart, customize their order, and place the order. Users could customize
each item by clicking the ‘customize’ button, which expands an accordion control (see Figure 2i
below).
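The accordion behavior described above can be sketched as a small piece of state logic: at most one cart item's customization panel is open at a time, and opening another collapses the first. This is an illustrative sketch of ours (the class and item names are hypothetical), not code from the prototype, which was paper-based:

```typescript
// Sketch of the cart's accordion control: at most one item's
// customization panel is open at a time. Names are illustrative,
// not taken from the actual prototype.
class CartAccordion {
  private openItem: string | null = null;

  // Toggle the customization panel for a cart item; opening one
  // item implicitly closes any other.
  toggle(itemId: string): void {
    this.openItem = this.openItem === itemId ? null : itemId;
  }

  isOpen(itemId: string): boolean {
    return this.openItem === itemId;
  }
}

const acc = new CartAccordion();
acc.toggle("grande-vanilla-frap"); // expands this item's panel
acc.toggle("tall-caramel-frap");   // expanding another collapses the first
console.log(acc.isOpen("grande-vanilla-frap")); // false
console.log(acc.isOpen("tall-caramel-frap"));   // true
```

Collapsing panels by default keeps each cart row compact on a small phone screen, which is why we chose the accordion over a separate customization page.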
Figure 2f. Cart
Figure 2g. Cart after user customization
Figure 2h. Confirmation page
Figure 2i. Item customization
Figure 2j. Entire paper-based system
The prototype was built primarily for an iPhone web interface and used many practices common to
iPhone interfaces, such as back buttons labeled with the previous screen, large touch-friendly buttons, and top navigation
bars. The prototype images above show the basic sequence of pages; all page images are given in
the appendix.
III. Method
The following section covers the overall method of our paper prototyping usability study: the
types of participants we used, the environment in which the study was conducted, the tasks we asked
our participants to complete, the procedure we followed, and our test measures of success.
Participants
Our participants were selected pseudo-randomly from a pool of about 40 college students living in
off-campus housing. Our method of selection combined simply asking the first people who
crossed our path with being slightly selective, in an attempt to get a group with a diverse range of technical
savvy. As a result we recruited four participants, three male and one female, between the ages of 19 and
22. The self-described technical savvy of the participants ranged from somewhat to extremely tech
savvy.
Environment
The study was held in the corner of a large, empty dining room. Each participant was asked by our
greeter, Joe, to sit at a dining table on which a piece of paper with an iPhone-like
background drawn on it had been placed. To the left of the participant sat our computer, Rylan; to the right, our
observer, Alex; and across sat our facilitator, Joe, who played two roles because Mazen was sick and could not
make the meeting.
The paper prototypes were photocopied cutouts, all varying slightly in size but for the most part close to
the size of a normal iPhone screen. The current screen was placed in front of the participant over the
iPhone background that remained throughout the test. The participant was told to treat the interface as
they would a normal iPhone application, but was informed that it would actually be a web application in
order to provide some context.
Tasks
Our participants were asked to complete three tasks ranging from easy to difficult. The following is an outline
of the tasks in the order we gave them (see Appendix – User Instructions and
Script).
1. Difficult – Find the closest Starbucks and order a Grande Vanilla Frappuccino. In addition, add
this item to your favorites.
2. Easy – Now that this order is one of your favorites, make this order again.
3. Medium – Make this same order again, but include a Tall Caramel Frappuccino as well.
The tasks were chosen in this order because it makes the most sense chronologically for a new customer.
Task 1 requires the customer to, at some point, create a new account and add billing information while making
the order. Task 2 makes use of the fact that the customer added the item from Task 1 to his favorites,
setting up what should be the easiest task of the three. Task 3 also makes use of the fact that the item
from Task 1 is already a favorite, but adds a slight twist in that the customer must make an order
for a friend as well.
Procedure
Our group met beforehand to discuss roles and responsibilities for the user tests. We assigned roles
based on the suggested procedure from both the slides and the Snyder chapter. They were as follows:
- Greeter – Joe
- Facilitator – Joe (Mazen was sick)
- Computer – Rylan
- Observer – Alex
On the day of the tests, we all met at a large off-campus house and assembled our equipment in the
dining room. Joe, our greeter, went to find our participants one at a time. He explained to them that
they would be taking part in a 20-minute usability study. He assured them their identities would be kept
anonymous, and that it was actually us being tested, not them. He also encouraged them to
express their thoughts about our interface and to vocalize exactly what they were doing (e.g. “I’m clicking
on the confirm button”).
Joe also played the role of facilitator in place of Mazen, who could not make it due to illness. After his
role as greeter, Joe sat across from the participant with a script (Appendix – Script) and a set of three
note cards (Appendix – User Instructions) in front of him. He read from the script, which had a detailed,
context-based description of the task we wanted the participant to complete. For example, for the first task,
which was to find the closest Starbucks and order a Frappuccino, the participant was told they only had
20 minutes before their next class and were craving a cold drink on a hot day. Based on a friend’s
recommendation, they wanted to try Swickr in order to save time. After reading the task from the script,
Joe placed a note card with a brief description of the task in front of the participant.
Our computer, Rylan, sat next to the participant and said little to nothing. In front of him was a table full
of cutout pieces of paper representing each possible screen our participants could encounter while
trying to complete the tasks. Rylan’s job was to simulate the program logic by placing the correct screen
in front of the user based on the user’s actions. If a user tried to do something unexpected, Rylan
placed a dialog box over the screen stating “Under Construction.”
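In effect, the "computer" role acted out a screen-transition table. A minimal sketch of that logic, with screen and action names that are illustrative rather than the prototype's actual labels:

```typescript
// Sketch of the human "computer" role: a lookup from (current screen,
// user action) to the next screen to place on the table. Anything
// unmapped gets the "Under Construction" dialog. Names are illustrative.
type Screen = string;
type Action = string;

const transitions: Record<Screen, Record<Action, Screen>> = {
  home: { search: "map", tapFavorite: "cart" },
  map: { pickStore: "menu" },
  menu: { addItem: "cart" },
  cart: { placeOrder: "confirmation", customize: "cart" },
};

// Return the next screen for a given action on the current screen.
function nextScreen(current: Screen, action: Action): Screen {
  return transitions[current]?.[action] ?? "Under Construction";
}

console.log(nextScreen("home", "search"));     // "map"
console.log(nextScreen("menu", "placeOrder")); // "Under Construction"
```

Enumerating the transitions ahead of time is what let the computer respond quickly and consistently during the sessions; the fallback screen covered everything we had not anticipated.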
Alex, our observer, also sat next to the participant and said very little throughout the process. His job
was to document the tests by noting interesting behavior of the participant (see Appendix – Observer
Notes). This included errors, successes, unexpected behavior by the user, and both negative and positive
comments regarding the UI. In addition, Alex gathered basic information about the users before each
test, including age, sex, whether or not they use a smartphone, and tech savvy.
After all the tasks were complete in each test, the participants were asked to express their overall
thoughts of the UI. In addition, our group would then ask them about specific issues we noticed during
the procedure, such as why they clicked a certain button over another. All of this information was
documented by our observer.
Test Measures
Since this wasn’t a scientific study in terms of keeping control variables, our test measures were
more subjective than in a typical user study. We did keep track of the number of errors for each
participant (although the definition of “error” is a little ambiguous) and used that as a general metric for
usability. We weighted the importance of each error based on both how critical it was to completing the
customer’s goal and the task during which it occurred. We considered errors on the most common tasks,
according to our task analysis survey, to be more important than those that occurred on less common
tasks. In addition, we recorded comments by the user, both good and bad, and asked the user questions
about the UI after their test. We used all of that information to gauge the overall ease of use of our current
design.
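The weighting idea can be sketched as a small scoring function. The specific weights below are illustrative assumptions of ours, not values we formally calibrated:

```typescript
// Sketch of our error-weighting idea: an error counts for more when it
// blocks the customer's goal and when it occurs on a common task.
// The numeric weights are illustrative assumptions, not calibrated values.
interface ObservedError {
  critical: boolean;     // did it block completing the customer's goal?
  taskFrequency: number; // how common the task is (0..1, from task analysis)
}

function usabilityScore(errors: ObservedError[]): number {
  return errors.reduce((total, e) => {
    const severity = e.critical ? 2 : 1; // critical errors count double
    const weight = 1 + e.taskFrequency;  // common-task errors weigh more
    return total + severity * weight;
  }, 0);
}

// Two minor errors on a rare task score lower than one critical
// error on the most common task.
const minor = usabilityScore([
  { critical: false, taskFrequency: 0.1 },
  { critical: false, taskFrequency: 0.1 },
]); // 2.2
const blocking = usabilityScore([{ critical: true, taskFrequency: 0.9 }]); // 3.8
```

A lower score means a smoother session; in practice we applied this weighting informally when comparing participants rather than computing a number.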
IV. Results
The results section will cover the individual and overall results of our usability study, including
comments from each participant and overall trends we noticed during and after the tests.
Individual
Participant 1, a 22-year-old male with moderate tech savvy and experience with both smartphones and
ecommerce, encountered five small errors, most of which were in the difficult task. The most
noticeable were choosing the wrong Starbucks on the search page and finding the confirm order button.
Both of those errors, however, turned out to reflect confusion about the paper
prototyping procedure rather than a fault in our UI. In debriefing, Participant 1 suggested we add a
“Change” link to our order confirm page, as he didn’t like the idea of clicking the back arrow to go
make changes. See Figure 4a (below) for the change we made based on his suggestion.
Figure 4a. Adding “Change” link to Order Confirmation page.
Participant 2, a self-described “decently” tech-savvy 19-year-old male with no experience using
smartphones, encountered six errors during the test. Most notably, Participant 2 did not recognize the
difference between the Featured Products that show up on the homepage when not logged in and the
Favorites that show up when you are. In addition, Participant 2 mentioned he was reluctant to add the
item to his favorites because he thought the button would take him to a new page. As a result, we made
two changes to the UI. Figure 4b (below) shows the star we added next to the Favorites title to
differentiate it from the Featured Products. We thought the star was a good choice considering it
has become a design pattern through its common use in Microsoft Internet Explorer and Gmail.
Figure 4b. Adding star next to favorites on Welcome Page.
Figure 4c (below) shows the checkbox we added next to the “Add To Favorites” link on the Order Confirm
page in order to assure users they wouldn’t leave the current page by clicking it.
Figure 4c. Changed from button to checkbox
Participants 3 and 4, a 21-year-old male and a 21-year-old female respectively, had a total of only three errors
between them. We made no changes between these two tests, as we considered three minor errors
acceptable for the time being.
Overall
Overall, the results from our usability study were promising. All the participants were able to complete
the tasks in a reasonable amount of time, and all had positive comments regarding the overall user-friendliness of the system. In addition, we found two trends that were particularly encouraging.
1. The number of errors dramatically decreased as we made changes based on participant
feedback.
2. The two tasks that are most common according to our task analysis survey had very few total
errors.
The first point means that the iterative changes made during the usability study were effective in
helping the later participants, and the second means that the most common tasks, according to
our task analysis survey, were the easiest to complete.
Figure 4d (below) shows the number of errors made by each participant in the order they tested.
Figure 4d. Number of errors for each participant in chronological order.
The graph shows a clear downward trend as more changes were made to the UI. After Participant 1’s
test, we made one change and actually saw a slight increase in errors. After Participant 2, however, we
made two more changes and saw a dramatic decrease in errors, from six to two. Making no changes after
Participant 3 yielded one error for Participant 4.
Figure 4f (below) shows the number of errors during each task, from difficult to easy. When looking at
the results, it is important to note that our task analysis survey showed that our easiest task was by far
the most common.
Figure 4f. Number of errors during each task, from difficult to easy.
Although the total number of errors is not outstanding, the relative distribution is what we hoped for.
Our most common task, by a large margin, is Task 2, and only one error was encountered across all four
participants. Task 1 had a total of nine minor errors, just over two per person, but it is important to note that
the task proved to be relatively uncommon based on our task analysis survey. It also included registering
and adding a payment method, which should not happen more than once for any given customer.
V. Discussion
The results from the experiments confirmed some of our assumptions but also brought up some
interesting use cases we had not considered before. However, given the small sample size and
homogeneity of our users (all college students, all familiar with ecommerce), we must also
recognize some of the limitations of this experiment.
Many of our test subjects’ comments regarding confusion with the interface were directly tied to their
experiences and expectations from traditional ecommerce sites. For example, the second test subject did
not want to click the ‘add to favorites’ button, since his past experience with websites was that
such buttons navigate you away from the current page. In designing our user interface we have to closely
emulate the behaviors that users expect from traditional ecommerce websites. After the user made his
concern about the button known, we changed the ‘add to favorites’ button to a checkbox to indicate
that adding that particular item to favorites would not take him away from his current page.
Another issue we came across was that individuals had difficulty differentiating between the featured
products and favorites sections. The main culprit is that, prior to adding their first favorite, the
favorites section does not exist and its place is taken by the ‘featured products’ box. However, after
they add their first favorite, the new ‘favorites’ box pushes down the ‘featured products’ box. In Figure 5a
(below), the two homepages (left without favorites, right with favorites) are
especially hard to tell apart. To remedy this, we added a star icon to mark the
‘favorites’ box.
Figure 5a. Homepage without favorites (left) and with favorites (right).
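The layout behavior behind this confusion can be sketched as a simple rendering rule (a hypothetical sketch of ours; the actual prototype was paper and these names are our own):

```typescript
// Sketch of the homepage layout rule that caused the confusion: the
// favorites box only appears once the user has favorites, and it pushes
// the featured-products box down. The star gives the favorites box a
// distinct visual marker. All names here are illustrative.
interface HomeSection {
  title: string;
  items: string[];
}

function homepageSections(favorites: string[], featured: string[]): HomeSection[] {
  const sections: HomeSection[] = [];
  if (favorites.length > 0) {
    // Star icon distinguishes favorites from featured products.
    sections.push({ title: "★ Favorites", items: favorites });
  }
  sections.push({ title: "Featured Products", items: featured });
  return sections;
}

// Without favorites, only "Featured Products" shows; with one favorite,
// a visually distinct "★ Favorites" box appears above it.
```

Because the two boxes occupy the same position depending on login and favorites state, a purely positional cue was not enough; the star makes the distinction explicit regardless of layout.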
However, there were limitations as to what this experiment could reveal. Since we did not have a very
diverse test group, we could not test how other users (e.g. older, less tech savvy) would react to our
product.
VI. Appendix
Images of All Slides Used and Other Artifacts
http://www.twango.com/media/joewoo.swickr/
Videos of Usability Tests
http://www.twango.com/media/joewoo.swickr/joewoo.10182
http://www.twango.com/media/joewoo.swickr/joewoo.10181
http://www.twango.com/media/joewoo.swickr/joewoo.10179
Consent Forms
Karl Hisch
Lauron Placentia
Mali Brynestad
User Instructions
Script
Your friend told you about a cool new site called Swickr, a web application that allows you to remotely
order food. However, before he is able to tell you how to use it, he has to go. You have 20 minutes until
your next class and you really want a grande vanilla Starbucks Frappuccino with no whip.
--- WAIT UNTIL FINISHED WITH TASK ---
Add the item you just ordered to your favorites.
--- WAIT UNTIL FINISHED WITH TASK ---
You meet up with your friend the next day just before you’re about to use Swickr to order your favorite
Frappuccino. You offer to add his order of a caramel Frappuccino to your own.
Observer Notes
Test 1
Tester –
- Karl Haisch – 22/m, moderately tech savvy, high experience with smartphones/ecommerce
Test 1 – Scenario 1
- Confused at the map, picked the wrong one because he didn’t notice that it was the closest?
- Had an easy time with no whip
- Not sure if he should place the order, “usually when you place the order you go to a payments page”
- He says not to go to the login page after placing an order, but I think we should
- He is somewhat concerned about putting in his credit card info, “I don’t know if this is an iPhone app or the browser”
  - Once he found out it was a browser app he was OK, because he thought it wouldn’t be stored on his phone
- Had a little trouble finding confirm order, maybe he was just closely reading over the confirmation
Test 1 – Scenario 2
- Easy
Test 1 – Scenario 3
- Chose to customize to the tall – it didn’t say what size it was
Test 1 – Comments
- “When is the first time customize shows up?” – Have the favorites go straight to the shopping cart
- Put a Change Order on the confirm order page, didn’t like going straight back through the pages
Test 2
Tester –
- Ryan McKinney – 19/m, never used an iPhone/smartphone, never used a Mac, “decently” tech savvy
Changes for Test 2
- Added “Change” link to the confirm order page, does the same thing as the “Cart” button but doesn’t feel like moving back
Test 2 – Scenario 1
- Didn’t notice that those were “featured products”, clicked the Starbucks to find Starbucks – we brought up the search page
- Didn’t know how to say no whip
  - tried to change milk, but noticed it wasn’t right
  - then clicked notes and typed in no whipped cream
- Noticed the favorites, that’s good
- Had no problem typing in all his credit card info
- Didn’t click on the University Starbucks link at the end, maybe he knew where it is
Test 2 – Scenario 2
- Didn’t notice that those were “favorites”, but clicked it anyway because he did that last time, didn’t really talk about how it behaved very differently
- Everything else was easy
Test 2 – Scenario 3
- Used the favorites again
- Used the “back to the cart” button instead of our new Change button, was fine with that
- Took a while to find “order more” on the cart page
- Tried to click on “tall” to change size in the shopping cart, but it didn’t work, then clicked customize
Test 2 – Comments
- Didn’t want to click favorites because he thought it would take him to a new page, we should switch to a checkbox-like thing
- Maybe use “Stores near me” on the front page, he did see featured products but was a little confused about what it meant
- Didn’t mind putting in billing info, even when asked about it
Test 3
Tester –
- Lauron – 21/m, has a regular phone, uses ecommerce, pretty tech savvy
Changes for Test 3
- Added star to favorites to differentiate from Featured Products
- Changed the “add to favorites” button to a checkbox so people wouldn’t feel hesitant about clicking it
Test 3 – Scenario 1
- “Should I type in an address?” Typed in the address of the Starbucks he wanted
- “Why wouldn’t you have a button for just Find the Closest Starbucks?”
- Changed his search to Starbucks after thinking about it
- Thought he was done after pressing “Add order to favorites”, also didn’t click done after customizing the order
- Had no problem with typing in credit card info
- “Does it say when it’s going to be ready?”
Test 3 – Scenario 2
- Easy
Test 3 – Scenario 3
- Pressed order details “Change” on the confirm page
- Easy
Test 3 – Comments
- “I don’t know how much I’d use it”
Test 4
Tester –
- Mali – 20/f, uses a smartphone, medium tech savvy
Test 4 – Scenario 1
- Tried to go back to the main page, and hit view more
- Got the right Starbucks on the second try
Test 4 – Scenario 2
- Easy
Test 4 – Scenario 3
- Clicked the “Change” button on the confirm page
- Easy
Test 4 – Comments
- The first scenario was confusing, just getting used to the idea of an iPhone
- Didn’t know you could click on the map icons