Assignment 8: Test your prototype

Brief

The goal of this assignment is to test your prototype with people and learn from that experience. To get the kinks out of your application, first test it with a couple of people.

Assignment

Watch people use your prototype

Watch at least two different people use your interactive prototype. These sessions do not have to be scheduled ahead of time, but try to recruit people who are reasonably similar to the real people who will be using your application.

Just like when you conducted the Heuristic Evaluations, each team member will have their own role during testing. One person will be the facilitator, and one will be in charge of taking notes/photos/video/etc. Any team member who is not currently in one of those two roles should observe and take notes. This time your user will not be writing down the problems they find for you. It's your job to learn what the people testing your prototype are thinking; the feedback they provide will be invaluable for your next iteration. Your goal is to find ways to improve your interface. Look for breakdowns and pain points in your interface and try to understand what the problems are and how you might fix them.

When possible, modify/update your prototype before running the session with the next participant. Keep the task consistent between participants: use the same script, follow the same protocol, and answer questions in the same way. This is a fast study, not a formal one. Still, any study requires a bit of planning: write down what you are going to ask people to do, and do not vary it between participants. Submit your plan, task description, and a signed consent form for each participant. Here is a consent form that you can modify to use for your study. Immediately after each test, do a quick debrief with your team and write down any reactions or thoughts that came up. You will most likely forget them otherwise, so it's important to write them down right after the test.

Submit a photo or sketch of each participant testing your prototype. As with the needfinding assignment, these photos and captions should show breakdowns and design opportunities. How you compose your photo will depend on the app/task. If the key thing is that it happens in a particular location (say in a grocery store), make sure to show the context. If the key issue is what's on screen, take an over-the-shoulder shot of the person interacting.

Results

After testing, take some time with your team to reflect on your findings. Go through all the notes and other recordings. Try to be objective; don't write problems off. Discuss as a team and try to identify some general patterns in people's behavior. When you find interesting points, talk about them in depth: ask each other questions, recreate the different tests, analyze the decisions people made, other paths they could have taken, and so on. Let your insights guide redesigns for your next prototype iteration. The end product of your discussion should be a list of changes that you will implement. Make this list detailed and understandable to people outside of your team. Of the bugs you have identified, fix the ones that are either small and easy to fix or too severe to ignore. Make sure that you do this before moving on to the next step of this assignment.

Create an alternative design

Select one interface element of your prototype and redesign that part. You will then implement the redesign of this component, so keep this in mind when considering the scale and difficulty of your redesign. This is not a huge redesign of your prototype; select something small and manageable in scope. Designers often go through many, many different versions of prototypes before their final design, so try to get familiar with that process (given our time constraints for the course).

Plan to test that alternative online

Next week, you will use the Web to gather usage data about your application by launching an A/B test online. Refer to the lecture notes regarding controlled web experiments (e.g. A/B testing, control groups, etc.). Think about what you are asking, and how to prepare a study to answer it. Try to think of A/B tests that will give you information that this week's user testing didn't (or couldn't). Next week, when you launch your A/B test online, the alternative redesign you implemented this week will be the 'B' of your A/B test. You will need to have access to both versions of this component to run your online tests - don't write over your old code!
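
To make this concrete, here is a minimal client-side sketch of one way to split visitors between the two versions of your component and record what they actually do. The storage key, logging endpoint, and CSS class are hypothetical placeholders, not part of the assignment; adapt them to however your app is built.

```typescript
// Sketch of client-side A/B assignment and event logging (assumed names:
// "ab_variant" storage key, "/log" endpoint, "redesigned-component" class).

type Variant = "A" | "B";

// Assign each visitor to a variant once and remember it in localStorage,
// so the same person always sees the same version of the component.
function getVariant(): Variant {
  const stored = localStorage.getItem("ab_variant");
  if (stored === "A" || stored === "B") return stored;
  const variant: Variant = Math.random() < 0.5 ? "A" : "B";
  localStorage.setItem("ab_variant", variant);
  return variant;
}

// Record what people actually do (task completed, button clicked, etc.),
// tagged with the variant, so you can compare behavior rather than opinions.
function logEvent(name: string): void {
  const payload = { variant: getVariant(), event: name, time: Date.now() };
  navigator.sendBeacon("/log", JSON.stringify(payload));
}

// Example: show the redesigned component only to the 'B' group.
if (getVariant() === "B") {
  document.body.classList.add("redesigned-component");
}
logEvent("page_view");
```

Remembering the assignment (here via localStorage) matters: if the same person saw version A on one visit and version B on the next, your two conditions would get mixed together and the comparison would be muddied.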

Student Examples

Here are three student examples from last year. Keep in mind that last year, students had to submit paper prototypes of their redesign in addition to the implemented version.

  • Example 1 - This is an example of an A+ level assignment. This group obviously put a lot of thought into their in-person test and was able to motivate their redesign with the conclusions they drew from it.
  • Example 2 - This is an example of a B level assignment. This group lost points for not including their consent form for the in-person test. We also wished the feedback had been more substantive, going beyond obvious usability bugs (one of which had been mentioned by the TA in a previous assignment). For the online test description, we were not convinced that measuring click rates was the right metric for success.
  • Example 3 - This is an example of an A level assignment. We liked the clean and well-captioned photos for each participant testing their app. They also tested more than the required two users.

Submit

  • Captioned photos for each participant testing your prototype, a study plan and signed consent forms.
  • A list of changes you will implement in your next iteration. For each change, include a brief explanation for why you selected it. Describe it with enough detail that someone outside of your team can understand it.
  • URL of the implemented alternative redesign of one interface element.
  • Description of online test. In 2 to 3 sentences, describe the online test you will run for the next assignment. How will you measure which alternative is better? Remember, what people do is different from what they say. (A sketch of one way to compare the two versions' numbers appears after this list.)
  • Last week's and this week's PDFs of your development plan. We recommend submitting a PDF of your Google Spreadsheet. This gives you a snapshot for comparison.
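
If you decide to compare something like click-through or task-completion rates between the two versions, a two-proportion z-test is one simple way to judge whether the difference you observe is likely to be real rather than noise. This is only an illustrative sketch with made-up numbers; use whatever metric and analysis your study question actually calls for.

```typescript
// Hypothetical sketch: comparing click-through rates of variants A and B
// with a two-proportion z-test. The counts are placeholders; plug in the
// numbers your logging endpoint actually collects.
function twoProportionZ(clicksA: number, viewsA: number,
                        clicksB: number, viewsB: number): number {
  const pA = clicksA / viewsA;
  const pB = clicksB / viewsB;
  const pPooled = (clicksA + clicksB) / (viewsA + viewsB);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / viewsA + 1 / viewsB));
  return (pB - pA) / se; // |z| > ~1.96 suggests a real difference at p < 0.05
}

// Example with placeholder numbers: 40/500 clicks on A vs. 65/500 on B.
console.log(twoProportionZ(40, 500, 65, 500).toFixed(2));
```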

Evaluation criteria & Grading rubric

Each category is scored on a scale from Nope through Adequacy and Proficiency up to Mastery.

In-person Test (4 points)
  • 0: No user study was performed.
  • 1: Photos of only one participant, the study was poorly planned, or study materials were not included.
  • 3: Photos of two or more participants. The study captured some information, but is incomplete. The captions don't reflect the breakdown shown in the photos, or the photos don't demonstrate a breakdown.
  • 4: Photos of two or more participants. Study materials (photos w/captions, plan, consent forms) are complete and reveal some very useful information. The photos and the captions show clear breakdowns.

Planned Changes based on Test (2 points)
  • 0: No changes listed, or irrelevant changes.
  • 1: The student presented several possible changes derived from the user testing data, although not all of the changes were useful or some important changes were overlooked.
  • 2: The student suggested several possible changes based on the user testing, all of which were important and directly addressed the problems identified in user testing.

Alternative Design (3 points)
  • 0: No redesign, or an irrelevant redesign.
  • 1: The redesign doesn't address an issue found in testing and seems unlikely to satisfy a real user need.
  • 2: The redesign addresses an issue found in testing, but seems unlikely to satisfy a real user need.
  • 3: The redesign addresses an issue found in testing and satisfies a real user need.

Description of Planned Online Test (2 points)
  • 0: No description submitted, or the online test is not well thought out.
  • 1: The online test is designed to produce some useful data; however, it may not be motivated by real usability issues.
  • 2: The online test is clearly motivated or innovative in a way that will provide rich and interesting data.

Updated Development Plan (2 points)
  • 0: No updates or only minor changes to plan.
  • 1: Plan is mostly updated, but is lacking some detail or deadlines seem unreasonable.
  • 2: Plan is detailed and reflects progress, new tasks, and any changes to previous tasks.

Outside the Box
1 point. Awarded to up to 5% of submissions.
The alternative redesign is more than a structural change: it presents a noticeably different UI overall and gives a different user experience that competes well with the current version (such that you are almost unsure which UI would actually perform better).