Assignment 8: Test your prototype


The goal of this assignment is to test your prototype with people and learn from that experience. To get the kinks out of your application, first test it with a couple of people.


Step 1: Watch people use your prototype

Watch at least two different people use your interactive prototype. These sessions do not have to be planned out far ahead of time, but try to recruit people who are reasonably similar to the real people who will be using your application.

Just like when you conducted the heuristic evaluations, each team member will have their own role during testing. One person will be the facilitator; one will be in charge of taking notes/photos/video/etc. Any team member not currently in one of those two roles should observe and take notes. Unlike in the heuristic evaluations, your users will not write down the problems they find for you: it's your job to learn what the people testing your prototype are thinking, and the feedback they provide will be invaluable for your next iteration. Your goal is to find ways to improve your interface. Look for breakdowns and pain points in your interface, and try to understand what the problems are and how you might fix them.

When possible, modify/update your prototype before the session with your next participant.

Step 2: Develop a protocol

It is important that you keep the task consistent between participants: use the same script, follow the same protocol, and answer questions in the same way. This is a fast study, not a formal one; still, any study requires some planning. Prepare an outline of how the user test will be run, written instructions that you will read to the users, and any other materials (e.g. questionnaires, interview questions) that will be used during the session. Together, these form your experimental protocol: a step-by-step guide to running the experiment. It should be detailed enough that someone else could take the protocol and run your experiment without you being there, and it ensures that you carry out the experiment in the same way for each participant.

Your experimental protocol should cover:
  1. Preparation and setting up
  2. Gaining informed consent (Even if your participants are close friends, you will need to create a consent form. You can use this consent form as a starting point if you find it helpful.)
  3. Executing the test, who does what
  4. How your observations will be recorded
  5. Debriefing the participant and a team debrief

Submit your experimental protocol and signed consent form for each participant. Immediately after each test, do a quick debrief with your team and write down any reactions or thoughts that came up. You will most likely forget them, so it's important to write them down right after the test.

Submit a photo or sketch of each participant testing your prototype. As with the needfinding assignment, these photos and captions should show breakdowns and design opportunities. How you compose your photo will depend on the app/task. If the key thing is that it happens in a particular location (say in a grocery store), make sure to show the context. If the key issue is what's on screen, take an over-the-shoulder shot of the person interacting.

Step 3: Analyze your results

After testing, take some time with your team to reflect on your findings. Go through all the notes and other recordings. Try to be objective; don't write problems off. Discuss as a team and try to identify general patterns in people's behavior. When you find interesting points, dig into them: ask each other questions, re-enact the different tests, and analyze the decisions people made and the other paths they could have taken. Let your insights guide redesigns for your next prototype iteration. The end product of your discussion should be a list of changes that you will implement; make this list detailed and understandable to people outside your team. Of all the bugs you have identified, fix the ones that are either small and easy to fix or too severe to ignore. Make sure you do this before moving on to the next step of this assignment.

Step 4: Create an alternative design

Select ONE component of your prototype and come up with a redesign for that part. The redesigned component needs to be significantly different from the original design and likely to satisfy a real user need that was either not satisfied, or satisfied differently, in the original design. This is NOT meant to be a huge redesign of your prototype: we want you to select something small and manageable in scope (similar to the alternative website designs shown in lecture). Designers often go through many, many versions of a prototype before their final design, so we want you to get familiar with that process (given our time constraints for the course).

Mock up a few screens of your redesign that you will be able to use with a user, and keep it simple. Electronically create your redesign (build the interactive screens from your paper prototype) and submit a URL to it. Remember to keep this URL separate from your original application's; you will need both for testing. As you think about how to revise your design, it may help to sketch it on paper before you implement it in code.

Step 5: Plan to test that alternative online

Next week, you will use the Web to gather usage data about your application by launching an A/B test online. Refer to the lecture notes on controlled web experiments (e.g. A/B testing, control groups, etc.). Think about what question you are asking, and how to design a study that answers it. Try to devise A/B tests that will give you information this week's user testing didn't (or couldn't). When you launch your A/B test online next week, the alternative redesign you implemented this week will be the 'B' of your A/B test. You will need access to both versions of this component to run your online tests, so don't write over your old code!
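To make "measure which alternative is better" concrete, here is a minimal sketch of what next week's analysis could look like. All user IDs and counts below are hypothetical, and the scipy library is assumed to be available; it shows one common approach — deterministically bucketing visitors into the two versions, then comparing click counts with a chi-squared test.

```python
import hashlib

from scipy.stats import chi2_contingency


def assign_version(user_id: str) -> str:
    """Deterministically bucket a visitor into version 'A' or 'B'.

    Hashing the user ID keeps each visitor in the same bucket across visits.
    (Hypothetical scheme; any stable 50/50 split works.)
    """
    digest = hashlib.sha256(user_id.encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"


# Hypothetical results after running the test for a while:
clicks_a, visitors_a = 34, 120
clicks_b, visitors_b = 58, 130

# 2x2 contingency table: rows are versions, columns are clicked / did not click.
table = [
    [clicks_a, visitors_a - clicks_a],
    [clicks_b, visitors_b - clicks_b],
]

chi2, p_value, dof, _expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 would suggest the difference in click rates is
# unlikely to be due to chance alone.
```

Whether click rate is the right thing to count is exactly what you must justify in your test description; the statistics only tell you whether an observed difference in your chosen measure is likely to be real.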

In Studio

In studio, you will present your ideas informally to the other teams: What were some major findings? What changes did they translate to? What are you going to do moving forward? Then you will work with your studio leader to prepare for A/B testing!

Student Examples

Here are three student examples from last year. Keep in mind that last year, students had to submit paper prototypes of their redesign in addition to the implemented version.

  • Example 1 - This is an example of an A+ level assignment. This group obviously put a lot of thought into their in-person test, and was able to motivate their redesign from the conclusions they drew from the in-person test.
  • Example 2 - This is an example of a B level assignment. This group lost points for not including their consent form for the in-person test. We also wished the feedback was more substantive beyond obvious usability bugs (one of which had been mentioned by the TA in a previous assignment). For the online test description, we were not convinced that measuring click rates was the right metric to measure success.
  • Example 3 - This is an example of an A level assignment. We liked the clean and well captioned photos for each participant testing their app. They also tested more than the required two users.

What’s this for? A UX agency perspective

By Mike Davison, Community TA and UX Project Manager

Testing your high fidelity prototype with users closes the circle. It is vital to ensure your solution meets the needs identified during the first assignment, and that the agency has not simply spent months drifting further from the problem.

It also gives you third-party reflection and suggestions for tweaks to the design. Everything we learn and correct here is a problem we won't have to live with later, once it has been coded and is too costly to change. It's a valuable phase of the process.

Remember this - feedback is not criticism, and feedback is not personal. User-centred design works best when pride is set aside and the feedback of others is incorporated into your design thinking!

Common "I like" Feedback

The following statements are common feedback given on this assignment. We call them 'I like' feedback because they express positive aspects of a submission. Think of them as elements to aim for.

  • Redesigned an interesting and useful part of their app
  • Was thoughtful about the design breakdowns they found
  • Ran well thought-out user tests
  • Tested tasks important to users of the app
  • Created a thorough set of testing materials (plan, script, etc.)
  • Took interesting photographs
  • Had prototypes that were well-designed and bug-free


What to submit

  • The URL of the original prototype you tested.
  • Your experimental protocol and signed consent forms, as well as any materials you gave to the user as part of your tests (either as text, PDF or a scanned image). Examples may include scripts you read aloud to the user, questionnaires, surveys or other materials you wrote. (Experimental Protocol & Documentation)
  • Captioned photos for each participant testing your prototype. (Photo Documentation)
  • A list of changes you will implement in your next iteration. For each change, include a brief explanation for why you selected it. Describe it with enough detail that someone outside of your team can understand it. (Planned Changes Based on Test)
  • URL of the implemented alternative redesign of one interface element. (Alternative Design)
  • Description of online test. In 2 to 3 sentences, describe the online test you will run for the next assignment. How will you measure which alternative is better? Remember, what people do is different than what they say. (Description of Planned Online Test)
  • Last week's and this week's PDFs of your development plan. We recommend submitting a PDF of your Google Spreadsheet; this gives you a snapshot for comparison. (Updated Development Plan)
Submit your formatted PDF here.

Evaluation criteria & Grading rubric

Levels, from lowest to highest: Nope, Weak, Proficiency, Mastery.

Experimental Protocol & Documentation (2 pts)
  • Protocol, consent forms, or documentation missing/sketchy, OR participant names not anonymized in the report.
  • Complete but poorly conceived protocol. Consent forms included. Scant documentation.
  • Well-designed protocol. Consent forms included. Full documentation provided.

Photo Documentation (2 pts)
  • Photo(s)/sketch(es) of fewer than two participants.
  • Photo(s)/sketch(es) of two or more participants, but captions missing or vague.
  • Photo(s)/sketch(es) of two or more participants; the photos show, and the captions thoroughly describe, clear breakdowns.

Planned Changes (2 pts)
  • No changes listed, or irrelevant changes.
  • Several possible changes derived from the user-testing data, although not all of the changes were useful or some important changes were overlooked.
  • Several possible changes based on the user testing, all of which were important and directly addressed the problems identified in user testing.

Alternative Design (3 pts)
  • No redesign, or an irrelevant redesign. No URL to an interactive prototype.
  • The redesign doesn't address an issue found in testing and seems unlikely to satisfy a real user need. The prototype is incomplete and barely interactive.
  • The redesign addresses an issue found in testing but seems unlikely to satisfy a real user need. The prototype is somewhat interactive, but not ready for user testing.
  • The redesign addresses an issue found in testing to satisfy a real user need. The alternative prototype is fully interactive and ready for user testing.

Online Test Plan (2 pts)
  • No description submitted, or the description omits quantifiable measures.
  • Test will quantitatively measure user behavior, but the task is not motivated by real user needs, or the measure cannot be compared with a chi-squared test (or an alternative that the submission proposes).
  • The task is motivated by a clear user need, and the measure can be compared with a chi-squared test (or an alternative that the submission proposes).

Updated Development Plan (2 pts)
  • No updates, or only minor changes to the plan.
  • Plan is mostly updated, but lacking some detail, or deadlines seem unreasonable.
  • Plan is detailed and reflects progress, new tasks, and any changes to previous tasks.

Outside the Box (1 pt; awarded to at most 5% of submissions)
  • Not just a small tweak or bit of polish: the alternative redesign introduces a creative, conceptually different, and likely much-improved user experience.