CS 477: Project #0
User Testing for Beginners: Yeah, it's software...but does it work?
Points: 100 pts
Rules: Teams of three
Post Date: 2/2/2016
Due Date: various, see deliverables below
Overview: In the typical user interface course, early focus is on design and theoretical background, and usability testing gets pushed back to the "practical techniques" piece later on. Theory (of design, of cognition, of perception, of performance, of psychology) is certainly important and will greatly enhance your ability to quickly identify good and bad interfaces, and to find and solve problems while testing interfaces. But ultimately, just getting your hands dirty and doing some testing is a great way to (a) get a basic feel for what it's about and see the potential in doing it and (b) give you a mental framework for understanding what all this theory we'll learn is good for. So, sure, we don't really know what we're doing yet...but we'll just jump right in and give it a try! Specific learning goals include:
- Learn how a usability lab works, including basic equipment setup and operation
- Learn how to create an effective "lab manual" for testing a software product
- Learn about recruiting and managing your test subjects
- Learn how to present test results efficiently in written and oral form
Assignment: The goal of this assignment is straightforward: learn the basics of applied usability testing by simply running some tests on an existing application. To start, get together within your team and pick an application to test. Here are the criteria for choosing what to analyze:
- May be any software application: a desktop app, a web app, or a mobile app. Obviously it must be software that lets you accomplish something, some task, i.e., not just a static web page. So it has to have some purpose and interactivity.
- Should not be a "mainstream" application, e.g., Facebook, Ebay, Amazon, Microsoft Office products, etc. The point is to be able to find "naive users" to test with...meaning users that don't already know and have mastered the interface. So you need a piece of software that your target users don't know...or at least don't know well.
- I recommend that you choose something halfway simple, i.e., not packed with a gazillion complex functions. You want software that has maybe five key functionalities/tasks that you can test...without a million other functions cluttering the interface and confusing things. What you're shooting for is something with roughly the size/scope/complexity of a typical smartphone app (whether it's actually a smartphone app or on some other platform). So Photoshop would be a poor choice, while an app for doing simple sketching might be good.
Within the spectrum of usability testing, you will be doing what is formally known as an "Assessment Test". We'll define this in more detail later: pros/cons, placement within the broad spectrum of testing approaches, etc. For now, you just need to know that:
- You will recruit three pairs of participants for your testing. Here are some guidelines for effective selection of participants.
- You will bring pairs of participants into the lab to test drive your targeted application. These tests will run 10-15 minutes.
- You will have recording equipment set up to record the interaction. See the overview of how to use our UI lab for details. We'll also do a walk-through next week in class.
- You will give the participants a "lab manual" to work through; this drives the testing process by giving them tasks to accomplish. Here are some guidelines for building a good lab manual to help you out.
- You will observe remotely and take notes during the test. Later, at home, you'll review the "breakdowns" you noted, replaying the video to figure out exactly why participants had trouble. I've given you some good tips on the analysis process in the description of the testing process.
- You will write up your results in a brief "usability report". Here is an outline of what a usability report looks like. And you will present these results to the class for discussion and review.
I expect you to run at least three usability tests; this means three pairs of participants tested and analyzed. Here is a more detailed step-by-step walk-through of the testing process to help guide your efforts.
Deliverables:
- ASAP: Group lead sends me an email with "CS477: Project0 team info" in the subject line, containing: group name, members, chosen app to review, and a brief one-paragraph description of this app: what it does and why you chose it. Your team name, from now on, is the name of the app that you are reviewing, e.g., "Team Mac Calendar"; put this on all coversheets along with members!
- Fri. Feb 12: Submit hardcopy of your "lab manual", prefaced by an intro page where you briefly introduce the app you are testing, identify and justify what you see as the key uses/functions of this app, and then overview the sequence of tasks that your lab manual aims to test. So your lab manual is the physical "here's what we'll make them do", and the cover page is your explanation and rationale for why you designed it that way (just as detailed in the guidelines linked above).
- Wed Feb 17: A list of test participants you've recruited, along with anticipated testing dates/times. This should be prefaced by a page on which you briefly describe and justify your recruiting process. Start with the little overview of your app again (recycle from the lab manual deliverable). Then talk about your strategy/priorities for recruiting participants...with respect to your app's target population. This gives an idea of the match (or mismatch) between your targeted group and real end-users. Then follow this with an overview of the characteristics of the participants you actually recruited, i.e., a table listing things like: major, age, previous computing experience, previous experience with apps similar to the target app, and anything else relevant to your particular app.
- Wed Feb 24: Presentation of results, plus turn in the Usability Report (hardcopy, professional). See the report guidelines to get an idea of the overall structure/tone/aim, then adapt to what you need to report for your particular app.