Posted: December 11, 2014
A good user testing engagement will challenge both your site’s functionality and, just as important, your own assumptions about how users browse, what they read (and don’t), where they click and what ticks them off.
Whether you’re planning a redesign or not, occasional user testing on your current site can inform your design choices, editorial strategy, lead generation and nurturing efforts, even your brand strategy, not to mention how you manage your web teams and budgets.
Despite those happy returns, most schools rarely conduct user testing as part of their iterative strategic process. Several misconceptions are to blame:
- User testing is only for those who are planning or are in the midst of a site redesign.
Sure, user testing is a crucial step throughout a redesign process. But waiting until your site needs redesigning to perform user testing is like waiting until the night before your dentist appointment to brush your teeth. Over time, things just decay.
- User testing requires deep web usability expertise and lots of expensive, complicated technology.
Eye-tracking software. Click-by-click heat mapping. Quantitative analysis and complex algorithms. Usability experts sometimes make the process sound like something that would give Dr. Emmett Brown a panic attack. There are much simpler ways to conduct user testing that produce actionable insights for your website.
- User testing sounds great, but we’re already overstretched and under-resourced. We just don’t have the time or money to do it right.
Actually, if you have two staffers, a handful of test users, a computer and some coffee, then you have everything you need to conduct a fun, productive user testing session.
To help dispel these misconceptions, we’ve compiled a DIY user testing toolkit, including a step-by-step sample session plan, giving you everything you need to conduct successful user testing regardless of your experience level or budget.
A Note on Methodology
Before diving into the kit itself, we should note that there are dozens of methodologies and approaches to user testing, each with its own required toolkit and process. If you’re new to user testing or working with a limited budget, we recommend a qualitative approach that combines user interviews with task analysis.
In essence, this approach involves giving test users a set of tasks to complete on the website, observing and documenting patterns or trends regarding both their behaviors and attitudes. You can observe and document things like:
- Where do they click first when asked to perform a certain task?
- Do they search or use the main navigation?
- Do they read pages or scan for links and headlines?
- Where are their specific pain points?
- Are they relaxed or frustrated while performing the task?
- Do they seem confident or doubtful of their ability to find information?
- Do their attitudes change after clicking around for a while?
We recommend interviewing the test users as they go along, encouraging them to self-narrate and asking them specific questions about their tastes, preferences, likes and dislikes, assumptions, reasons for clicking or not clicking, and anything else that strikes you as notable.
A Website User Testing Toolkit
The following essentials are all you need for a productive in-house user testing engagement:
A goal-driven strategy
Your strategy - including who you choose as test subjects and what tasks you ask them to perform - should be tailored to helping you achieve your primary goals. What are you hoping to discover? Are you primarily interested in testing a certain site feature like site navigation, design or content? Are you interested in evaluating the entire site or a specific section? Are you hoping to learn about a specific audience’s user patterns and attitudes? Answering these questions will help you set specific goals that should inform the ways you utilize the rest of the tools in this kit.
Useful tip: If you’re only conducting a single day of user testing, we recommend focusing your sessions on a particular section of the website or type of task. By narrowing your focus, you’ll give yourself an opportunity to glean deeper, more precise takeaways.
A partner
You can perform successful user testing alone (especially if you record your sessions as we discuss below), but we strongly recommend dividing responsibilities with a partner. While one of you focuses on the relational work of guiding users through their sessions, the other is free to take notes and manage the timer. Plus, when it comes to reflecting on the experience and documenting themes and trends, two heads are much better than one.
Useful tip: Choose a partner who you’re confident can lead sessions as well as take notes. That way the two of you can rotate responsibilities, keeping your eyes and ears fresh throughout a long day.
Test users
Choose test users who represent the user groups or target audiences you’re most interested in studying or engaging. Often these include prospective students, prospective parents, current students, current faculty and staff, prospective faculty and staff, alumni and prospective donors. While there is no perfect number of test users, the more users you can test, the more accurately your data will reflect their respective user groups. Assuming 10-15 minute sessions, the average two-person team can test 18-20 subjects in a full day.
Useful tip: If you have the budget, little incentives help nudge people to accept your offer to participate, especially those external users like prospective students, parents and alumni. A $5 gift card, a coffee mug from your campus shop, or a free portable phone charger are all good ideas. Free incentives like extra course credit are great motivators for students, too.
Session tasks
Before your user testing begins, make sure to have a set of ready-made tasks or questions prepared to drive your sessions. Depending on your goals, we recommend using one of the following two approaches to choosing session tasks (or a combination of the two):
Testing specific site elements: First, you may be hoping to test a very specific aspect of site functionality, like how easy it is to sign up for a campus visit. If that’s the case, build tasks that challenge users to find different ways to perform that action (test different routes to the sign-up page, how easy it is to fill out the form, whether users are seeing that new promo you’ve just launched, whether they can reach out to a real counselor during the sign-up process, etc.).
Testing whether the site facilitates users’ goals: You can also assign tasks to determine whether the site makes it easy for users to achieve their goals. Start by documenting what you assume to be a particular audience’s primary user goals. Then build some sample tasks related to those goals.
For example, you may assume prospective online students’ primary goals are: (1) to discover whether a certain program or offering is available online; (2) to determine whether online program tuition costs are comparable to seated program tuition; and (3) to learn a bit about the online students’ experience, like how easy the interface is to use. You can then assign tasks that ask test users to find information related to those goals.
Useful tip: With the second approach - testing assumed user goals - you may learn that what you THINK users want from your site actually isn’t important to them at all. In that case, give test users one or two of your pre-determined tasks, but also assign a task or two based on what their ACTUAL user goals are. Then document those critical insights about that audience.
A full day
With one full day, the average team of two people can test enough users to support a good qualitative analysis of your site’s performance. Sure, coordination and planning on the front end require attention, as does analyzing and publishing your findings. As with any worthy investment, though, the results of a single day of user testing (insights about how real people actually use the site you’re building and managing) will create long-term efficiencies that far outweigh the time you put in.
Useful tip: To streamline the post-test analysis process, we recommend spreading your user testing out over two days instead, building 15-minute windows in between sessions to quickly document global themes and takeaways. Jotting down those themes while they’re fresh in your mind will make the post-testing analysis much more efficient.
A test device
You don’t need a high-tech laboratory or even a special computer to perform user testing. Any normal desktop computer, laptop, tablet or phone will do. We recommend starting each session by making sure users are comfortable with your test device. If not, you might give them a quick tutorial or have a back-up PC available just in case. Don’t worry if users aren’t as computer literate as you might expect. Testing users who are uncomfortable with certain devices can teach you a lot about whether your site makes life easy on ALL users, no matter how tech savvy.
Useful tip: It’s a good idea to have a wireless mouse present if you’re using a laptop. There are still varying levels of familiarity with the way different track pads operate, so offering users a point-and-click mouse option eliminates unnecessary confusion and helps them focus on the tasks you assign.
Recording software (optional)
Recording software is not an "essential" item in your toolkit. The important thing is that you take good notes. Still, recording your sessions is an excellent way to permanently document your body of work, especially if you’re hoping to persuade senior leaders to make changes to your website strategy. After all, a picture - and better yet, a video - is worth a thousand words, and showing your bosses a short video compilation of 20 test users griping about the same problem can create a powerful emotional effect.
There are plenty of great, affordable recording software options available. Our tool of choice is ScreenFlow, a screencasting and video editing app that records a user’s face, voice and screen activity throughout the session, with a ripple effect that makes each click easy to see.
Useful tip: The video files that most screencasting software produces are very large. Make sure your hard drive has space to hold a day’s worth of testing sessions, or come prepared with an external hard drive for backup.
A timer
In order to get 18-20 sessions done in a single day, it is crucial to hold each session to your 15-minute limit. Keep a close eye on the timer throughout. Listen carefully for moments when the conversation stalls and speak up to keep the session moving along.
Useful tip: Make sure the timer is inconspicuous. No matter how strongly we stress that we’re testing the site (and not the user), the experience will always feel a bit like taking a test with a teacher standing over your shoulder. Turning over an hourglass timer or compulsively glancing at your watch only makes that sensation worse.
A (good-sized) room
The more comfortable your users feel, the better and more insightful their participation will be. Reserve a room that’s "neutral" territory instead of inviting test users to your office. We recommend reserving a medium-sized conference room with enough space to make the experience seem a little less claustrophobic (a cramped workstation surrounded by two note-taking observers can seem a bit stuffy). Also, the extra seating will give early birds a place to wait for their session without feeling like they’re encroaching.
If you have the budget, keeping the room stocked with hot coffee, ice-cold lemonade, a cookie tray or some crackers will not only help your test users relax and enjoy the experience, it’ll also give you and your partner some much-needed fuel throughout the busy day.
Useful tip: Be sure to schedule time for you and your partner to take a lunch break! The meal and downtime will help rejuvenate you and keep you sharp in the later sessions.
Sample Session Plan
A clear session plan helps ensure your user testing sessions are both productive and efficient. The plan isn’t necessarily a script. The best test facilitators internalize their goals so fully that they can navigate sessions without glancing at the plan. Still, a written plan holds you accountable to strategy and timeline, especially during those later sessions when fatigue and redundancy have taken their toll.
Here is a generic ten-step sample plan we at VisionPoint have used to great success in recent user test sessions:
Sample Session Plan (max 15 minutes)
Step 1: Meet and greet. Help the user feel comfortable and relaxed. Offer up any snacks or beverages.
Step 2: Explain the purpose and the process of user testing, expressing your gratitude for their participation. Ask them to be as vocal as possible during the session, narrating as they go along about what they’re seeing and thinking, why they chose to click something, what they’re expecting to find beneath certain buttons or links, etc. All those insights will be useful.
Step 3: Emphasize that "we’re not testing you, we’re testing the site." Tell them there is absolutely no way they can "fail" or mess this up. Make sure they know that if they struggle with a task, it indicates something about the site’s performance, not their performance.
Step 4: Start recording. (Note-taker should check to make sure this step is taken, as it’s easy to overlook when you’ve done 15 sessions in a row).
Step 5: Conduct a brief on-camera interview about typical user behavior. Ask the person to state their name and relationship to the institution, and to describe the ways they typically use the website. Ask them which devices they most commonly use.
Step 6: Ask about the person’s general impressions of the site. Do they have strong feelings about the site’s usability coming into the session? Do they not? What do they like and not like? (If they don’t have strong answers, have them glance at the homepage and ask about their initial impressions of certain elements like design, navigation nomenclature, content, etc.)
Step 7: Task 1 - Base Task 1 on something they mentioned related to their typical user behaviors. If, for example, they mentioned using the site to evaluate cost information, then have them try to find something like the "real" cost of a semester at the university.
Step 8: Task 2 - Base Task 2 on either something else they mentioned related to their typical use, a logical follow-up task to Task 1, or turn to your predetermined user tasks.
Step 9: Task 3 - Base Task 3 on a predetermined user task.
Step 10: Closing questions (ask as many as time allows):
- If there is one thing you hope never changes about the current site, what would that be?
- If we were to redesign this site (or section), what would "success" look like to you?
- Now that you’ve clicked around for a bit, are there any final thoughts you feel compelled to share about what you like, don’t like, or want to see change about the site?
Want to Learn More?
If you'd like to discuss user testing strategy or explore ways to deepen this toolkit, don't hesitate to reach out!