What are the top concerns with running user tests? I often hear budget and schedule cited at the top of the list—even by teams who know just how valuable a usability test is! I’ve come to the conclusion that these concerns are often unfounded, for one simple reason: user testing can take so many forms, including many that are cheap and quick.
I’ve previously written about how to help the C-level and organization buy into testing real product users, and how to prepare for user testing. Now it’s time for step three: let’s look at actual, affordable usability tests to try.
All of the tests below have both free and paid options. Which ones your team tries will depend on your company’s budget, as well as the type of project being tested. We offer these variations because, no matter your means, we believe any user testing is better than none.
We’ll review six types of user tests—each uses direct interaction with real users to learn how they behave, and all are things you might never have considered “usability” tests:
- On-site surveys
- The Starbucks method
- Heat maps
- Video reviews
- Follow-up surveys
- One-click feedback
On-Site Surveys

On-site surveys intercept users while they are on a website, and can be triggered specifically when a user interacts with a certain page. If your organization wants to use on-site surveys, be sure to consider how they may affect users: will the survey annoy them? Confuse them? Some survey placements obscure or interrupt a user’s view of the content they came to see, and full-screen overlays often (though not always) increase bounce rate.
Be sure to watch analytics data carefully to make sure users aren’t leaving the site when they encounter the survey. One way to avoid high bounce rates on a survey page is to employ time-delayed surveys — these are programmed to appear after a few seconds, allowing the user to make sense of the page before interrupting their flow. However, time-delayed surveys can also be irritating (our hand-coded survey got more answers complaining about the survey than answers to the questions we were asking), so keep them short and sweet.
Another element that can increase bounce rates is a survey that drags on. To prevent that, use the survey’s pre-defined goals to identify one answer you think you can get from that page, and design a survey that requires as few keystrokes, seconds, and brain cells from the user as possible.
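The time-delayed trigger described above needs only a few lines of code. This is a minimal, illustrative sketch, not code from any survey tool; the `SurveyConfig` shape and both function names are hypothetical:

```typescript
// Hypothetical config for a short, time-delayed on-page survey.
interface SurveyConfig {
  delayMs: number;      // how long to let the user make sense of the page
  maxQuestions: number; // keep it short: aim for one answer per page
}

// Decide whether the survey should appear: only after the delay has
// elapsed, and never if the user already dismissed it this session.
function shouldShowSurvey(
  elapsedMs: number,
  dismissed: boolean,
  config: SurveyConfig
): boolean {
  return !dismissed && elapsedMs >= config.delayMs;
}

// Schedule the survey to appear a few seconds after page load.
function scheduleSurvey(
  showSurvey: () => void,
  config: SurveyConfig
): ReturnType<typeof setTimeout> {
  return setTimeout(showSurvey, config.delayMs);
}
```

Keeping the decision logic in a pure function like `shouldShowSurvey` makes the timing rules easy to test without a browser, and easy to tweak when analytics show the delay is annoying users.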
At Clearlink, we’ve used a few different tools: Qualaroo is a highly customizable tool but may stretch some budgets; Google’s paid user tools come in at a reasonable cost-per-user but can be limited in customizations; and we’ve hand-coded our own on-page surveys, but found they take a lot of time and finesse before they work well.
The Starbucks Method
Sometimes, it’s best to get out and talk to some real users. What better place to do this than at a coffee shop? You can observe a variety of people, and engage with those who most closely align with your customer profiles.
Before you go: write out one task per customer profile, finding tasks that might take some reading or a couple clicks to complete. (Pro-tip: Have someone unrelated to the project read the tasks and tell you if they make sense to the average person. Or run them through the Hemingway App for tips on making them simpler and clearer.)
With your customer profiles and tasks in hand, take a computer and $100 to Starbucks or the local coffee shop. Identify patrons who fit into one of your customer profiles (observe their vehicle, clothing, device type, age, etc.) and politely ask if you can buy their coffee for a few minutes of their time. Some testers worry that you won’t be catching users in the right mindset, but the benefits outweigh the concerns: you’ll get a wider variety of users and demographics at a coffee shop, and you’ll be able to relate users to your customer profiles more readily.
Offer them the single task that coordinates with their customer profile (they don’t need to know this part), and observe them as they navigate the site. Take notes. Mark any place they hesitate before clicking, note when they click and immediately click back to the previous screen, write down any questions they ask you for clarity, and ask them what’s troubling them when they pause too long.
Bonus: There’s an alternative to the Starbucks method: the neighborhood bar method. Websites, apps, and your bagel maker should all be easy enough to use that a drunk person can figure them out. Take your app or mobile website down to the local bar, buy pints for a couple different folks in your customer profiles, and have them try a task. You’ll learn right away where the complicated functionalities are, where the hard-to-read text is, and where the too-small-for-human-fingers buttons are.
Heat Maps

The one downside to the Starbucks method is that users are out of context: they weren’t planning to use the application today. Heat maps, on the other hand, are fully contextual. Heat mapping is tracking software that records your users’ activities on specific pages. The team can see a cumulative overview of where users click, how far they scroll, and even user activity by device. With heat maps, you can often provide the “why” to the “what” you see in Google Analytics.
When you pair heat maps with Google Analytics traffic and bounce rate data, you can get a good idea of why some pages work better than others. For example, if we observe a landing page with three calls-to-action, we may discover that while it has good traffic, there is a high bounce rate and users are clicking all over the place. If we strip down the landing page to a single CTA, and it maintains the same traffic, but reduces the bounce rates while increasing user success – we’ve learned that the simpler design is more effective.
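The comparison above comes down to a little arithmetic on numbers you can export from your analytics and heat mapping tools. A hedged sketch, assuming you have session, bounce, and CTA-click counts per variant (the `PageStats` shape and function names are illustrative, not from any analytics API):

```typescript
// Illustrative per-variant metrics pulled from analytics exports.
interface PageStats {
  sessions: number;  // total visits (traffic)
  bounces: number;   // single-page visits
  ctaClicks: number; // clicks on the call(s)-to-action
}

// Bounce rate: share of visits that left without interacting further.
function bounceRate(s: PageStats): number {
  return s.bounces / s.sessions;
}

// CTA click rate: a simple proxy for "user success" on the page.
function ctaClickRate(s: PageStats): number {
  return s.ctaClicks / s.sessions;
}

// The simpler design "wins" when it lowers bounce rate while
// raising CTA clicks, as in the single-CTA example above.
function simplerDesignWins(threeCtas: PageStats, oneCta: PageStats): boolean {
  return (
    bounceRate(oneCta) < bounceRate(threeCtas) &&
    ctaClickRate(oneCta) > ctaClickRate(threeCtas)
  );
}
```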
There are some very affordable heat mapping tools available that will provide good insight without breaking the bank. I’ve used Crazy Egg and Lucky Orange; both are fairly intuitive. Several free versions or free trials exist, like Ptengine and Hotjar.
Video Reviews
Another way to view users in context and to add a layer of humanity back into user testing is the video review. Users record themselves walking through a website and speak aloud their thoughts as they try to achieve a task.
The beauty of these services is hearing users speak their thoughts aloud: they can explain why they did or did not click a button, which part of the copy doesn’t make sense to them, or the question they had that they couldn’t find an answer to. Although research suggests our brains aren’t wired to accurately say aloud what we prefer or really think, I’ve found that I still get a few good high-level insights from this method. Often, I never would have made those connections or realizations without the video review.
UserTesting.com is the well-known video research service, but their pricing tiers can be staggering for small companies. They do offer Peek.UserTesting.com, a briefer free version of the full service. Other services available at reasonable costs include TryMyUI and UserBrain.
Follow-Up Surveys
One opportunity we often fail to take advantage of is the satisfied (or dissatisfied) customer. Do you have a checkout process in which you collect emails from users who purchase your product? Then you’re sitting on a gold mine, my friend.
First, create a simple survey, using Greg Ciotti’s ten really great tips on how to build a thoughtful survey. Once you have a survey that doesn’t lead or shape the answers, send it to anyone who has made a purchase in the last 30 days (Amazon does this really well). You’ll be surprised how well a thoughtful survey is received—you can expect great insights back on how your checkout process works, how your shopping experience feels, and what the customer wasn’t satisfied with.
This works anywhere you collect customer contact information. Ideally, send your customers an email or text message (be sure to incorporate an SMS messaging disclaimer during the checkout process) with the survey link. While you may also have access to their phone numbers, I’d be careful about calling people to ask them to take a survey. There are plenty of free services to customize and format these surveys, such as Survey Monkey, Qualtrics, and PollEverywhere.com.
One-Click User Feedback
In medias res means “in the middle of the action.” A one-click user survey asks for users’ opinions while they’re performing a task or digesting information. This kind of user testing takes a little upfront work to code and track, but can be incredibly valuable once it’s set up. I’m sure everyone is familiar with this feature on Amazon product reviews:
Example of Amazon’s “Was this review helpful?”
This is one-click user feedback, and there’s no reason it should be limited to reviews or troubleshooting FAQs. For instance, imagine a site where the newsletter sign-up box is tucked away in the upper right-hand corner. Every time an email is submitted to the newsletter list, a “thank you” message pops up along with a checkbox that asks, “Did you have trouble finding the newsletter sign-up on our site?”
Or consider a different phrasing: “Thanks for signing up! Help us improve: did you have trouble finding this box?” It never hurts to explain why you’re asking. This kind of supporting information rewards the user and helps them feel like they did a good deed for the day. And we all enjoy feeling useful.
Keeping questions to a single click reduces the user’s cognitive load, making them more inclined to participate. And if you catch them immediately after they’ve chosen to act, they may feel like it’s the final step of the process and do it willingly, without even realizing they’re helping you out.
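The upfront coding for one-click feedback really is modest: the click only needs to produce a tiny event, and you tally the events per question. A minimal sketch, assuming hypothetical names throughout (`FeedbackEvent`, `recordVote`, and the `widgetId` field are illustrative, not from a real API):

```typescript
// One click produces one small event tied to the question it answers.
interface FeedbackEvent {
  widgetId: string; // which one-click question was answered
  helpful: boolean; // the single-click yes/no answer
  timestamp: number;
}

// Tally votes per widget so the team can see, e.g., how many users
// had trouble finding the newsletter sign-up box.
function recordVote(
  tally: Map<string, { yes: number; no: number }>,
  event: FeedbackEvent
): void {
  const counts = tally.get(event.widgetId) ?? { yes: 0, no: 0 };
  if (event.helpful) {
    counts.yes += 1;
  } else {
    counts.no += 1;
  }
  tally.set(event.widgetId, counts);
}
```

In a real deployment the click handler would POST the event to your analytics endpoint; the tallying shown here is the server-side half of that work.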
Bonus Tactic: Blank Slate Designing
Unlike the other, more traditional user tests we’ve discussed, blank slate designing has the user coming up with the questions. For businesses with direct access to their clients, this user testing method is incredibly valuable. Instead of asking users what’s wrong with your product or offering’s current design, have them design a system with you. Bring users into your office or a local shared meeting space and compensate them for their time (this can be as simple as free coffee and bagels, a $20 gift card to a local sandwich shop, or a catered lunch).
Ask your users questions about your products or offerings, what they’d love to see added and why. Give them the tools to draw or model for you the features they wish you offered. In the case of tangible products, maybe even offer them scissors, tape, glue, needle and thread, and let them modify your product to meet their needs. Ask why a lot: Why did you put this here instead of there? Why do you like this color? Why would you get rid of this feature? To improve their concentration and your engagement, record the whole session on audio or video so your hands are free to jump in and interact with your users’ ideas.
Testing for all
User testing doesn’t have to be a highly sophisticated machine run by a CRO team. It’s as flexible as your annual budget and as creative as you can stretch yourself to be. Usability testing can be as simple as reading the complaint email Myrna from Upstate New York sent regarding your patent-pending bagel maker, and caring enough to take her suggestions seriously. It just takes a little empathy, and the recognition that you are not your user. So take their advice, whether it’s explicit or deduced, and do what you do better!