
Our Top 8 Survey Best Practices for Your Next Beta Test

January 15, 2016

Every experienced beta manager knows how useful a well-crafted survey can be to the success of a beta test. Poorly written surveys provide useless or misleading information and overly complex surveys burn out testers, while well-written surveys help you collect the high-quality feedback you're looking for. To help you maximize the effectiveness of your surveys, this post identifies eight survey best practices you can implement during your next beta test.

1. Keep surveys quick and focused

In most scenarios, testers are volunteering their time and energy, so respect that. Generally, 10 questions make for a good survey, 15 is long but acceptable, and 20 or more is only appropriate at the end of a beta test (since you won't be asking for much more afterward). If you plan to survey your testers more than once a week, keep each survey to around five questions. Before you start writing your survey, ask yourself "What do I want to know?" Focus on gathering the data you need to answer that question and avoid padding the survey with "nice to know" questions that just make it longer and more tedious.

2. Determine the target audience

Not every survey needs to go to every tester. Maybe you only want tech-savvy testers to answer your survey, or you need the opinions of testers who have successfully used a certain feature. Asking every tester about everything could cloud your data with irrelevant responses.

3. Remove bias and confusion

How you ask a question makes a big difference in how useful your data is. When writing your questions, make sure you aren’t including leading language (e.g. “How easy was the product to use?”) or asking multiple things in a single question (e.g. “Rate the intuitiveness of the hardware’s setup and use.”).

4. Keep questions short and words simple

The shorter your questions are, the easier they'll be for your testers to understand and answer. It will also be easier for you when you're creating graphs and reports. If a question runs longer than one line, you're probably trying to cover too much; consider rewording or splitting it.

5. Think about how you want to use the data

What question are you trying to answer? Do you need to be able to compare the responses to each other or to a baseline? Do you want to know which device testers primarily use to watch movies, or whether they use any of the devices listed? Small wording changes can make a big difference, so make sure the questions are collecting the data you really need in a way you can use.
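To see how those two wordings produce different data, here's a minimal Python sketch using made-up responses; the device names, answers, and counts are purely hypothetical:

```python
from collections import Counter

# "Which device do you primarily use to watch movies?" -> one answer per tester.
primary_device = ["TV", "TV", "Laptop", "TV", "Phone"]

# "Which of these devices do you use to watch movies?" -> several answers per tester.
devices_used = [
    ["TV", "Laptop"],
    ["TV"],
    ["Laptop", "Phone"],
    ["TV", "Phone"],
    ["Phone"],
]

# Single-select answers add up to 100% of testers, so you can say which device "wins".
print(Counter(primary_device))  # TV is the primary device for 3 of 5 testers

# Multi-select answers count mentions, so totals can exceed the number of testers.
print(Counter(device for answers in devices_used for device in answers))
# TV and Phone are each used by 3 of 5 testers, Laptop by 2
```

The first wording supports a "most popular device" comparison; the second only tells you which devices see any use at all.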

6. Use rating scales of 5 (not 10)

Rating scales from 1 to 10 are common, but there's no reason they need that many points. Rating scales with five points are much easier for both testers and your team. A 5-point rating scale allows room for strong feelings (1 and 5), milder good or bad feelings (2 and 4), as well as indifference (3). This makes selecting a choice more natural and obvious, while also making reporting easier and cleaner.
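As a quick illustration of how 5-point responses roll up into a report, here's a small sketch; the ratings shown are hypothetical:

```python
# Hypothetical answers to one 5-point question (1 = Strongly Disagree, 5 = Strongly Agree).
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

# Five buckets fit comfortably in a single table row or bar chart.
distribution = {point: ratings.count(point) for point in range(1, 6)}
print(distribution)  # {1: 0, 2: 1, 3: 2, 4: 4, 5: 3}

# A single average makes it easy to compare questions or survey rounds.
average = sum(ratings) / len(ratings)
print(round(average, 2))  # 3.9
```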

7. Label your rating scales appropriately

Rating scales are useful in nearly every survey. Unfortunately, many surveys leave the values unlabeled (just 1, 2, 3, 4, 5), which every tester can interpret differently. Labeling the first and last values (such as 1 = Strongly Disagree, 5 = Strongly Agree) gives testers a clearer picture of what the values are intended to represent. Also, make sure your labels are balanced and fit the question. A scale of Terrible to Okay isn't balanced, because the positive end isn't strong enough, and a scale of Poor to Excellent doesn't make sense if the question is "How likely are you to buy this product?"

8. Don’t pre-fill the answers

Don’t start your survey with ratings already selected (even if the pre-filled answer is the neutral choice or N/A). Testers will be more likely to leave the question with the pre-filled answer, which leads to inaccurate results.

By adopting these survey best practices, you're much more likely to collect the feedback you need to improve your product. Beyond the product feedback itself, efficient surveys make better use of your testers' energy, leaving them more time to submit other types of feedback. If you want to dig deeper into surveys and keep improving your survey-writing skills, check out The Feedback Playbook.

Download The Feedback Playbook Now!
