
Here you are, running another Customer Validation (CV) survey. Your survey is packed with questions designed to yield meaningful product insights, such as “What issues are you encountering?” and “What do you like most about the product?” You’ve scanned for signs of psychological bias. You’ve recruited targeted survey-takers, participation is high, and feedback is rolling in. You’re feeling pretty good.
Fast forward a few days to the end of the survey. Now, you’re slogging through mountains of data, reading response after response. Delivering definitive results requires hours of follow-up with dozens of testers. You did everything right to manage your survey efficiently, so why is analyzing the data taking so much time?
Traveling across the country to coach teams on enhancing their CV processes, Centercode Product Manager Chris Rader sees a lot of surveys. The challenge he spots most often isn’t poor survey management.
“It’s not that these teams are managing their surveys inefficiently,” says Chris. “It’s that it’s inefficient to use surveys as your primary mode of feedback.”
That’s right. The biggest reason for survey management time sinks is using surveys in the first place.
How Surveys Cause Friction During CV
Chris elaborates: “A lot of the friction CV professionals encounter when collecting tester feedback comes from using surveys as a feedback catch-all. Technically, you can use a survey to collect issues, ideas, and praise from testers — the same way you could save a phone number in a word processor. But when you actually go to call someone, you need to transfer that information from one tool to another.”
He explains that the same thing happens with surveys. “You can ask open-ended questions like, ‘What issues did you encounter?’ but you’ll spend a lot of time reviewing feedback that your testers could have entered directly into your feedback management system,” says Chris. “It doesn’t matter how well-crafted your survey is at that point; feedback collection just won’t be as efficient as it could be.”
Surveys vs. Feedback Forms
Relying predominantly on surveys can also damage the accuracy of your data because your testers aren’t reporting issues in real-time. Say your testers respond to their weekly survey, then run into a bug two days later. They may not report that issue until the next survey — five days after the initial encounter.
“Surveys offer a snapshot of a single point in time,” says Chris. “If your testers are only reporting once a week, you risk losing the details, impressions, and sequence of events that shape what happened. It’s the details that come out in real-time reporting that give you an accurate portrait of the user experience — and make CV so valuable.”
Instead, Chris recommends training your testers to use feedback forms. Ongoing collection methods like these forms let you capture and respond to feedback as it flows in, so your team can get the jump on product fixes. Efficiency doubles if you’re using a feature like smart issue prioritization to surface your most impactful feedback. Either way, the immediacy of ongoing feedback channels turns tester follow-up into a dynamic conversation.
Using dedicated issues, ideas, and praise forms also has the time-saving effect of keeping your feedback flowing through a single channel. This prevents your CV project manager from spending time scrubbing duplicates, triaging mountains of feedback, and manually sorting all of your data.
So, When Should You Use a Survey?
“Surveys work best when you want to collect a large sample of data measuring your customers’ attitudes,” says Sabrina Solis, one of Centercode’s researchers. “They’re a perfect complement to other CV feedback channels, like issues, ideas, and praise, because the data provided by surveys puts that feedback in context.”

On the flip side, Sabrina warns against using surveys to collect usability information since they don’t produce reliable behavioral data. “People are typically unable to self-report their own behavior with accuracy,” she says. “Your product analytics tell a more conclusive story here.”
Similarly, survey responses don’t provide enough context to answer probing “why” questions. But they can point you in the right direction. “Surveys reveal opportunities to gather in-depth, qualitative insights. But you don’t necessarily need or want those insights from every single one of your testers — they take time to process. One of the many cool things about survey data is that the trends in customer attitudes it reveals can be a jumping-off point for deeper, more targeted product exploration.”
Before Building Your Survey, Ask Yourself…
Both Chris and Sabrina offer the same advice: “First, ask yourself, ‘What am I trying to learn?’ Then identify the most effective way to learn that information.”
It’s common for your stakeholders to see surveys as the most effective way to answer all of the product questions your company has. But getting the depth of information you need to fully achieve your CV objectives — and not spending all your time triaging your feedback — means refining your processes so that they work for you, and not the other way around.
Learn the most effective strategies for collecting Customer Validation feedback with the best practices in The Feedback Playbook.


