Why Surveys Alone Can Mislead: A Cautionary Tale from the Field
When the Data Doesn’t Tell the Whole Story
Not too long ago I was analysing some survey data. The survey was designed to help a team understand what a typical process looked like for a very specific B2B audience.
The data suggested that specific tools and resources were central to how that audience navigated the process. But when we probed this in follow-up user interviews, the reality looked quite different: those resources were indeed important, but only at specific points and for very specific purposes.
Had the team acted on the initial survey data alone, they would probably have ended up investing in something high effort but low reward.
The Problem with Predefined Options
What can be taken from this? First, there is a general issue with surveys, particularly with multiple-choice questions.
When we present a set of predefined options to a user, we have already narrowed their frame of reference and led them down a particular path before they answer.
Why Conversations Still Matter
Second, conversations, while messy, are critical for digging into any hunches or questions raised by the data you have already gathered.
Use Mixed Methods to See the Full Picture
If you don't include a range of methods in your research, chances are you're not getting the full picture, or at least as full a picture as a user research setting allows.