
When was the last survey invitation you received? Or maybe a more relevant question: how many survey invitations have you received in the last week? Personally, I cannot even count that number, at least partly because I delete most of them without thinking twice. Why is that my default reaction, though? Four reasons come to mind:
Time: I just do not have the time to do another survey! How would I know that though, especially if I haven’t even opened this one? Maybe it is a brief or easy survey. Unfortunately, we have all started and abandoned far too many surveys that just go on endlessly.
Irrelevance: I just do not have a strong enough opinion about the subject. I might be willing to do this survey, but the less relevant it is to me, the greater the incentive I would need to even consider it. And an incentive has to be meaningful to be at all motivating; 5% off of something is not. Often, the promise of learning the survey’s results is just as unmotivating.
Not interesting: Too often, this is just not a compelling topic for me.
Incomprehensibility: Too many surveys contain questions that are poorly worded, or that require more than a simple answer yet force me into a yes/no response or a Likert scale that does not fit the question.
As a result, part of the problem we create for ourselves is that we only ever hear from the most upset people, the ones who will take the time to make sure their voice is heard!
And to be sure, we are part of the problem. Assessment and accountability are critical for all of us, so, responsibly, we want to understand the impact our efforts are having. Unfortunately, it is never as simple as counting the number of sales or enrollments or responses, because so many things go into that “final” number that it is usually not possible to ascribe the end result to any one thing. And we know from the field of behavioral economics that we often cannot ask people directly what we want to know and expect a true answer, because people are not rational and are likely deluding themselves.
So instead, we are looking to understand the impact (literally, just the impact) that our initiative has had on the decision of our target. What do we do? We ask them what they thought about what we tried, and the cheapest and most efficient way to do that is via a survey. If we play the law of large numbers, we can send out an invitation to literally everyone who might be part of the universe of influence and figure out later how to draw any conclusions that can be extrapolated from a tiny response rate.
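That tiny response rate is exactly where the “we only hear from the most upset people” problem bites hardest. A minimal simulation makes the point; every number here is hypothetical (10,000 invitees, 80% of them actually satisfied, and the upset responding at several times the rate of the satisfied):

```python
import random

random.seed(42)

# Hypothetical population: 10,000 invitees, 80% genuinely satisfied.
population = [True] * 8_000 + [False] * 2_000  # True = satisfied

# Assumed response behavior (illustrative rates, not real data):
# upset invitees are far more likely to take the survey.
def responds(satisfied: bool) -> bool:
    return random.random() < (0.02 if satisfied else 0.15)

respondents = [s for s in population if responds(s)]

print(f"Response rate:     {len(respondents) / len(population):.1%}")   # ~5%
print(f"True satisfaction: {sum(population) / len(population):.1%}")    # 80.0%
print(f"Survey says:       {sum(respondents) / len(respondents):.1%}")  # ~35%
```

Under these assumed response rates, roughly one invitee in twenty answers, and the survey reports satisfaction around 35% for a population that is actually 80% satisfied. No amount of extrapolation fixes a sample that selected itself.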
So, unfortunately, surveys remain an important and valuable tool. The term “survey fatigue” is too generous, though. We are not just fatigued by surveys; we are lethargic about them, “lacking energy or enthusiasm.” The question, then, is how you can improve your response rate and get valuable feedback.
There are some easy answers. You can pay for a higher response rate via incentives; this is why natural incentives (such as “results of the final study”) are so attractive to survey-givers, because they are free to provide. Usually, though, meaningful incentives are budget-prohibitive. Worse, a poorly chosen incentive can undermine your survey: offer something so small or meaningless that, instead of simply declining to participate, your invitees are annoyed or even offended. Alternatively, you can sometimes mandate survey response, particularly if you have a captive audience that you control.
The better answer, however, is to WRITE BETTER SURVEYS. This takes real work on our part. A lazy survey asks a hundred things, trying to find out every aspect of something that we could possibly know. There is some value in this, because the more we know the better we can make decisions, but far more often than not we are asking for additional information that is not genuinely helpful to us, and undermining survey completion in the process. A better survey, a truly effective survey, does the pre-work necessary to really define the question: WHAT IS THE THING THAT WE WANT TO KNOW? Not: What are all the feelings that you had at each stage of this process? We probably do want to know that too, but it is ineffective and unproductive to try to capture all of it at once.
We want to know: How did you find out about us? (Not the lazy form: “Rate the following dozen media options on a Likert scale!”)
We want to know: What can we do better? (Not the lazy form: “How was your experience?” Answer: Fine.)
We want to know: Why did you buy? (Not the lazy form: “How did you feel about each and every step of the process?”)
A good survey is hyper-focused on the single thing we want to know, the one question we’re trying to answer. Taking a cue from the concept of Essentialism, captured so well by Greg McKeown, we can do this in the following way:
Determine that one thing.
Strip everything else away (even if, and especially if, we also want to know that other stuff; there is a better, more effective, non-confounding way to ask it elsewhere or elsewhen).
Ask that one thing in a creative way that allows and compels the respondent to answer authentically.
Consider a classic example from the world of college search: the Admitted Student Survey. This is the survey we send to students whom we have admitted, not all of whom will enroll, and it comes in two versions:
Admitted-not-enrolled:
Why didn’t you take our offer?
Where did you go instead?
How could we have convinced you to come?
PLEASE LIKE ME!
Admitted-enrolled:
What made you pick us?
SO THAT I CAN DO THAT FOR EVERYONE ELSE AND THEY WILL ALL COME TOO (except, in question form)
As anyone who has been involved year after year with these surveys will tell you, the problem is that the consistent answer to both versions is BECAUSE MONEY:
Admitted-not-enrolled:
“I couldn’t afford it.”
“You didn’t give me enough money.”
“Another school gave me more money.”
Admitted-enrolled:
“You gave me a huge scholarship.”
“You’re so cheap I could afford it.”
NONE OF THESE ANSWERS ARE HELPFUL!
We know we have limited resources and that we cannot give people as much financial aid as we would like. We know that the scholarship landscape is incredibly competitive and that other schools are giving what looks like more money, even when their bottom-line cost is still more than a student might pay at “my” school.
We compound the issue when we effectively say: “Hey, while we have your attention, let me ask you about a bunch of other things that I also want to know, even though I want to know them less, but maybe these can help me too...”
Do we have the program you want?
How was your campus visit?
Did you sit in on a class or meet with a faculty member?
Did you like our food?
Do you like our mascot?
And on and on and on (and on and on)...
The amount of actionable information these other questions give me, information that actually helps me understand why you did or did not choose my school, is very small.
By way of concrete illustration, consider the incessant companion of the “why-didn’t-you-choose-us” survey question: “Where are you going instead?” Does that really matter? Knowing our primary competition is important too, but it is not relevant to the question at hand, which is why you did not accept my offer. That extra question inevitably lengthens and complicates the survey. Worse, if I am a student and now I ALSO have to tell you where I will be going instead, that could very well influence my willingness to tell you why I chose not to come (or at least to be honest and forthright about it).
So the classic problem is: how can I ask you why you came or did not come without receiving a default answer of “money”?
Consider this answer:
Obviously, your decision to come or not to come is the combination of many factors. The core question is: what was the primary factor that influenced your decision? It is in knowing and addressing that primary factor that we can mature as an organization, and if we do that successfully, there will always be a new factor emerging as the next most pressing issue to explore:
The admitted-not-enrolled version:
You did not choose my school because it was better for you to go somewhere else. What I want to know is why that was. This is the negative frame for the “biggest influencing factor” question, because you are in the “did not enroll” population: it is enough for me to know the thing that made you least keen on coming here. So I can simply ask you about the most frustrating part of your experience. Your brief answer to that alone will give me ample information, and if I can get a majority of my survey population to answer it, I have a sample large enough to draw real conclusions.
The admitted-enrolled version:
Similarly, your decision to come was because you identified this as the best place for yourself. What I want to know is why that was. Probably there were a lot of things, but what was the primary thing? This is the positive frame for the “biggest influencing factor” question, because you are in the “did enroll” population. So I can simply ask you about the most rewarding part of your experience. If I can identify that primary factor across a large response rate, I can start to see some powerful and actionable patterns. It bears repeating: your answer to that alone will give me ample information, and if I can get a majority of my survey population to answer, I have a sample large enough to draw real conclusions.
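“Large enough” is worth quantifying. Strictly speaking, what a high response rate buys is a small margin of error rather than “statistical significance.” Here is a minimal sketch, using the standard margin-of-error formula for a proportion with a finite-population correction; the population and response counts are hypothetical, and it assumes respondents behave like a simple random sample (which a high response rate makes far more plausible):

```python
import math

# Hypothetical population: 2,000 admitted students.
N = 2000
p = 0.5   # worst-case proportion, which gives the widest interval
z = 1.96  # z-score for 95% confidence

# Standard margin of error for a proportion, with a
# finite-population correction for a small, known population.
def margin_of_error(respondents: int) -> float:
    sampling_error = math.sqrt(p * (1 - p) / respondents)
    fpc = math.sqrt((N - respondents) / (N - 1))
    return z * sampling_error * fpc

print(f"60% response rate (1,200 answers): ±{margin_of_error(1200):.1%}")  # ~±1.8%
print(f" 5% response rate (100 answers):   ±{margin_of_error(100):.1%}")   # ~±9.6%
```

At a 60% response rate, the estimate of “what share of students named this factor” is good to within about two points; at a typical 5% blast-survey response rate, the interval is roughly five times wider, before even accounting for self-selection bias.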
Through this kind of approach, I can ask a single-question survey that is easy to respond to, that people will want to respond to, that will therefore have a high response rate, and that will tell me a ton of information. Ironically, this is less information than a 40-question or 100-question survey might have gathered, but it is more information that I can actually use. And I am not really giving up that much, because almost all of our surveys are cyclical or iterative, so we will have another opportunity to investigate in the very next cycle.
Think of it like a game: am I willing to take on the challenge of figuring out the single most important thing for me to know, and the absolute simplest way to ask it? Conveniently, my need to motivate and incentivize and remind and pester people to answer my survey will go way down, while my response rate goes way up. And more importantly, I will actually acquire valuable information.