It’s important to find out about your customers from time to time, and forms are often at the heart of this, with surveys being a popular choice. However, there are various pitfalls that are all too easy to fall into. One of the major problems with surveys is called nonresponse bias.
As you might assume from the name, nonresponse bias is all about the misleading results that accrue when some of those surveyed don’t give their responses. There are lots of reasons why this might happen. We’ll go into some of the most common ones before suggesting ways you can stop nonresponse bias creeping into your online survey.
At its simplest, nonresponse bias is about not getting responses to part or all of your survey. There’s a little more to it than that, however: it’s the systematic exclusion of a given demographic or personality type from being able or likely to answer a particular question, or the survey as a whole.
Let’s say you want to find out whether a distribution agreement template you’ve created is easy for the population as a whole to understand. If you send it only to university professionals, you’re excluding responses from the vast majority of the populace.
Another form of nonresponse bias can simply be a deterrent against giving a particular answer, because to do so would, in the mind of the respondent, characterize them as guilty of something.
A classic example is a survey that asks the respondents to answer yes or no to having committed a crime. Let’s say the survey enquires if you have ever shoplifted. There may be anonymity built into the questionnaire so that the respondent can be reasonably confident that their answers won’t be attributable to them.
However, this counts for very little against the nagging fear that the truth could emerge and they’ll be branded a committed thief for stealing gum from a machine when they were 9 years old. The result is that a significant number of people simply won’t answer.
They won’t necessarily want to lie and say they’ve never stolen. They just won’t want to engage with the survey. Perhaps unsurprisingly, this phenomenon is observed much more when the survey is being administered on behalf of the government.
Consequently, you lose that individual’s data. This is damaging in itself - every individual’s data is a precious marketing resource. But it’s also damaging in a macro sense. Your survey’s validity depends upon input from a reasonably representative sample of people.
This case of nonresponse bias will leave you with a sample made up of people who have either never stolen anything or are happy to talk about what they have stolen. This will skew your results.
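To make the skew concrete, here’s a minimal sketch in Python using entirely invented numbers (the 40% true rate and the two response rates are hypothetical), showing how differential nonresponse drags the measured figure away from reality.

```python
# Hypothetical illustration of nonresponse bias (all numbers invented).
invited = 1000
true_rate = 0.40                            # 40% of invitees have shoplifted

shoplifters = invited * true_rate           # 400 people
non_shoplifters = invited - shoplifters     # 600 people

# Assume the sensitive question puts shoplifters off responding:
# only 30% of them reply, versus 80% of non-shoplifters.
yes_answers = shoplifters * 0.30            # 120 "yes" responses
no_answers = non_shoplifters * 0.80         # 480 "no" responses

observed_rate = yes_answers / (yes_answers + no_answers)
print(f"True rate: {true_rate:.0%}, observed rate: {observed_rate:.0%}")
# True rate: 40%, observed rate: 20% - the sample looks far more
# law-abiding than the population actually is.
```

That 20-point gap, produced purely by who chose not to answer, is exactly the kind of distortion the rest of this article is about avoiding.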
The key characteristic of nonresponse bias is that the missing responses have a significant effect on the range of answers you collect. If the survey systematically discourages responses from a particular and relevant group, then you have nonresponse bias.
As we’ve seen, nonresponse bias can be caused by questions that perhaps put the respondent in a morally difficult position. They don’t want to confess to their misdemeanor, but they don’t want to lie either, so they withdraw from the survey altogether.
Another example could be a survey that includes sensitive questions asking people to describe themselves as obese or not. There are very few people who will happily characterize themselves as obese, but they might not want to give misleading information either, so they disqualify themselves from the survey altogether.
Again, bad news for the survey facilitator, who just wants some accurate information. What they end up with is a suspiciously slim average respondent.
As a survey facilitator, you have to think about how you’re going to solicit data from the right people. This means that the time, the place, and the means have to chime with your target population. If you want to get responses from 9 to 5 office workers and you send face-to-face surveyors into shopping malls at 11 am on a weekday, you’re not likely to get much input from people in your target group.
So, how we conduct the survey influences what kind of person responds. If you want responses to include a younger demographic, you should probably avoid a postal or landline survey. Not to generalize too much, but you will probably get a more fruitful crop of respondents if you go via TikTok.
Another invitation problem can stem from failed deliveries. This is, of course, a significant problem with email, due to the overwhelming number of messages that people have to wade through every day. Spam filters take out a good deal of the ones we don’t need to see, but surveys galore often languish among them.
When you're considering the demographics of your target audience, think about which channels will put your survey in front of the right people. Social media platforms or entry-level job sites will reach a younger demographic, for example, and choosing channels carefully increases the likelihood of a balanced sample.
A key requirement of a survey is to be to the point. Nobody wants to waste time with preamble or wordy questions. This is why the abandonment rate with surveys is closely linked to the number of questions being asked.
Those with more spare time are less likely to feel the urge to abandon a survey. This means that a longer survey, requiring more of the respondent’s time, will tend to attract a higher level of response from time-rich people than from time-poor ones.
Sometimes, people abandon surveys not because the questions take too long to answer, but because they simply don’t understand them, and any help facility is inadequate or poorly signposted. It’s a given that people don’t like to be made to feel unintelligent, so they won’t stick around.
Of course, the reason they don’t understand the questions could be that they’re poorly worded. You can work to improve your survey’s accessibility, which will help with securing a balanced sample.
A notable exception to this kind of approach is the industry survey, which will by necessity contain some technical terminology that won’t be understood by the majority. This is OK if you’re only interested in the views of those familiar with the subject in the first place. For instance, you may have a JPG to PDF converter and you want to discover how business users feel about it. You don’t need to explain what it does: it’s a fair assumption that users will know this.
What are you trying to find out? Just as crucially, who are you trying to find out about? A good way of zeroing in on this is to use buyer personas. They help focus your thoughts on likely customers, giving you an accurate picture of your target group.
This is important. No point in having a beautifully designed, valid, and reliable survey that’s engaging people who will never be customers of yours. While you’re finding out about people who want nothing to do with you, you’re ignoring your core demographic. This is a serious case of nonresponse bias.
Ensure that your survey contains some demographic capture element. This way, you can monitor the make-up of your sample. You can then apply a remedy in a timely fashion.
You can also track respondents to an email survey to see who’s responding, and what approach at what time seems to work best, as well as identify where survey abandonment seems to be taking place. You can then act accordingly to balance out your sample.
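As a rough illustration of what that monitoring could look like, here’s a minimal sketch in Python. The age bands, target proportions, responses, and threshold are all invented for the example; it simply compares the make-up of responses collected so far against the profile you were aiming for and flags any group falling behind.

```python
from collections import Counter

# Hypothetical target profile for the sample (proportions are invented).
target_share = {"18-29": 0.30, "30-49": 0.40, "50+": 0.30}

# Age bands captured from respondents so far (also invented).
responses = ["30-49", "50+", "30-49", "50+", "18-29", "30-49", "50+", "50+"]

counts = Counter(responses)
total = len(responses)

for band, target in target_share.items():
    actual = counts.get(band, 0) / total
    # Flag any group more than 10 percentage points below its target.
    flag = "  <-- under-represented" if actual < target - 0.10 else ""
    print(f"{band}: target {target:.0%}, actual {actual:.0%}{flag}")
```

Run against real data, a check like this lets you spot an under-represented group early and adjust your invitation channels before the sample is beyond repair.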
We’ve seen how people might react to being asked sensitive questions. Rather than losing that person altogether, permit them to duck a question and move on to the next one.
If you need a wide variety of respondents, use a variety of invitation techniques. If you want to find out how easy an agent agreement template is to use for a number of different demographics, then you’ll need a battery of different approaches.
Don’t just send out a survey and assume people will fall over each other to complete it. It has to look enticing. What does this mean? Well, unless you can genuinely say that your subject matter is electrifying, you’re probably going to need to appeal to a person’s self-interest.
This might mean promising some material reward, like entry into a draw, or a discount voucher for a shop they’ve indicated an existing preference for. Or it could be something along the lines of ‘Ten questions that will reveal just how the opposite sex feels about you’, which may be collecting serious data but is dressed up as a bit of fun.
Or it could be something really arrestingly worded. For instance, I had an email this morning come in with ‘Watch out!’ in the subject line. It did grab me. OK, when I saw it was from an insurer I soon found something else to do, but it had me for an instant.
Another way to secure engagement is to give the respondent reassurance that the entire survey operation is legitimate. You can use trust badges to help with this.
There’s so much valuable information out there that it can make marketers feel giddy with opportunity. However, as we’ve seen, it’s all too easy to end up with results that are misleading due to sample skew. And there’s only one thing worse than no information: the wrong information.
Thankfully, there are means to go about protecting against this, and it’s down to you to build these safeguards in. That way, you can be reasonably sure that your expensive and time-consuming survey is as valuable as it should be. Now go get that data!
Zuko is the most powerful form analytics platform available on the market. Find out how to improve your form and checkout conversion by taking a product tour.