Surveys seem to be a constant in life at the moment! Like the old quote about the only certainties being death and taxes—it seems that everyone (and their dog) would like you to just fill in this quick survey or answer a short series of questions! As enablers of change, surveys are an important part of our toolkit. So just what do we need to know when designing a survey? How can we be sure we will collect the data we need? In this episode we’ll explore how to design a decent survey.
There are a few key steps to designing a decent survey. The first step is to be clear about the survey context and purpose: who the survey is for, why we are running it, and what we are looking for in the results. The second step is to determine the target population and sample size: being able to describe what the target population actually looks like and how many responses we want. The third step is to decide the questions and do a pre-test, and only then can we finally distribute the survey. Of course, the actual final two steps are analysing the results and then presenting the findings to whoever wanted the survey in the first place, but we are focusing on the first three steps at this stage.
We wanted to highlight that designing the questions is actually step three! This is a bit like designing an extension program: we tend to jump to the fun part (running the extension activities, or in this case, designing the survey questions) without considering the first two steps, which are critical to the survey actually being useful! So now we will go into these steps in a bit more detail.
The first step is to be clear about the survey context and its purpose. Ideally we will have spent time designing our project and have an evaluation framework developed, both of which we have covered in earlier episodes. This means we will know what data we are looking for and who the survey is for. Sometimes though, we are called in to help design a survey and this preparatory work hasn’t been done. In this case we suggest sitting down and working through this with the project team, or whoever has asked for the survey, to get all the background we can before we head to the next step.
The second step is to determine the target population and sample size for the survey. This is where we want to be able to describe the target population we are trying to survey. Are we able to contact all of the population or will we need to try and representatively sample the population? At this point we need to ask how accurate we want the survey to be. The more people we survey, the more representative the data we collect will be—but how much money and time do we have? If we think about a census-type survey, undertaken by the government in both Australia and New Zealand every few years, then we will need lots of time and other resources to get to 100% of the population (or as close to that as possible!). But in our work that is seldom possible. Do you have the contact details for 100% of the population? Maybe not, so we need to figure out how to get a representative sample.
To help work out the number of surveys we need, there’s a handy sample size calculator on the SurveyMonkey website. We just need to select our population size, the desired confidence level and the margin of error we’re willing to accept, and the clever monkeys calculate the required sample size. As an example, if our population size is 220 growers, and we want 95% confidence that our sample reflects the overall population, and we are willing to accept a margin of error of 10%, then the sample size required is 68! However, if we were able to get 100 responses, the margin of error would drop to 7%. To get down to 5%, we would need 141 responses. Our rule of thumb is to aim to receive at least 100 responses that we can then analyse, as that is often representative of the population we’re sampling.
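If you are curious what is happening behind the calculator, these numbers can be reproduced with the standard Cochran sample size formula plus a finite population correction. Here is a minimal Python sketch (the function names are ours, not SurveyMonkey's):

```python
import math

def sample_size(population, margin_of_error, z=1.96, p=0.5):
    """Required sample size for a proportion, with finite population correction.

    z = 1.96 corresponds to 95% confidence, and p = 0.5 is the most
    conservative assumption about how varied the responses will be.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # Cochran formula
    # Finite population correction: small populations need fewer responses.
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

def margin_of_error(population, responses, z=1.96, p=0.5):
    """Margin of error achieved by a given number of responses."""
    fpc = math.sqrt((population - responses) / (population - 1))
    return z * math.sqrt(p * (1 - p) / responses) * fpc

print(sample_size(220, 0.10))              # 68 responses for +/-10%
print(sample_size(220, 0.05))              # 141 responses for +/-5%
print(round(margin_of_error(220, 100), 2)) # 100 responses gives about +/-7%
```

This matches the worked example above: 68 responses for a 10% margin of error, 141 for 5%, and roughly a 7% margin of error from 100 responses.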
The last thing we need to consider for this step is how we are going to administer the survey. There are a few different ways we can do this. We can survey via the web and email; we can do a mail survey, a phone survey, or we could even survey face-to-face. The response rates differ depending on the way in which we do the survey. Typically online and email surveys have low response rates, maybe 5 to 15% depending on the population we are surveying. This may not be a problem; it just means that we might need many more contacts to get the number of responses we need for the representativeness we are aiming for. Mail surveys can also have low response rates, anything from 10 to 30%. Phone surveys are better, and of course face-to-face response rates can be higher again. But time and resources are generally a constraint. As a rule of thumb, based on our and others’ experience, conduct phone surveys with farmers, but use other survey options for rural professionals like vets and agribusiness suppliers.
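A quick back-of-the-envelope calculation shows how the response rate drives the number of contacts needed (the function name and the 50% phone response rate below are just our illustrative assumptions):

```python
import math

def contacts_needed(target_responses, expected_response_rate):
    """How many people to contact to hit a target number of responses.

    expected_response_rate is a fraction, e.g. 0.10 for a 10% response rate.
    """
    return math.ceil(target_responses / expected_response_rate)

# To receive 100 responses at a 10% email response rate, we would need to
# contact around 1,000 people; at an assumed 50% phone response rate, only 200.
print(contacts_needed(100, 0.10))  # 1000
print(contacts_needed(100, 0.50))  # 200
```

This is why a low response rate is not necessarily a problem, as long as the contact list is big enough to absorb it.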
All of this discussion is important because of the non-responder bias! That is the difference between those who complete our survey, compared to those who do not. There will always be some bias, and it is just the fun part of working with humans! But we can figure out how big that bias is by checking with some non-responders. In practice this means doing things such as phoning a small sample of people who did not respond to an email or mail survey and doing the survey with them to determine whether they are very different from those who did respond. Another way of checking this could be to compare the demographics of responders with the demographics of the population as a whole (if this is available). For example, if the average age of survey respondents is 32, and the average age of the population as a whole is 51, then you could deduce that you have a biased sample. As a result, we might not be discovering the true needs of older members of the population.
It is also worth mentioning incentives at this point. Offering an incentive to fill out a survey could make a difference. We are not going to explore this in detail now, but we will do an episode on it in future, as there is some great research out there investigating the role of incentives in increasing survey responses. For now, we think it is important to consider whether we can offer an incentive to complete the survey.
Now, (at last!) the final step is to start designing the questions, the bit we were really wanting to get our teeth into! When we get to this part of designing a survey, because we have been through the first two steps, we will know what we want to ask. We will look at the change we were aiming for and our evaluation framework. If we have been aiming for KASA change (that is, knowledge, attitude, skills and aspirational change), we will not ask questions about practice change. However, if the outcome for our project was practice change, then we will ask specific questions about this. Finally, we need to do a pre-test of the survey with some friendly respondents. First, this tests whether the questions make sense to them; second, it allows us to test how we will analyse the data collected from these questions. Please do not forget this important step!
There are a few other useful rules of thumb when designing a survey. Here are a few we think are worth considering.
- Do not ask too many questions! Use the minimum number of questions needed to collect the information required. A rule of thumb is to keep to fewer than ten questions.
- Decide whether the survey can be anonymous. Do we really need to track people’s responses throughout a project? If yes, then we might need to ask for names or get them to nominate a code word they can use for any subsequent surveys. And if we do need to collect names, we will ask for this at the end of the survey and explain why we need it! If we do not need to track people, then we can make the survey anonymous.
- Use preset answers where possible, like having a list of multiple-choice responses, although we will always add the option for respondents to provide their own answer if they want. The aim is to make it as quick and easy as possible for someone to fill in the survey.
- Think about the scales to use. We have found that people tend to use either a 1 to 5 or a 1 to 10 scale for responses. Either will work, although scales of 1 to 10 provide greater sensitivity of responses, while a 1 to 5 scale could be easier for respondents to fill in. There does not appear to be a hard and fast rule on this, so consistency is the key here!
- When it comes to demographics, be ruthless about what is really needed. Often these questions are added to surveys without much thought about how they will be used. Only ask a question if the answer will be used!
- We think that adding a final open-ended question at the end of a survey, inviting respondents to add anything else they would like to tell us, is really helpful. Although many respondents will not have anything to add, sometimes the answers to this question contain useful information that can inform the next survey, or provide a prompt to follow up on a particular aspect of the project in more detail.
- Do not forget that a key part of the survey is the introduction! This is the little preamble that explains what we’re doing and why, and outlines whether the survey is anonymous or not. This is important… use manners and be polite!
- And finally, it is a useful exercise to challenge ourselves with the following: if we could only ask one question, what would that question be? This gets to the heart of what we’re designing the survey for and can help us decide on the critical questions.
So, designing a decent survey involves being clear about the survey context and purpose, defining the target population and sample size, then deciding on the questions and conducting a pre-test. Following this process means we will have a great survey to distribute and data that we can confidently analyse to obtain the information we need.
To finish, we want to say thanks to Jeff Coutts, as we have been heavily influenced by a webinar he did with John a few years ago on Designing effective surveys in three easy steps. Jeff is a bit of a legend and has a huge amount of knowledge that we have drawn on for this episode. Thank you Jeff!
As always, we do not want this just to be a one way conversation, so please add any survey tips you have below this post. We enjoy reading and responding to your comments—thanks to those who take the time to respond with their thoughts and ideas!
Thanks for reading this enablers of change blog post! Remember to subscribe if you would like to know when new episodes are available, and send us an email or use the comment section below if you have a topic you’d like us to explore.