
Reducing our biases

How many biases do you think we have? Whether we like to admit it or not, we all have biases that affect aspects of our lives and the lives of the people we interact with—but it turns out there are ways to at least minimise them. This is important for us as enablers of change because we are often working with people in a professional setting, and without knowing it we may be letting our rules of thumb incorrectly guide us. We also use surveys a lot (we talked about how to use them in a previous episode). Unfortunately, though, our biases can affect the results and our interpretation of them.

Did you know there are over two dozen different ways that we’re biased? Let’s start with our conscious biases—the things we openly know we are biased about. John says he much prefers dark chocolate to milk chocolate. For him it is all about the taste: he likes the slightly bitter but rich taste of dark chocolate, while milk chocolate is often just too sweet. So if John is shopping, he will be biased towards buying a rich, dark chocolate. Is that wrong? No, because it does not really negatively affect other people, and at least he is aware of it! Denise happens to share that bias, although she confesses that as long as it is good quality (i.e. high cocoa solid content) she will happily eat milk chocolate!

However, it is unconscious biases we need to be more careful about—the ones that unknowingly affect our decisions and actions. The ones we do not realise we have. These cognitive biases are shortcuts in our thinking—which are often really useful—but that can backfire on us! 

Just to get you thinking about this, here is how our unconscious bias can take effect. We recently celebrated International Women’s Day, so it’s appropriate to begin with a common bias—gender bias. This is where we have a mental model of the type of role a person of a particular gender most commonly undertakes. If we are in a hospital waiting room and we are told a doctor will see us shortly, and a male and a female enter the room both wearing hospital scrubs, some would naturally assume the doctor was the male. And in the same situation, if we were told a nurse will come and change our bandages soon, some would expect the female to undertake this role. Of course these assumptions may well be wrong, and if we questioned the people making them, they would probably freely admit that doctors and nurses can be male or female. Yet we jump to that gendered assumption because it is quick and easy, and we did not have to think about it.

We used gender as the last example, but it could be age, appearance, religion, weight, or even someone’s accent—the list goes on and on. The important first step is to admit that we have these biases, otherwise we will have a blind spot that will affect our decisions and actions. We are human, and we all make assumptions that make us less rational and less impartial.

Back in 2019 at the APEN conference in Darwin, one of the keynote speakers was Peter Ellerton. Peter is a lecturer in critical thinking at the University of Queensland and the founding director of the Critical Thinking Project. His talk was really interesting, and he highlighted a neat website that details 24 common biases that affect our thinking. We thought it would be worth highlighting a few of these as part of this episode!

First up is confirmation bias. This occurs because we tend to favour information that confirms our existing beliefs! In fact, we can often actively seek and remember information that confirms our existing perceptions. As an example, given we have recently been talking about designing a decent survey, confirmation bias when developing a survey could lead us to just use a multi-choice question that lists the responses we expect. Instead, we could use an open-ended question with a comment field. This does mean a bit more work when analysing responses, but it allows those responding to add different beliefs or ideas. A good compromise we have found is to list the responses we expect will be most common in a multi-choice question, but then add a comment field where the respondents can add in their own responses. 

A related one is belief bias, where we are more open to accepting information that supports our existing beliefs, but resist considering the merits of evidence that contradicts them. Using surveys as an example again, this could mean we tend to believe the survey results that provide evidence supporting our beliefs, and may overlook comments that do not support our position.

Next up is recency bias, where what we think is relevant is heavily influenced by our most recent experiences or memories. The best way of countering this bias? Look for data! We think that is a key job for enablers of change! But then we should be careful when we get that data… anchoring bias can occur, where the first piece of information we receive anchors our judgement of any subsequent information. There are many other biases referenced on the cognitive bias website, so it’s worth a look!

However, let’s continue our focus on bias in surveys. Non-response bias occurs when the answers of the survey respondents differ from the potential answers of those who did not respond. The non-respondents may be unwilling or unable to respond to the survey due to a factor that makes them different from those who did respond—an obvious example with online surveys being a lack of computer skills or access to the internet. We talked about a way to test this in a previous episode: simply contact a sample of the non-respondents, in this case perhaps by telephone, ask them the survey questions, and see whether their responses are significantly different from those of the original sample.
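For those who like to check this formally, the comparison between the original respondents and the telephone follow-up sample can be done with a simple chi-square test. This is a minimal sketch, assuming a three-option survey question; the counts below are made up purely for illustration:

```python
# Hypothetical check for non-response bias: compare the answer
# distribution of the original online respondents with a telephone
# follow-up sample of non-respondents. All counts are invented.
from scipy.stats import chi2_contingency

# Columns are the answer categories (e.g. Yes / No / Unsure)
online_respondents = [48, 30, 12]
phone_followup = [15, 22, 8]

# Chi-square test of independence on the 2 x 3 contingency table
chi2, p_value, dof, expected = chi2_contingency(
    [online_respondents, phone_followup]
)

print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Responses differ significantly: possible non-response bias.")
else:
    print("No significant difference detected between the two groups.")
```

A significant result suggests the non-respondents would have answered differently, so the original survey results should be interpreted with caution.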

We should also be aware of social desirability bias—the tendency of survey respondents to answer questions in a manner that will be viewed favourably by those reading them. This can lead to an over-reporting of the supposed good behaviour and under-reporting of the supposed bad behaviour. One way to reduce the likelihood of this bias is to use anonymous surveys. Another is to ask questions related to a group of people, for example “To what degree do you think your neighbours correctly separate the rubbish in their recycling bins every time they do it?”. 

Finally, another one to consider is the better-than-average bias or egocentric bias. This is where people are likely to rate themselves as better than others, whether it be their intelligence or sporting prowess. A recent piece of research from Kim et al. (2017) provided some useful explanation of this. They explored what people thought of as ‘average’ and found it was often interpreted as below-median ability. 

To illustrate this, in one interesting study the Swedish psychologist Ola Svenson asked American drivers to compare their skills with those of other drivers, and 93 percent of them claimed their skill put them in the top half of all drivers (Svenson, 1981). The way around this for us when devising surveys is to have clear definitions of what we mean. In this case, we might refer to the number of accidents they have had in the preceding five years, or to a specific skill such as reverse parallel parking.

So, in conclusion, let’s make an effort to be aware of our different biases, and how they might affect other people and the data we collect. It’s just another thing we need to think about to be effective enablers of change!  

Well, you have read our thoughts, now we would like to hear yours! Add a comment below and tell us about your experiences with biases, including any tips and further ideas about identifying and then reducing them. We do not want this to be just a one-way conversation, so join in by sharing your thoughts and ideas with us!

Thanks folks for reading this Enablers of change blog post. Remember to subscribe to our newsletter if you would like to know when new episodes are available. And if you liked what you read, please tell your friends so they can join the conversation!


Kim, Y. H., Kwon, H., & Chiu, C. Y. (2017). The better-than-average effect is observed because “average” is often construed as below-median ability. Frontiers in Psychology, 8, 898.

Svenson, O. (1981). Are we all less risky and more skillful than our fellow drivers? Acta Psychologica, 47(2), 143–148.

Graham Harris
2 years ago

A great reminder for us all – unconscious bias can be a real trap in our R,D&E efforts. Thanks for the reminder.

Kamru (Md Kamruzzaman)
2 years ago

- Glad to learn about the number of biases we hold, both consciously and unconsciously.
- It is also good to have an “other (please specify)” option in multiple-choice survey questions.
- Including open-ended questions in a survey is also worthwhile.
- I would say mix quantitative and qualitative data collection methods to reduce bias and cover a broader perspective of the research question.
