Human Centric helps organisations become more effective and inclusive through addressing bias. But what is bias? How is bias affecting our decision making and our lives (at work)? What can we (not) do to address bias? Volumes have been written about these questions, so we cannot give you a complete guide here. What we can do is give you a brief overview with some accessible reference materials.

What is bias?

The term “bias” is used in many contexts and with many definitions. Here we would like to distinguish between implicit (or unconscious) bias and explicit bias. We also need to differentiate between cognitive bias and social bias.

Bias is defined as: “an inclination or prejudice for or against one person or group, especially in a way considered to be unfair.” If a person is aware of their bias, we speak of an explicit bias. When a person is unaware of their bias, we speak of an implicit bias, also referred to as unconscious bias. Here we focus on implicit or unconscious bias.

Cognitive biases are described as systematic patterns of deviation from norm or rationality in judgment. They are the ways in which the mind takes shortcuts to process information. We are often unaware of these shortcuts, so they tend to be associated with unconscious or implicit bias. Confirmation bias, the availability heuristic, and stereotyping are examples of cognitive bias.

Cognitive implicit bias, as well as explicit bias, can influence attitudes towards individuals or groups. When this happens we talk about social bias. Gender bias, racial bias, socio-economic bias, and ageism fall in this category.

It is important to note that everyone is biased. This is not a bad thing. It is a side-effect of our automatic decision making that is necessary for us to live and survive.

To understand bias better and the implications of bias in the real world, we start with the seminal work of Daniel Kahneman: Thinking, Fast and Slow. Then we will look at the Cognitive Bias Codex made by Buster Benson and will look at four examples of cognitive bias. The next section will look at the social aspect of bias, as we delve into the implications of bias. 

Thinking, Fast and Slow by Daniel Kahneman

How does cognitive bias occur? Kahneman introduces the concepts of system 1 (fast thinking) and system 2 (slow thinking). System 1 is fast, automatic, frequent, emotional, stereotypic, and unconscious. System 2, by contrast, is slow, effortful, infrequent, logical, calculating, and conscious.

Without system 1, we could not function, because our conscious brain could not process all the information or make all the necessary decisions fast enough. Think, for instance, about braking in your car: you have taken action before your conscious brain even realises it. Yet system 1 is vulnerable to cognitive biases and heuristics. In other words: we use shortcuts. These shortcuts impair our decision making, often without us being aware of it. Therefore, they are referred to as implicit or unconscious bias.

To check whether system 1 is correct, we need to switch to system 2. However, that is a difficult thing to do because system 2 has limited capacity. We notice this, for instance, when we try to control our impulses (like eating chocolate) while system 2 is busy with something else entirely. We tend to rely on system 1 even more when we are stressed or in a hurry. So, we need to become aware of our own biases and learn when to switch to system 2.

Three examples of the way system 1 tricks us

The cost of a ball riddle

See if you can answer this riddle: A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?

Find the right answer in the Google Talk below.
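If you would rather check the arithmetic yourself (spoiler ahead), the riddle reduces to a small system of equations. Writing the two prices in dollars:

```latex
\begin{align*}
\text{bat} + \text{ball} &= 1.10 \\
\text{bat} &= \text{ball} + 1.00 \\
(\text{ball} + 1.00) + \text{ball} &= 1.10 \\
2\,\text{ball} &= 0.10 \\
\text{ball} &= 0.05
\end{align*}
```

The intuitive system 1 answer, $0.10, fails the check: the bat would then cost $1.10 and the total would be $1.20. Only by slowing down into system 2 do most people arrive at $0.05.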

(Mis)use of context


Substituting questions


Summary of Thinking Fast and Slow, by XXX

Google Talk by Daniel Kahneman on Thinking, Fast and Slow

The Cognitive Bias Codex by Buster Benson

There has been a lot of research on cognitive bias, particularly in the fields of psychology and economics. Many biases have been identified and defined; so many, in fact, that it is difficult to structure them. There is no perfect way to do so, but it is a useful exercise nonetheless.

Here we present the Cognitive Bias Codex put together by Buster Benson. He grouped the roughly 175 cognitive biases he found on Wikipedia around the problems they try to solve. This helped him understand why they exist, how they are useful (yes, they are also extremely useful), and the mental errors they introduce. He defines four problems that our brain tries to solve; in solving these problems, bias occurs.

The four problems as defined by Buster Benson

1. Too much information

There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely going to be useful in some way.

2. Not enough meaning

The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know, and update our mental models of the world.

3. Need to act fast

We’re constrained by time and information, and yet we can’t let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.

4. What should we remember?

There’s too much information in the universe. We can only afford to keep around the bits that are most likely to prove useful in the future. We need to make constant bets and trade-offs around what we try to remember and what we forget. For example, we prefer generalizations over specifics because they take up less space. When there are lots of irreducible details, we pick out a few standout items to save and discard the rest. What we save here is what is most likely to inform our filters related to problem 1’s information overload, as well as inform what comes to mind during the processes mentioned in problem 2 around filling in incomplete information. It’s all self-reinforcing.

Cognitive Bias Codex

Four examples of bias

Descriptions adapted from Wikipedia and Buster Benson

Confirmation bias

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values. People tend to unconsciously select information that supports their views, but ignore non-supportive information. People also tend to interpret ambiguous evidence as supporting their existing position. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs.

Changing one’s opinion is difficult and tiresome – and a system 2 task. So, it is easier to look at the information that confirms your existing opinion and ignore or not look for information to the contrary. In a world with already too much information it seems only logical to do this. Yet, it may adversely affect (strategic) decision making and opinions about people. 

Availability heuristic

The availability heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important, or at least more important than alternative solutions which are not as readily recalled. Subsequently, under the availability heuristic, people tend to heavily weigh their judgments toward more recent information, making new opinions biased toward that latest news.

Nobody knows everything, in fact, one individual knows very little relative to all available information. We have little choice, but to only consider the information that comes to mind for most decisions. However, it is worth remembering to look beyond the immediately available information for the decisions that matter most.


Stereotyping

In social psychology, a stereotype is an over-generalized belief about a particular category of people. It is an expectation that people might have about every person of a particular group. The type of expectation can vary; it can be, for example, an expectation about the group’s personality, preferences, appearance or ability. Stereotypes are sometimes overgeneralized, inaccurate, and resistant to new information (and sometimes accurate). While such generalizations about groups of people may be useful when making quick decisions, they may be erroneous when applied to particular individuals and are among the reasons for prejudiced attitudes.

In the absence of detailed information, stereotyping thus helps the brain ascribe meaning to information by filling in the gaps with assumptions. Stereotyping is a large contributor to social bias, such as gender and racial bias.

In-group bias

In-group favoritism, sometimes known as in-group–out-group bias, in-group bias, intergroup bias, or in-group preference, is a pattern of favoring members of one’s in-group over out-group members. This can be expressed in evaluation of others, in allocation of resources, and in many other ways.

People look more favourably on members of their in-group and are better able to differentiate skills and characteristics among people who are part of the in-group. People in an out-group are more likely to be stereotyped and affected by other biases. As such, addressing in-group and out-group dynamics is key to creating inclusive cultures.

How is bias affecting our decision making and our lives (at work)?


The previous section described what bias is and how it arises. Here we look at how bias plays out in practice.

Stories about bias

Implicit Bias - how it effects us and how we push through by Melanie Funchess
Practical diversity: taking inclusion from theory to practice by Dawn Bennett-Alexander

An example: gender bias


Bias in organisations

What can you do to address bias?

As an individual


As an organisation

Decision making in organisations is affected by bias in employees. Key processes such as recruitment, performance evaluation and promotions, strategic decision making, and the development of algorithms (automated decision making) are all affected by bias. In addition, group dynamics are heavily affected. What can you do to address bias?

Here we briefly discuss our recommendations. Essentially, the recommendation is the same as for individuals: take conscious and structured decisions. In other words: switch to system 2 where it matters. 

1. Create a deliberate strategy and use data appropriately

What are your goals? What do you want to achieve and why? Where does bias influence decision making and behaviour most? What are you going to do to address this? Answering questions like these in a structured manner is the first step in addressing bias within an organisation. Any and all actions to address bias need to be linked to a deliberate strategy for them to be effective.

The use of data has its own pitfalls (privacy, quality, framing), but data can be useful for looking at the situation more objectively and realistically. For instance, say your goal is to increase diversity in teams by attracting more women. Data can help you determine whether there are enough qualified women available and help you analyse how their careers develop within your organisation.

Most strategies will likely include: increasing awareness of (personal) bias in individuals and teams, reducing bias in key decision making processes, fostering inclusive behaviour, and creating a learning culture.

2. Increase awareness


3. Reduce bias in key decision making processes

Which decision-making processes matter most? Where are they most vulnerable to bias and groupthink? How can these be addressed effectively?

4. Foster inclusive behaviour and a learning culture

