How cognitive bias influences design

David Dylan Thomas shares prosocial and inclusive ways to address bias

David Dylan Thomas never set out to become an expert on how cognitive bias influences design. Though it’s a subject on which he has quite literally written the book, his initial interest in it was personal. 

“Years ago, I saw this amazing talk by Iris Bohnet called Gender Equality by Design,” he says. “The talk really got to the core of many of our implicit biases being ruled by pattern recognition. When I saw that something as corrupt as gender bias could come down to something as simple and, dare I say, human, as that, I knew I wanted to learn more.”

And so he did. After seeing the talk, Thomas set a goal to learn about one new cognitive bias every day until he had amassed a near-encyclopedic knowledge of 150+ possible biases. He couldn’t get enough. “I turned into the guy who wouldn't shut up about it,” he admits. He even started a podcast about it—called, you guessed it, The Cognitive Bias Podcast.

Eventually, Thomas’s passion for understanding how these biases influence human behavior did spill over into his work life. Today, he interacts with the concept quite a bit in his work as a creative strategy advocate at Think Company, and he even teaches workshops on inclusive design. 

His book, Design for Cognitive Bias, speaks specifically to how designers can approach their process in a way that addresses and accounts for the inevitable shortcuts our brains take, whether those brains belong to users, stakeholders, or designers. Here, we summarize some takeaways on the topic and how designers can weave them into their work in a way that is both prosocial and inclusive.

Types of bias

We all have a lens through which we understand the world and make decisions. Bias is what happens when that lens gets warped, causing us to make the wrong decisions. Thomas says 95% of cognition happens below the threshold of conscious thought. That’s a lot of decision-making power at work that we’re not even aware of.

Our brains try to assist us in our decision-making by recognizing patterns and creating as many shortcuts as possible. But that's where the problems start. These shortcuts are influenced by what we already assume about the world. And when no one’s thinking about these biases, they will run amok. Let’s look at the different types of biases that can influence the design process.

User bias

Users are, ultimately, driven by the perception of ease. Notice we said “perception.” Even if a great deal of information is relayed to the user, if it’s presented in a way that seems digestible, readable, and straightforward, most people will be on board. Users also tend to find information presented this way more believable.

Where designers can run into trouble is not considering more than one type of user. Think about developing an app. What’s “easy” to a tech-savvy Gen Z user could be impossible to navigate for an older user who’s not as comfortable with new technology. It’s essential to use what we know about user bias (users crave ease) while recognizing, as designers, how our own biases shape what we think of as easy.

Stakeholder bias

When it comes to design, the term “stakeholders” can describe anyone from clients to bosses to entire organizations; that is, anyone a designer works with who ultimately has decision-making power, either directly or indirectly. 

Sometimes, the inability to move a design forward comes down to the seeming impossibility of satisfying each stakeholder’s point of view. Let’s go back to the scenario of developing an app. If you ask someone on the sales team what an app should achieve, they’ll have a different answer than someone on your development team. That’s their bias.

Letting a single stakeholder’s point of view, slanted in favor of a particular bias, drive decisions could spell disaster for your app.

Designer bias

Then, of course, there’s designer bias. This refers to the unconscious patterns designers use while they’re creating something. And few designers are impervious to bias — even those who think they’re neutral fall prey to it. 

“Any design is going to manipulate your users in some way, even if you don’t realize it,” Thomas says. “So it’s important to understand that you have a responsibility to be as thoughtful in your design as possible to make sure that however your design is influencing behavior, it's doing it in a positive, pro-social way.”

Any design is going to manipulate your users in some way, even if you don’t realize it.

David Dylan Thomas | Author, Design for Cognitive Bias

One of the most common biases designers are susceptible to is confirmation bias, which is when someone seeks out information that supports the view they already have. One doesn’t have to think too hard about how dangerous this bias can be when it goes unchecked. Let’s use the app example once again. If a team of five white men is tasked with designing a new app, there’s a real danger they’ll end up making an app whose usability is tailored toward — you guessed it — white men. There’s a pretty good chance you’ll fall prey to confirmation bias if you’re not seeking out diverse points of view along the way.

Understanding and overcoming biases

Now that we have a rundown of the types of bias that influence design, how do we work with and through these biases? Thomas has several tips to consider.

Understanding user bias

It’s important to understand where users’ decision-making influences come from. Knowing that they crave ease, predictability, and believability is something you can lean into for a better experience. Conducting user testing with a diverse pool of test subjects will help steer you in a direction that’s universally beneficial, rather than maddeningly specific.

Address stakeholder bias with a common goal

As we mentioned earlier, stakeholder bias is dangerous when it leans too heavily to one side, especially if that slant goes undetected. Depending on how stakeholders are incentivized, they may have very different ideas of what success looks like, and those ideas can distract them from helping the user. 

A good way to combat this, says Thomas, is to get everyone on the same page from the beginning. From the moment a project kicks off, everyone should know exactly what the goals are and what success will look like. Make sure both the goals of the project and the definition of success are unanimously agreed-upon and clear. This will serve as something you can continuously come back to when there’s disagreement about a design decision. Does the decision serve our ultimate goal? Then we move forward with it. If it doesn’t, we don’t.

Combat designer bias with built-in exercises

Red team, blue team

This exercise works by assigning distinct roles to two separate groups. The “Blue Team” is the design team; they conduct all the research, brainstorming, and wireframe-building. When they’re finished, the “Red Team” comes in and tries to find as many weak spots as they can in the design. Thomas calls this “going to war.”

“They are to find every little unseen flaw, every overlooked potential for harm, every more elegant solution [the other team] missed because they were so in love with their initial idea,” Thomas says. Ultimately, this exercise is efficient and gets as close as possible to removing confirmation bias from the equation. 

Assumption audit

Another exercise Thomas recommends is an assumption audit. The team starts by listing all of the identities represented on the design team, then lists as many identities as they can that are not represented. From here, the team works together to explore how the identities that are (and are not) represented will influence the design. That is, they surface the assumptions, blind spots, and biases that could get in the way of creating a design that is truly prosocial.

The sooner a design team can get all this down on paper and discuss potential problems at length, the sooner they can start proactively thinking of the checks and balances they can put in place to challenge and try to correct those biases along the way. 

Speculative design

In his book, Thomas refers to speculative design as the “abusability test.” Engaging in this exercise involves listing out all the possible ways someone could misuse your product. A “fun” way to approach this, says Thomas, is to pretend you’re generating ideas for Black Mirror episodes. 

Thinking about how someone could take something you’ve made and use it in a harmful way is a great way to check your biases. You can even incorporate this process into the Red Team/Blue Team exercise. 

Build bias awareness into your design process

Thomas believes designers must build bias consideration into their process. 

“Make it a practice,” he says. “Nothing ever gets done by building a committee.” In other words, all the exercises your team can use to find bias should be part of your practice, not something you tack on at the end to say that you did your due diligence. Think about the ways you can build this into your design tools like Abstract. Think of it as ‘Bias QA’. You can explore different paths within your tools and try to ‘break’ them with different biases. 

Whatever exercises you choose, Thomas says the most important way to combat bias is to bring people into the conversation, early and often, whose biases differ from or complement your own team’s. And we may be biased, but we tend to agree with this approach.


David’s book Design for Cognitive Bias is available to purchase through A Book Apart. To learn more, check out David’s website or follow him on LinkedIn and Twitter.