How cognitive bias skews design reviews

Published by James Archer and Jessica Schultz

Being human means we’re not always logical. Our brains have developed mental shortcuts that help us understand and simplify the complex world around us, but unfortunately, those same shortcuts can often cause problems for us as well.

Psychologists refer to these mental patterns as “cognitive biases”: individual beliefs and tendencies that influence our perceptions and reactions, and that are often the culprits behind inaccurate judgments and irrational responses.

These cognitive biases can make it difficult to review design work objectively and effectively, especially for those less familiar with design reviews, and they can make the process harder than it really needs to be.

Here are a few of the hundreds of cognitive biases that shape how our brains work, and some insights into how they can affect the design process.

Judgment Errors

Anchoring effect: The first thing we see in a sequence sets the baseline for our perceptions of what follows. We fixate on that initial value and then compare all following options to it. Restaurants use this to their advantage by highlighting expensive entrees or specials, which by comparison makes the less expensive options look more reasonable. Unfortunately, this can make it difficult for people to review designs that have evolved from their initial state (“I thought it was going to have two columns”), or that differ significantly from what they had envisioned before even seeing the design (“I pictured it having more photos”). We get attached to the first picture in our head, and often don’t give subsequent designs a fair chance.

Availability heuristic: We tend to base our decisions on the quick information or stories that come to mind first (“availability”), rather than on an objective review of all the available data, and we overestimate the importance of whatever comes to mind most easily. This is the “my grandpa smoked and he lived to be 98” fallacy, in which the one story someone knows is given priority over ample evidence to the contrary. This often comes up when discussing strategic decisions, in phrases like “We tried that already and it didn’t work” or “Our competitor does it and it seems to be working for them.”

Survivorship bias: Sometimes the examples we think about when making decisions are skewed because we’re only looking at the surviving examples. We fail to consider the examples that didn’t make it, because they’re much less visible (or not visible at all). We often see this bias with technology startups, which point to the spectacularly successful examples but neglect to consider what can be learned from the vast majority of similar companies that followed the same practices and quietly failed.

Groupthink

Bandwagon effect: Often referred to as “groupthink” or “hive mind,” this is the tendency for the odds of someone embracing an idea to increase with the number of people who already hold it. We feel safe in crowds, and we often align ourselves with the perceived majority without even realizing it; an idea naturally becomes more attractive when everyone else is doing it. This is how design trends get started, with entire industries sometimes adopting identical design conventions, fueled by the fact that everyone else in their industry is doing the same (and not questioning whether it’s actually a good idea). This happened with the famous “hamburger menu,” a design convention that achieved nearly universal adoption in web design despite generally hindering user navigation.

Confirmation bias: We tend to favor people and ideas that agree with our existing beliefs. Our brains love this because surrounding ourselves with familiar ideas requires significantly less cognitive work than dealing with things that are new and different. Unfortunately, this bias often trips people up in the design review process because it makes them less likely to accept unfamiliar approaches, new insights, or techniques they haven’t seen before. It’s easy to get people to accept design work that matches their preconceived notions, but much harder to get them to accept new and better ideas (which is ultimately what they hire us for).

Ingroup bias: We love to belong to groups, and our brain often relies on groups to help steer our decisions. The tighter the bonds inside the group, the more trouble we have accepting ideas and opinions from the outside. We overestimate the abilities and values of our trusted circle of colleagues, and become skeptical and even afraid of ideas from outside. This can come up in situations where carefully planned design strategies are overturned by the client based on an off-the-cuff comment from a board member, employee, or friend.

Projection bias / false consensus bias: These two are closely related. Projection bias is the assumption that other people hold the same beliefs or attitudes we do (even when that’s unlikely), and false consensus bias is the tendency to overestimate the degree to which our beliefs and attitudes are typical of others. However you slice it, this is a dangerous one in the design process because it comes out as “I like this, therefore everyone will like this.” It’s one of the most difficult biases to deal with during design reviews because it tends to stick no matter how much evidence we provide to the contrary.

Viewpoints

Selective perception: Related to confirmation bias, this cognitive bias reflects our tendency not to notice (or to more quickly forget) ideas that cause emotional discomfort or contradict our existing beliefs and attitudes. For example, a dedicated sports fan might not notice her own team’s violations, but would certainly notice and get upset about violations by the opposing team. This sometimes comes up early in the design review process when reviewing wireframes, rough concepts, etc., where people might address the issues with which they’re comfortable (like color, for example), but overlook other issues they subconsciously have concerns about but aren’t ready to notice yet. Unfortunately, those problems then come back later in the project, when it’s more difficult to resolve them.

Bias blind spot: More than 85% of Americans believe they’re less biased than the average American. Obviously, those numbers don’t add up, and the gap consists of people exhibiting the “bias blind spot”: easily seeing the bias in other people’s decisions while failing to recognize or acknowledge it in their own. This makes it that much more difficult for people to understand where their feelings, responses, and reactions to design are coming from, because they’re unlikely to attribute them to the cognitive biases that are often the real cause.

Decision Making

Choice-supportive bias: We tend to feel positive about our choices, even when they have flaws. For example, someone might love their dog and think highly of it, even though it has a habit of biting people. This is similar to post-purchase rationalization, in which we subconsciously justify our purchases. During design reviews, this can cause problems because people tend to want to reinforce their earlier decisions instead of acknowledging potential problems.

Ostrich effect: Our brains find it easiest to ignore bad news and negative information, because burying our head in the sand is easier than facing the things we don’t want to hear. For instance, investors check their accounts less frequently when the market has taken a downturn. This sometimes manifests during design reviews when we have to warn a client that something might take longer than they want, will cost more than they’re expecting, or may not work the way they’re hoping. Instead of acknowledging the issue and changing course accordingly, people are sometimes tempted to just keep going and hope it all works out.

Illusions & Expectations

Information bias: Sometimes we seek the comfort of additional information, even when it’s not likely to change our minds about anything. We want to make sure we’ve found every possible detail about something. However, more information is not always better, and interestingly enough, people can often make more accurate predictions with less information. During the design process, this can manifest as wanting to dive too deep into something, to explore it from every angle, and to have a design completely planned out before implementing it, when in reality it’s often more efficient and practical to rough something out and then iterate on it later based on actual user behavior.

Observational selection bias: When something is on your mind, you’re more likely to notice it when you come across it. It’s that feeling you get when you buy a new car and suddenly see it everywhere. We perceive an increased frequency of certain events, but we’re really just noticing them more because they’re on our minds. This sometimes happens in design reviews where we’re trying to convince the client that something isn’t a good solution, but because we’ve talked about it they notice that solution more frequently in other contexts, giving them the feeling that everyone’s using it.

Outcome bias: On the surface, it seems to make sense to judge a decision based on its outcome. However, that’s not always a good idea in practice, because sometimes you make a great decision and it still works out poorly, and sometimes terrible decisions can work out well anyway. This often comes up in the design process in statements like “Well, XYZ company did it, and they’re doing well,” that lead clients into making bad design decisions.

Probability neglect: People sometimes disregard the actual probability of something happening, and are more swayed by the emotional or dramatic weight of the event. For example, far more people are afraid of dying in an airplane crash (761 deaths last year) than in a car crash (1.2 million deaths last year). Similarly, people can spend inordinate amounts of time during the design process worrying about rare-but-evocative use cases while failing to direct their attention to the most common, mundane ones.

Novelty

Pro-innovation bias: While many people are hesitant about change, some get so excited about new ideas that they hastily adopt them despite potential shortcomings. Design clients are sometimes so enamored with things that are fresh and new that they neglect tried-and-true design patterns that might perform better. This happened with the widespread adoption of the “hamburger menu”: it promised to solve the difficulty of adapting navigation menus to mobile environments so neatly that people overlooked its otherwise self-evident design shortcomings.

Recency bias: The thing that just happened always seems more important than the thing that happened six months ago, which can lead people to make constantly shifting decisions when it comes to design. We often have to spend a lot of time reminding people of their own decisions from early in the project, and trying to discourage them from totally changing their strategy based on something they read about yesterday, something a customer just told them, etc. While it’s good to be flexible enough to adapt to an ever-changing business environment, it can also be toxic to keep changing your mind based on the most recent thing that happened.

Risk & Loss

Status-quo bias: “If it ain’t broke, don’t fix it.” Our brains love consistency, and if something seems to be working okay, we’re very reluctant to change it out of fear that things might become worse. This can make it difficult to convince clients to try ideas that we believe will perform better for them, or to do something significantly different than the norm in their industry.

Loss aversion bias: The idea of losing something is often terrifying to us, to the point that we’ll work harder to avoid losing something than we will toward acquiring gains. This is a huge issue in the design process because clients will often be unwilling to do anything that might lose existing customers, even if that change might bring them twice as many new customers. The math clearly favors the change, but it’s difficult to shake that fear of losing what you already have.

Conclusion

These are just a few of the hundreds of mental shortcuts that can get us lost in the design process.

Understanding how your brain works, and learning to identify when you’re tricking yourself, can go a long way toward improving the design decisions you make over the course of a project.
