10 Decision-Making Errors that Hold Us Back at Work


Nov 02, 2022

“Here’s a right-angled triangle. If one side has a length of 5, and the other 12, what is the length of the third side?”

This is the kind of math question you’ll usually find at the end of the textbook chapter on Pythagoras’s theorem. You’re well equipped to solve it — after all, you just finished learning the requisite skills, and there is only one correct answer.
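As a quick refresher, the worked answer below assumes the two given sides are the legs (the sides forming the right angle), so the missing side is the hypotenuse:

c = \sqrt{a^2 + b^2} = \sqrt{5^2 + 12^2} = \sqrt{25 + 144} = \sqrt{169} = 13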

Contrast this with the kinds of problems we solve in the workplace. Early in our careers, we’re faced with relatively straightforward requests: “I need a summary of these three reports.” “Fill in the spreadsheet with the newest data from the trade press.” “Need to turn this into a fancy PowerPoint.” These requests are concrete and predictable, and we tend to handle them well.

But as we progress in our careers and start dealing with more serious matters, problems appear in increasingly complex, ambiguous forms: messy situations with unclear causes, where the problem is not clearly defined for us.

Think along these lines: “Our employees are resisting our organization’s digital transformation efforts.” “Our new product isn't selling.” “There are too many safety violations in our factory.” 

We’re also tasked with hitting goals that we have no idea how to reach. “We want to become the market leader in X.” “We want to increase our revenue to XX million.”

Managing situations like these can be taxing and may feel beyond our ability. It’s only natural, considering that most of us have not had the opportunity to learn decision-making and problem-solving tools through formal education.


Out of the frying pan, into the fire

Some 70 years ago, the influential educator Jacob Getzels1 made a key observation: The problems we are trained on in school are often quite different from the ones we encounter in real life. In school, problems tend to present themselves in a nice orderly manner, clearly defined; they’re accompanied by many similar examples, all organized at the end of the appropriate chapter.

The formal education system has not changed much since Getzels made his observation. Decision-making was considered an innate ability, something we would pick up on our own by observing the world and mimicking others (like learning to speak). It was assumed that this mental competence develops like language skills through “biological maturation, social interaction, and conventional learning.”2 As such, our education has focused on teaching specific skills: numerical competency, how to build a spreadsheet, and so on.

What we know now is that decision-making itself is actually a learned skill. It takes much more than a spreadsheet to make a good decision in real life. There is an irrational side to our thinking — one that is, thankfully, largely predictable and open to taming. 

But without formal guidance about the appropriate tools and frameworks, there is only so much we can learn from experience. In addition, in our modern society, where we have access to more information and more choices than ever before, sound decision-making has become even more vital — and more difficult. 

As such, most of us carry big gaps in how we make decisions throughout our adult lives. These gaps end up being filled by misconceptions that hold us back from excelling in the workplace and beyond.

What are some of these critical misconceptions? Below, I’ve laid out some of the most common decision-making errors we commit.

1. Ignoring similar problems

Every problem is unique in its particulars. But at the same time, once you look beyond the specifics, many problems share a “conceptual skeleton,”3 meaning they are the same type of problem. Chances are the same type of problem you face at work today is a problem that someone (in another department or industry) faced yesterday, and that someone else will face tomorrow. 

Solution: By asking your network or researching online, you may be able to discover someone else who has gone through the same process before, and you can learn from their experience. That’s what Jordan Cohen did at pharmaceutical company Pfizer, when he set up PfizerWorks to outsource tasks such as Excel analysis, PowerPoint decks, and research.4 Confronted with issues of cross-cultural communication between front-line staff, he and his team were unable to find good solutions from past experience or within the industry. Instead, their solution came from the hospitality industry: they hired people directly from international hotel chains, who were adept at communicating with both locals and people from other cultures.

2. Relying solely on past experience

When relying too heavily on a single point of information to make a decision, you fall prey to the anchoring effect. Anchoring, as its name suggests, involves giving disproportionate weight to a single piece of information (usually the first one you encountered) at the expense of subsequent, potentially more useful information. 

This is an effect of priming. When people are exposed to a given concept, it primes the related areas of the brain to remain activated, making the concept more easily accessible and more able to influence people’s behavior without their realizing it. A comment offered by a colleague, a statistic in the morning newspaper, a past trend, or last quarter’s sales figures may feature disproportionately in a sales-projection exercise. This is likely to backfire, as our world is dynamic: each situation is different, and a sample size of one is hardly strong evidence.

Solution: To overcome anchoring, it’s best to bring others into the decision-making process. Ask for their views and be mindful not to anchor them to your own ideas. Tell them as little as possible about your own estimates and tentative decisions. Seeking information and opinions from a variety of people helps to put anchoring information into perspective and challenge the wisdom of your initial approach.

3. Rushing into problem-solving mode

Our society rewards efficiency and so we strive for it. That is, we strive for maximum productivity with minimum wasted resources. In our modern world of demanding work schedules and looming deadlines, family responsibilities, and social obligations, our time and energy are precious assets. And so we do what we can to protect them. 

What’s more, we have come to associate quick with smart. Think of how teachers often commend the student who finishes first, or how television game shows make speed crucial to leaving with a decent prize.

As a result, we may rush to make a decision — any decision — instead of taking the time to fully understand the problem. We may accept the way the problem is initially presented to us, even if there may be a different framing that would be more productive. We narrow our focus and foreclose the possible solutions available to us. 

An overly narrow or incorrect focus is likely to leave us solving the wrong problem, or only partially solving the one at hand. In worst-case scenarios, it may even lead us to exacerbate the problem. We end up wasting our energy on the wrong things, “fiddling with small variations of the same useless solution, until we run out of time or money.”5 Or, as TIME magazine6 famously put it, we may end up “rearranging the deck chairs on the Titanic.”

Solution: First try to frame the problem by writing it down, including the six elements of who and what, how much, where, when, how, and why (visualized as pizza slices).7 But before diving deeper into the details, try to look outside this initial frame. Question your own beliefs and challenge long-held assumptions. Keep in mind that most problems have multiple causes and thus have multiple viable solutions. The first task in problem-solving, as Thomas Wedell-Wedellsborg aptly explains, is to find a better problem to solve.

4. Ignoring other stakeholders

Oftentimes, we fail to factor into our decisions the impact on all the people involved. This is very common in organizational change efforts. Studies show that the majority of major corporate change programs fail, often because human resource barriers (such as lack of employee involvement and motivation) are overlooked.8 In addition to backfiring, such “isolated” decisions are likely to cost us a few relationships and cause more problems in the long run.

Further, we may omit the input of relevant stakeholders even when we are called in to solve problems on their behalf (often as consultants, managers, or subject matter experts). Basking in our ability to save the day, we may forget to tap into the proprietary knowledge of the people who are caught up in the situation — the ones who know the problem best.

Solution: Taking the perspective of others is central to behavioral science methodology and human-centered approaches. In situations such as organizational change, this means making sure that it is as easy and enjoyable as possible for all stakeholders to engage with the process. In situations where we are called in to solve problems on behalf of others, a simple approach is to ask people, “What would it take?” so as to empower them to contribute and encourage their commitment to implementing the solution.

5. Ignoring behavioral biases in information processing

In an ideal world, rational people who encounter new facts that contradict their beliefs would change their views accordingly. But cognitive psychology and neuroscience studies9 have found that people form opinions based on emotions — such as fear, contempt, and anger — rather than relying on facts. New facts often do not change people’s minds.

Other behavioral biases also sway our thought processes in important ways. For instance, belief perseverance kicks in to shield us from views that may threaten our identity in some way, causing cognitive dissonance. Working in tandem with status quo bias, this may lead us to reject a new, better method that renders our process outdated. 

Another cognitive bias that can get in the way of changing your mind is confirmation bias: the tendency to seek out and give greater weight to information that agrees with your preconceived beliefs and positions, and to avoid information that contradicts them. For instance, in the case of an auditor, this tendency may mean only seeking evidence that is consistent with a supervisor’s or client’s explanation for an unusual pattern in the financial data.

A 2016 Gallup poll10 provides a great example11 of how we choose to deal with facts. In the two-week period before and after the 2016 U.S. election, both Republicans and Democrats drastically changed their opinions about the state of the economy. Nothing about the economy had changed except that Republicans won the presidential election, taking control of the White House away from the Democrats. The election outcome changed people’s interpretation of how the economy was doing: confirmation bias led Republicans to rate their economic confidence much higher now that their candidate would be in charge, while for Democrats the opposite was true.

Solution: Becoming aware of these cognitive features does not mean you can eradicate them. But you can build tests and tools12 into your decision-making process that can uncover errors in information processing before they become errors in judgment. For instance, team leaders can use tactical games such as playing devil's advocate (also known as red-teaming): getting someone on the team to argue against the decision you are contemplating, no matter what they actually believe.

6. Learning from failure — but not from success

Our society celebrates failure13 as a teachable moment. This notion has been repeated by many well-known figures, from Albert Einstein to Winston Churchill. But recent research14 has found that failure teaches us less than we think. In fact, failure can undermine learning. We learn from others’ successes, from others' failures, and from our own successes. But we don’t learn as much from our own failures. 

One key reason for this finding is that failure is ego-threatening: it makes us feel bad about ourselves, which causes us to tune out and stop paying attention. But paying attention is a prerequisite for learning, so our bruised ego ends up keeping us from learning from our failures.

Solution: Unless we are experts in the given field, to learn from failure we need to find ways to make ourselves feel less threatened by it (without blaming someone else or denying that it happened in the first place). For example, we can let some time pass between an upsetting event (such as receiving a rejection letter) and the moment we sit down to go through it and see what we did wrong. Adopting a growth mindset and practicing psychological distancing are also recommended. When delivering negative feedback to others, try to sugarcoat it a little. This isn’t just to spare their feelings; it also softens the blow to their egos and gives them a better chance to learn from the experience.

7. Going with your gut

Intuition is the work of our subconscious brain pattern matching current inputs with past experiences and making a quick assessment. Following a decades-long debate on its usefulness, subject gurus Gary Klein and Daniel Kahneman have concluded that going with one’s gut works only in “predictable situations with opportunities for learning.” Simply put, there are two key criteria to consider before making gut choices: 

  • Whether the decision-making environment is one of high validity (i.e. one where someone can learn through feedback and reliable cues that hint at the right answer); and
  • Whether the decision-maker has had adequate opportunities to practice their judgment (a minimum of 10,000 hours, as the psychologist K. Anders Ericsson famously estimated). 

A firefighter running into a burning building or a professional tennis player protecting their serve both use intuition that, given enough experience, will often lead them to excellent decisions. Both of these fields tend to be stable: the behavior of tennis balls or fire won’t suddenly change and render that experience invalid.15

Management, however, isn’t a stable field. It’s a mix of situations: some are repeat situations where experience-based intuitions are valuable, and some are new situations where intuitions are worthless. In addition, hindsight bias and lack of timely, accurate feedback hinder the learning process. As such, management is in need of multiple decision strategies beyond pure reliance on intuition. 

Solution: When dealing with management and business problems, assess whether it's a situation that calls for gut thinking or other decision strategies. Trust your gut in situations that are stable, where you have had adequate opportunities to practice your judgment while getting reliable feedback on your performance. Wherever these conditions aren’t met, look to external data and more objective indicators to guide your decisions.

8. Using ineffective brainstorming methods

Brainstorming became popular in the early 1950s with the promise of producing more ideas as a group. But it never worked as well as expected. No study has shown that group brainstorming produces more alternatives than individuals working alone for a while and then coming together to share their ideas and build on them.16 In fact, on average, individuals perform better than groups in generating answers. 

This is the result of cognitive pitfalls and social tendencies such as groupthink, fear of judgment, and offloading responsibility onto others. Extroverts tend to dominate introverts, hampering their contribution. Not to mention that sharing one idea at a time can be incredibly inefficient and time-consuming in modern business reality.

Solution: There exist various improvements17 to the traditional brainstorming method, such as the question burst method (brainstorming for questions rather than answers), brainswarming (switching from talking to writing on a structured graph), and anonymous brainstorming (submitting ideas in writing, followed by silent voting).

9. Taking customer feedback at face value

As renowned anthropologist Margaret Mead is thought to have said, “What people say, what people do, and what people say they do are entirely different things.” In a business setting, this means that asking customers directly about their decision-making is not a good idea. People can post-rationalize their behavior, meaning that they invent explanations for their own behavior after the fact — and these explanations may not be accurate. 

In addition, the impact of the environment and surrounding stimuli is crucial. Taking a person out of the environment in which they make judgments creates the risk that, however good their intentions may be, their responses won’t reflect how they will think and act when those influences are present. These are the issues with surveys we fill out from the comfort of our homes.

Moreover, these problems are compounded when the way questions are framed alters what people think and answer, inadvertently “leading the witness.” As a result, listening to customers through traditional market surveys is not nearly as reliable as observing customers’ actual behavior and designing behaviorally-informed consumer journeys.

Solution: Research tools that rely on observation, psychometric testing, or interviews conducted by professionals yield more reliable insights. In addition, behavioral design tools such as behavioral mapping and the identification of cognitive barriers and benefits, followed by live testing, can also help give customers what they actually need.

10. Making decisions under pressure

We are not always in a good state to make a decision. When we are stressed, physically or emotionally (for example, when we feel overwhelmed, hungry, angry, hot, or tired), we are more likely to make decision-making mistakes.

Our stress system is designed to prepare us to act — to run away from threatening things that might harm us, or to fight them off. It served us well on the dangerous savannah, when we dealt with all kinds of life-threatening situations, but in the face of modern problems it has become antiquated. Nowadays, it can feel life-threatening when you are not prepared for a presentation at work, when you have to deal with a dissatisfied client, when the stock market goes down, or when you have to take an important exam — even though these problems demand different solutions than fleeing or fighting. On its own, our stress system has not adapted to modern life as well as it could.

When we’re stressed, cortisol, adrenaline, and other hormones course through our system, diverting resources toward functions like cardiovascular activity and away from others, such as memory. This makes it difficult to arrive at a sound decision if you need to recall helpful data or information (which is the case with most decisions).

Stress also narrows our vision, making us focus on the negatives of a situation (the “life-threatening” part) and ignore the upside. We enter a mode of running away to save ourselves rather than thinking through the situation and trying to figure out a viable solution.18 Effectively, when we are stressed, physically or emotionally, we are simply not in a good state to decide.

Solution: We can adopt a "stress-is-enhancing" mindset19 — looking at stress as an opportunity for growth and learning, an enhancement to performance and productivity. Affect labeling can also help in recognizing the source of stress and better managing it, as can organizational processes that leave some buffer before final decision-making.

Final words

In conclusion, good decision-making is an essential life skill most people only acquire in fragments, through trial and error. Luckily, academics, behavioral scientists, and other decision-making experts (such as former poker players) are working towards uncovering and communicating how our brains work and how to make good decisions. 

Through strategies like pausing to reframe the problem or seeking feedback from outside our echo chamber, we can acknowledge the forces within our brain and find the appropriate tools to work around them. As scientists continue to decipher the workings of the brain, we will only become better equipped for making decisions and solving problems. Whatever decision models and processes you employ in your organization, keeping the above decision-making errors in mind and trying to overcome them can have a tremendous upside.

References

  1. Wilgoren, J. (2001, April 15). Jacob Getzels, 89, Educator and Researcher on Creativity. The New York Times. https://www.nytimes.com/2001/04/15/us/jacob-getzels-89-educator-and-researcher-on-creativity.html
  2. Smith, G. F. (2003). Beyond Critical Thinking and Decision Making: Teaching Business Students How to Think. Journal of Management Education, 27(1), 24–51.
  3. Douglas Hofstadter, author and cognitive scientist.
  4. Wedell-Wedellsborg, T., & Miller, P. (2009). Jordan Cohen at pfizerWorks: Building the Office of the Future. IESE Publishing & Harvard Business Review.
  5. Judkis, M. (2012, April 12). Literally, rearranging the deck chairs on the Titanic. The Washington Post. https://www.washingtonpost.com/blogs/arts-post/post/literally-rearranging-the-deck-chairs-on-the-titanic/2012/04/12/gIQAqKhbCT_blog.html
  6. Wedell-Wedellsborg, T. (2020). What's Your Problem?: To Solve Your Toughest Problems, Change the Problems You Solve. Harvard Business Review Press.
  7. Roam, D. (2009). Unfolding the Napkin: The Hands-On Method for Solving Complex Problems with Simple Pictures.
  8. Mosadeghrad, A., & Ansarian, M. (2014). Why do organisational change programmes fail? International Journal of Strategic Change Management, 5, 189. https://doi.org/10.1504/IJSCM.2014.064460
  9. Flynn, D. J., Nyhan, B., & Reifler, J. (2017). The nature and origins of misperceptions: Understanding false and unsupported beliefs about politics. Political Psychology, 38(Suppl 1), 127–150.
  10. Jones, J. M. (2016). U.S. Economic Confidence Surges After Election. Gallup. https://news.gallup.com/poll/197474/economic-confidence-surges-election.aspx
  11. Bellizzi, K. M. (2022, August 11). Cognitive biases and brain biology help explain why facts don't change minds. The Conversation. https://theconversation.com/cognitive-biases-and-brain-biology-help-explain-why-facts-dont-change-minds-186530?ref=refind
  12. Build acumen for information-laden decisions. (n.d.). Meta-decisions. Retrieved November 2, 2022, from https://www.meta-decisions.com/build-acumen-for-information-laden-decisions
  13. Gagnon, M. (2021, December 15). The success of failure. CBC. https://www.cbc.ca/radio/ideas/the-success-of-failure-1.6283745
  14. Eskreis-Winkler, L., & Fishbach, A. (2019). Not Learning From Failure – The Greatest Failure of All. Psychological Science, 30(12), 1733–1744. https://doi.org/10.1177/0956797619881133
  15. Fox, J. (2015). From "Economic Man" to Behavioral Economics. Harvard Business Review.
  16. McCaffrey, T. (2014). BrainSwarming: A new approach to finding solutions. Harvard Business Review.
  17. 4 modern brainstorming techniques. (n.d.). Meta-decisions. Retrieved November 3, 2022, from https://www.meta-decisions.com/4modern-brainstorming-techniques
  18. Crum, A. J., Akinola, M., Martin, A., & Fath, S. (2017). The role of stress mindset in shaping cognitive, emotional, and physiological responses to challenging and threatening stress. Anxiety, Stress, & Coping.
  19. Crum, A. J., Salovey, P., & Achor, S. (2013). Rethinking stress: The role of mindsets in determining the stress response. Journal of Personality and Social Psychology, 104(4), 716–733.

About the Author

Melina Moleskis

Dr. Melina Moleskis is the founder of meta-decisions, a consultancy that leverages management science and behavioral economics to help people and organizations make better decisions. Drawing on her dual background in business and academia, she works with determination to uncover pragmatic, sustainable solutions that improve performance for clients. Melina is also a visiting Professor of Technology Management, as she enjoys spending time in the classroom (she sees teaching as the best route to learning) and is always on the lookout for technology applications in behavioral science. In her prior roles, Melina served as an economic and business consultant for 7 years in various countries, gaining international experience across industries and the public sector. She holds a PhD in Managerial Decision Science from IESE Business School, an MBA in Strategy from NYU Stern, and a BSc in Mathematics and Economics from the London School of Economics.

