Why Smart People Make Bad Decisions: The Psychology of Bias in Leadership
Exploring How Leaders Miss the Crucial Step of Examining Their Underlying Assumptions
Hosts: Kevin Novak and Elizabeth Stewart
Duration: 24 minutes
Available: October 23, 2025
🎙️Season 1, Episode 3
Episodes are available in both video and audio formats across all major podcast platforms, including Spotify, YouTube, Pandora, Apple Podcasts, and via RSS, among others.
Transcript Available Below
Episode Overview
Smart leaders. Dangerous blind spots. Costly transformation failures. In this episode of The Human Factor Podcast, Kevin Novak and Elizabeth Stewart reveal why intelligence actually creates predictable decision traps that sabotage even the most promising organizational changes.
Drawing from real transformation failures costing millions, they expose three specific cognitive biases that turn your brightest people into your biggest obstacles: addition bias (why leaders complicate rather than simplify), expertise tunnel vision (when deep knowledge becomes dangerous), and certainty addiction (the fear that drives fake confidence).
This isn’t another change management discussion; it’s a psychological deep dive into why 70% of transformations fail and the practical framework that can flip those odds in your favor.
Key Takeaways
Intelligence without Psychological Awareness Creates Transformation Failure, not Success
Addition Bias Costs Organizations Millions in Unnecessary Complexity Annually
Smart Leaders Who Embrace “not knowing” Outperform Those Who Project Certainty
Season 1, Episode 3 Transcript
Available October 16, 2025
I’m Kevin Novak, CEO of 2040 Digital, professor at the University of Maryland, author of the book The Truth About Transformation, and author of the Ideas and Innovations weekly newsletter. Welcome to The Human Factor Podcast, the show that explores the intersection of humanity, technology, and transformation, and the psychology behind transformation success. I’m joined today by my consulting partner, Elizabeth Stewart, who co-hosts this episode with me.
Elizabeth, welcome to what I hope will be the beginning of many fascinating conversations about the human side of organizational change.
Thanks, Kevin. I’ve been looking forward to this conversation because it gets to the heart of something we observe constantly through our research and newsletter: brilliant leaders making predictably poor decisions. You know, when we started documenting these patterns four years ago, I thought we’d see random failures across different types of leaders. But what emerged was much more systematic. The same cognitive traps keep appearing regardless of industry, education level, or even experience. It’s almost like intelligence creates its own blind spots.
That’s such an important observation, Elizabeth. Over the past four years, through our Ideas and Innovations newsletter, which reaches over 5,000 leaders, we’ve documented exactly what you’re describing. Intelligence doesn’t prevent bad decision-making; sometimes it actually amplifies the mistakes. Today, we’re diving deep into the psychology of bias in leadership: why smart people consistently fall into the same cognitive traps, and more importantly, how to recognize and overcome them.
But first, let me set the stage with a story that perfectly illustrates today’s theme. Elizabeth, let me start with something that might surprise our listeners. We keep seeing this pattern across industries: brilliant technical teams with PhD-level intellects, decades of experience, and access to every piece of data imaginable. Yet their digital transformation projects consistently struggle. Three years in, millions invested, and organizations often find themselves less efficient than when they started.
The fascinating part is that their intelligence is actually working against them. They can see every potential problem, anticipate every failure mode, and predict every risk. But that comprehensive awareness becomes paralyzing. And what I find particularly interesting is how this connects to the research we’ve been following on reactive versus reflective decision making. Smart people often react quickly because they trust their intelligence, but they skip the crucial step of examining their underlying assumptions. I mean, it’s like having a high-performance sports car but driving it with the emergency brake on. The intelligence is there. It’s powerful, but it’s being constrained by psychological mechanisms they don’t even recognize.
That’s the first bias we need to discuss, what psychologists call confirmation bias, but I prefer to call it assumption addiction. Smart people get attached to their initial analysis and then unconsciously seek information that confirms what they already believe.
Can you give our listeners a specific example of how this plays out in leadership?
Absolutely. Remember that engineering team we worked with? Their initial assessment was that their current system was too complex and they needed to simplify it. Smart analysis, right? But they became so attached to the simplification that they ignored data showing that users actually needed more functionality, not less. They spent two years building a simpler system that nobody wanted to use because it couldn’t do what people needed it to do. Their intelligence led them to an elegant solution to the wrong problem. Elizabeth, based on our work with organizations and the patterns that we’ve documented in our newsletter, I want to walk through the three most dangerous cognitive traps that intelligent leaders fall into.
And these are the ones that keep showing up regardless of industry, company size, or even leadership experience.
The first trap is what we call addition bias, the tendency to solve problems by adding more rather than taking away. We devoted an entire newsletter issue to this because it is so pervasive.
Kevin, tell them about the marketing director’s story.
Perfect example from our newsletter research. Think about a marketing director struggling with declining audience engagement. Smart person, experienced, data-driven. But instead of examining what wasn’t working and removing it, the default response is to add more tactics: more social media platforms, more content types, more marketing automation, and more analytical tools.
And what typically happens, Kevin?
Teams become overwhelmed, messages get diluted, and engagement drops even further. The solution isn’t more; it’s actually less, but better. Yet intelligent, well-intentioned brains default to addition because addition feels like taking action. The second trap is expertise tunnel vision. The more expert you become in your domain, the more you see every problem through that specific lens.
This is huge in transformation projects. The IT leader sees every challenge as a technology problem. The HR leader sees everything as a people problem. And the finance leader sees everything as a numbers or resource problem.
And they’re all partially right, which makes it even more insidious. But transformation isn’t an IT problem or an HR problem. It’s a human psychology problem that manifests differently across all those domains.
Which brings us to the third trap, and this one might be the most dangerous because it feels so rational. I call it the certainty addiction. And we’ve written extensively about this in our newsletter because it’s so fundamental to human psychology. Smart people hate uncertainty. We’re biologically wired to want clear, definitive answers. Think about it. Most intelligent people become successful by being right more often than they were wrong.
They learned to trust their analytical abilities because those abilities consistently delivered good results and good outcomes. But here’s the problem, Kevin. Transformation, by definition, involves moving from the known to the unknown. You can’t have certainty when you’re changing fundamental aspects of how an organization works or operates. The very nature of transformation is that you’re entering uncharted territory.
So what happens when certainty-addicted leaders face transformation challenges?
Well, they create the illusion of certainty through over-analysis, detailed project plans, and exhaustive risk assessments. They convince themselves they’ve eliminated uncertainty when they’ve really just hidden it behind spreadsheets and PowerPoint presentations. And so, when reality doesn’t match their detailed plans, which it never does in transformation, they blame execution rather than examining whether their need for certainty prevented them from building in the flexibility that they actually needed. I’ve seen organizations spend 18 months planning a six-month transformation project. By the time they’re ready to execute, Kevin, the market has changed, customer needs have evolved, and the original problem they were solving no longer exists or has been resolved. Let’s talk about why these patterns exist. It’s not like smart people are intentionally sabotaging themselves.
Their biases actually serve important functions in normal circumstances. What we’re dealing with is essentially a mismatch between our psychological operating system and the demands of transformation. Think about your computer’s operating system and how it handles routine tasks automatically so you don’t have to think about them. Your brain does the same thing with decision-making. Most of our choices are made by unconscious psychological processes that have worked very well for us in the past. Confirmation bias, for instance, helps us make quick decisions with limited information. Addition bias helps us accumulate resources and capabilities when we’re in growth mode. And expertise tunnel vision allows us to become genuinely excellent at mastering our specific domains.
But transformation challenges our psychological defaults in ways that make these normally helpful patterns incredibly destructive.
Exactly. Transformation isn’t a routine task. It’s dynamic by nature and by definition. It requires overriding our psychological defaults, and that takes enormous conscious effort. Smart people often resist this because they trust their intellectual instincts, not realizing that those instincts are calibrated for stable, predictable environments. And here’s what’s really interesting from our research: fear isn’t just about being scared.
It’s about your brain trying to protect you from uncertainty and potential failure, because nobody really wants to fail. When smart people encounter a transformation challenge that doesn’t fit their mental models, fear kicks in at an unconscious level.
And how does that manifest in leadership behavior?
Three predictable ways. They either overanalyze to create the illusion of control, they default to what’s worked before even when the context has changed, or they delegate the people side to others while focusing on the technical aspects they feel confident about. All of which avoid the core challenge. Successful transformation requires leaders to operate outside their comfort zone and outside their expertise. It’s not just intellectually demanding; it’s psychologically threatening to their identity as competent, analytical, successful decision makers.
Elizabeth, let’s walk through a pattern we’ve observed repeatedly across different industries and it illustrates these biases compounding each other. Healthcare systems are a perfect example. Brilliant doctors, exceptional administrators, people who literally save lives every day but they often struggle with digital transformation initiatives that can drag on for years with minimal progress.
Just think about the psychological profile of a typical healthcare leader. Often a former surgeon or physician, incredibly smart, methodical, used to having complete information before making life-or-death decisions. All great qualities in medicine.
Yeah, but transformation isn’t surgery.
Right. In surgery, you can see the problem. You have established procedures and success is measurable and immediate. Transformation is very messy. You’re changing systems, you’re changing culture and human behavior all at the same time.
So what typically happens then?
Leaders approach transformation like they would approach surgery. They want complete diagnostic information before starting detailed step-by-step procedures for every change and measurable outcomes at each stage.
Well that sounds reasonable.
It does, but here’s where the biases kick in. First, addition bias, instead of simplifying the patient experience, there’s a tendency to add new digital touch points to address every possible patient need.
So patients go from having one way to interact with the system to having six or seven different apps and portals.
Exactly. Then expertise tunnel vision, the focus becomes medical efficiency rather than human psychology. How do we get patients to follow medical recommendations? How do we reduce appointment no-shows? How do we improve medication compliance?
Those are symptoms, not root causes.
Right, the real issue is often that patients feel disconnected from their healthcare providers. They need to feel heard, understood, and supported through their health journey. But when your expertise is in medical intervention and not human psychology, you keep trying to solve connection problems with efficiency solutions. And that’s the real killer: certainty addiction creates a demand for more data.
More pilot programs, more proofs of concept before rolling out changes. But patient psychology doesn’t work like medical diagnosis. You can’t A/B test your way to trust and connection.
So what would the breakthrough look like?
Starting with what we call psychological first principles, beginning with the human psychology of the patient experience, rather than the operational challenge of healthcare systems. Instead of asking how do we make our systems more efficient, you start asking what makes patients feel cared for and confident in their healthcare journey. It’s a completely different question, completely different solutions.
And that means…
Operational efficiencies versus successful outcomes.
Kevin, our listeners are probably thinking, this is fascinating, but what do I actually do about it? So let’s get practical.
It’s a great question. Based on our research and client work, we’ve developed what we call the human factor decision framework. It’s four questions that help interrupt automatic bias patterns.
Let’s walk through them, please.
Question one: what am I not seeing? This directly counters confirmation bias. Before making any significant decision, force yourself to actively seek disconfirming evidence.
Can you give us an example?
Sure. Let’s say you believe your team needs better project management software. Instead of researching which software to buy, first ask yourself what evidence exists that software isn’t the real problem? Maybe the issue is unclear priorities or communication problems or resource constraints.
So you’re interrupting the assumption that you’ve correctly identified the problem.
Exactly. Question two: what would success look like from the other person’s perspective? This counters expertise tunnel vision by forcing you to see the situation through a different lens.
And this is huge for transformation projects because success looks different to IT, operations, finance, frontline employees and even the customers.
Right, and if you’re implementing new technology, don’t just think about technical success metrics. What would make the frontline employee feel successful? What would make the customer feel successful? Often these are completely different things. Next: what would I need to believe for this to fail? This is about surfacing hidden assumptions. Instead of focusing on why your approach will work, identify what would have to be true for it to fail.
That’s question three.
That’s counter-intuitive, Kevin
It is, but it’s powerful. If you believe your communication strategy will work, ask yourself, what would employees have to think or feel for this communication to be ineffective? Maybe they have to distrust leadership or feel overwhelmed or believe the change threatens their job security.
So then you can proactively address the psychological factor.
Exactly. Question four: how might I be wrong in a way I can’t see? This is the hardest one because it requires intellectual humility. Smart people are usually right about most things, which makes us overconfident about our analytical abilities. But transformation involves complex human systems where being logically correct doesn’t guarantee practical success.
And this is where implementation becomes crucial. How does a leader actually build these questions into their decision-making process? I recommend, Kevin, what we call bias audits. Before any major decision, literally schedule time to work through these four questions with people who aren’t involved in the initial analysis. Think of it like a pre-flight safety check. You’re not questioning the pilot’s competence. You’re acknowledging that complex systems require systematic verification.
Like an external perspective?
Internal is fine, but they need to have permission to challenge your thinking. And here’s the key, you have to reward them for finding problems with your logic, not punish them. Most organizations, as you and I know, inadvertently punish people for pointing out flaws in leadership thinking. Someone raises a concern about a new initiative or a new plan and they get labeled as not a team player or resistant to change.
But if you want to avoid bias traps, you need to create psychological safety nets for people to point out what you’re not seeing. I suggest making this explicit. Tell your team, I’m going to present this plan and I want you to try to poke holes in it. The person who finds the biggest flaw gets recognition, not criticism. You’re essentially gamifying the process of challenging assumptions.
That requires changing organizational culture around decision making.
Absolutely. It requires leaders to model intellectual humility as you mentioned and identified before. When someone points out a flaw in your thinking, your response needs to be, thanks for helping me see that, rather than defending your original position. This isn’t about becoming indecisive or appearing weak to your team. It’s about becoming more comprehensively decisive by incorporating perspectives you might have missed.
Kevin, let’s talk about the deeper mindset shift that’s required here. This isn’t just about techniques; it’s about changing how leaders think about intelligence and decision making. The biggest barrier we encounter is what I call intelligence arrogance: the unconscious belief that being smart means you can figure out complex human systems through analysis alone. It isn’t malicious or bad; it’s actually the natural result of years of success using analytical thinking.
But human systems, as you and I know, aren’t engineering problems. In engineering, if you understand all the variables and their relationships, you can predict outcomes. But humans aren’t machines because we have emotions and motivations, fears, and even aspirations that don’t follow logical rules.
So one needs a mindset shift from I need to figure this out to we need to figure this out together. So transformation isn’t a problem to be solved by the smartest person in the room. It’s really a collective journey and it requires everyone’s psychological understanding.
Yes, and this connects to something we’ve written extensively about: the difference between power and influence. Power is your ability to make people do things. Influence is your ability to make people want to do things. Transformation requires influence, not power, because you’re asking people to change fundamental aspects of how they work. And you can’t influence people through analysis and logic alone.
You have to understand their psychological experience of the change and address their fears, their concerns, and their aspirations. It really requires intellectual humility.
So the most successful transformation leaders we observe have learned to say, I don’t know more often, not less often. They’re comfortable with uncertainty and curious about perspectives different from their own.
And this does not make them less effective leaders. It actually makes them more effective because they’re gathering better information and building more sustainable solutions. Think of some examples we’ve encountered with our clients.
One example I can think of is a common scenario we see: a technology leader implementing AI tools across the organization. The typical approach is very analytical: ROI calculations, efficiency metrics, technical specifications. But then the rollout meets massive resistance. Instead of pushing harder with more data and logic, what if the leader shifted to curiosity and started asking employees, what would make this AI tool feel helpful rather than threatening to you? Often the resistance isn’t about the technology. It’s about the fear of becoming irrelevant. People worry the AI will eliminate their jobs or make their expertise obsolete. So instead of dismissing those fears as irrational, the leader could redesign the implementation to position AI as augmenting human capabilities rather than replacing them. Same technology, completely different psychological frame.
When people feel like partners in innovation, rather than the victims of automation, everything changes. Adoption rates, creative suggestions for new applications, even the overall organizational culture, which is key to behavior around technology.
Let’s wrap up with the key takeaways for our listeners. First, recognize that intelligence alone isn’t enough for successful transformation. In fact, traditional analytical thinking can sometimes work against you when dealing with complex human systems. Your expertise in your domain doesn’t automatically translate to expertise in human psychology.
Second, be aware of the three major cognitive traps: addition bias, expertise tunnel vision, and certainty addiction. These aren’t character flaws. They’re normal psychological patterns that become problematic in transformation contexts.
Third, use a human factor decision framework before making any major decisions. What am I not seeing? What would success look like from the other person’s perspective? What would I need to believe for this to fail? And how might I be wrong in a way I cannot see? These questions force you to step outside your analytical comfort zone and consider the human psychology aspects of your decisions.
And fourth, make the mindset shift from individual problem solving to collective journey facilitation. Transformation is about human psychology, not just technical solutions. You need to become comfortable with saying, I don’t know, and genuinely curious about perspectives different from your own.
And for our listeners who want to dive deeper, we’ve created a transformation readiness assessment based on the psychological principles we discussed today. You can find it at transformationassessment.com.
One word: transformationassessment.com. On next week’s episode, we’re investigating why access to unlimited information is creating decision paralysis, how data noise is killing critical thinking, and what we can do to maintain human judgment in our hyper-quantified world.
Until then, I encourage our listeners to try the bias audit technique we discussed. Pick one important decision you’re facing and work through those four questions with someone who can challenge your thinking. If this episode was helpful, please subscribe to The Human Factor Podcast and leave a rating. And if you’re dealing with bias in leadership challenges, share this episode with your team. These insights work better when everyone understands the psychology at play.
For more resources on bias in leadership, visit humanfactormethod.com, again, humanfactormethod.com, all one word, and subscribe to my Ideas and Innovations newsletter on Substack.
Thank you for joining us for this episode of Human Factor. I’m Elizabeth Stewart.
And I’m Kevin Novak. Remember, transformation isn’t about technology. It’s about people.
We’ll see you next time. Thank you.
Available Everywhere
The Human Factor Podcast is available on all major platforms
Apple Podcasts
Spotify
Google Music
Amazon Music
YouTube
Pandora
iHeartRadio
RSS Feed
Or wherever you get your podcasts
New episodes every Thursday
Upcoming Episodes
Upcoming: Available October 30, 2025
Episode 004: Data Noise and Decision Paralysis: When Too Much Information Kills Critical Thinking
Organizations are drowning in data but starving for insight. Examine the psychological mechanisms behind information paralysis and why access to more data often leads to worse decisions.
