Every leader I know has more information than they can handle. Dashboards, pipeline reviews, forecast calls, win/loss reports: it never stops. We have more tools and AI providing more analysis and new insights into this flood of information. Despite this, the decisions that matter most, those about strategy, people, customers, and execution, seem to be going in the wrong direction, constantly eroding performance.
Win rates continue to decline. Fewer people hit their goals. Too often, the wrong people get blamed. We double down on initiatives despite results showing us they are failing. We celebrate hitting the numbers while underlying performance erodes. We congratulate ourselves on customer wins even as few customers renew or believe we are creating value.
The problem is not that we are seeing too little or that we lack information. It is that we are misreading what we see. And when we act on that misreading, the fix doesn't address the real problem.
And when that fix fails, we apply another, then another. We layer one band-aid on top of another, and then more. Pretty soon, we lose sight of the original problem. It disappears under layers of failed fixes.
Each of these fixes targets the failure of the prior fix, while the underlying problem goes untouched. And each new layer makes it harder to understand what's really happening, because the real problem is buried so deeply that no one is looking for it. So it persists, and we wonder why we fail.
Too often, we attribute this mistake to ignorance or laziness. But this is wrong. The underlying reason it happens is that this is simply how human minds work under pressure. When we're overwhelmed by complexity and time pressure, our minds take shortcuts.
These shortcuts have a name: biases.
In reality, biases are efficiency mechanisms, mental shortcuts that allow us to process enormous amounts of conflicting information. Biases help us make sense of the overwhelming information we deal with every day. Sometimes, these biases serve us well enough, not perfectly, but we get by.
There are any number of biases, many of which we already know: confirmation bias, anchoring, sunk cost, self-serving bias, availability bias, attribution error, and the Dunning-Kruger effect.
Researchers have studied each of these extensively. But it is less important to understand what each of them means than to recognize the common pattern they all share.
Here's what they have in common. We form an opinion or point of view very quickly. The opinion feels right. It fits the visible evidence and is consistent with what we already believe. It provides something that feels like insight.
And however we come to that conclusion, we stop looking. We act on the initial assessment. The things that might have told us we were wrong never get examined.
For example, confirmation bias causes us to see what we want to see, not what might really be happening. Every new piece of information is evaluated by how well it reinforces our point of view.
When outcomes go the way we want, we attribute them to our leadership and astute decision-making. But when outcomes are the opposite of what we expected, we blame external factors, things outside our control: a poor performer, bad market conditions, competition, something else. This is the self-serving bias.
Dunning-Kruger is one of the more famous. It is fundamentally a problem of overconfidence: the belief that we know and understand more than we really do.
While the specific mechanisms differ, the results are the same. We reach a conclusion too quickly, hold onto it too tightly, act on it too confidently, and fail to understand the reality underlying the failures these behaviors produce.
This is complicated further because organizations are full of people doing this simultaneously. One may be acting on confirmation bias, another on sunk cost bias, and others on different biases still. With all of these at play concurrently, the distortions multiply.
One leader may be protecting a strategy, another may be committed to a particular narrative about the team, another may be reacting to the most recent problem that popped up. None of these people is being dishonest, but each is working from an incomplete understanding of what's really happening.
And none of them has identified, or is acting on, the real problem.
Recognizing this is the reality that every organization faces, what do we do about it?
Some thoughts:
Slow down and don't leap to conclusions or answers. Bias does its damage at the moment an explanation suddenly feels obvious and complete. It's at that moment of apparent clarity or insight that it's worth pausing and asking: "Am I sure this is the issue, or am I reacting to it because it's comfortable and familiar?" We tend to stop analyzing once we have reduced something to the familiar.
Ask, "What could prove this wrong?" Don't accept your conclusion blindly; think about what you would look for to determine you may have reached the wrong conclusion. Sometimes we call this "red-teaming" an idea: looking for holes or flaws. In scientific research, the key is not to find supporting evidence but to find where you could be wrong. Doing this leads to much better decisions.
When something goes wrong, look inward before looking outward. Our tendency is to look outward, assigning blame or searching for excuses. It is more useful to first ask: How did our decisions, our strategies, our execution contribute to the failure? What might we have done differently?
Widen the evidence base before acting. We tend to do the opposite, narrowing our focus to the most recent or most obvious problem. If a data point shows our pipeline is weak, rather than immediately pushing for greater volume and velocity, it is more useful to step back and ask: is this a volume problem, or is it something else? Sometimes the loudest voice in the room drives the decision, but one large unhappy customer or one competitor's move doesn't represent the market. Before shifting strategy based on that single view, look more broadly at the market.
Stay close enough to the real work to understand what it really requires. The leaders most distant from actual execution are often the most confident about why execution is failing, but that confidence fails to recognize the complexity of how the work really gets done. The more distant you are from the work, the less likely you are to understand it. For leaders, this means spending time on the front lines with the people responsible for the work. It means visiting customers to see what really drives them.
Seek other input and other points of view. We are stronger and act with greater impact when we actively ask others for their perspectives and ideas. Ask the people doing the work for their assessments and ideas. Don't rely on the same people; actively seek out new ideas and points of view. Research has shown that we reach higher-quality decisions when we involve a more diverse group in making them.
None of these requires identifying the specific bias in play. All biases limit our deep understanding of what's really happening. As I mentioned earlier in this article, biases are efficiency mechanisms, not effectiveness or impact mechanisms.
Leaders and organizations that consistently outperform others aren't the ones with the most data or experience. They are the ones that have learned to treat their own ideas and conclusions as the starting point for exploration, not the end point.
Afterword: Here is a fascinating AI-generated discussion of this post. I really love their deep dive and perspective on biases. Enjoy!
