Win/Loss reviews are critical to continuous improvement. Do you conduct them?
Amazingly, for as much effort as we put into winning or losing a deal, too many organizations are very casual about analyzing actual performance and outcomes. I seldom see win reviews conducted. Sometimes I see loss reviews conducted, but most often it’s not a loss review at all, it’s a reason code in a CRM system, and most of the time that code is price or a product reason.
Sometimes I see managers conducting loss reviews, less to learn from what happened than to beat up a salesperson: “How could you screw up so badly?”
When we do conduct win/loss reviews, too often we get only part of the picture. We may talk to the sales team only, never going to the customer and asking for their feedback. Or we look at things on a deal-by-deal basis but don’t step back to look at the picture more broadly, trying to detect patterns in why we win or lose. “Are we chasing the right customers?” “Are we vulnerable in certain situations or against certain competitors?” “Are we good at small deals but unable to compete well in large deals?” The list can go on.
But we can’t improve performance, we can’t identify or fix problems, we can’t leverage certain classes of opportunities unless we have a complete picture of why we win and lose.
We need to understand at a deal level what happened and why. We need to interview not only salespeople, but partners and customers. In those reviews we have to be open to hearing what people are telling us, not listening with an agenda. Even if they are telling us bad stuff (“your products suck, your service sucks, we don’t like you”), it’s the only way we can learn and identify issues.
We need to look at a complete analysis, trying to assess patterns. How many deals did we compete in? What is our win/loss rate, and how has it changed over time? What’s our win/loss rate for large deals? Mid-sized deals? Small deals? For certain product categories or mixes, against certain competitors, in certain segments? Get very granular in the analysis, for example: for deals between $1-5M, what’s our win rate, what’s the average size of a win, what’s the average cycle time? Likewise for losses. Likewise for deals of $5-10M, and so on.
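If your deal data lives in a spreadsheet or CRM export, that kind of bucketed analysis is straightforward to sketch. The fragment below is a minimal illustration, assuming each deal record carries an outcome, an amount, and a cycle time; the field names and sample figures are hypothetical, not a real CRM schema.

```python
# Minimal sketch of bucketed win/loss analysis.
# Field names (outcome, amount, cycle_days) and the sample deals
# are illustrative assumptions, not an actual CRM export.
from statistics import mean

# Hypothetical deal records: outcome, deal size in $, sales cycle in days.
deals = [
    {"outcome": "win",  "amount": 1_200_000, "cycle_days": 90},
    {"outcome": "loss", "amount": 4_500_000, "cycle_days": 200},
    {"outcome": "win",  "amount": 2_000_000, "cycle_days": 110},
    {"outcome": "loss", "amount": 6_000_000, "cycle_days": 260},
    {"outcome": "win",  "amount": 7_500_000, "cycle_days": 150},
]

# Deal-size bands like the ones described above ($1-5M, $5-10M).
buckets = [(1_000_000, 5_000_000), (5_000_000, 10_000_000)]

def analyze(deals, lo, hi):
    """Win rate, average win/loss size, and average cycle time for one band."""
    in_bucket = [d for d in deals if lo <= d["amount"] < hi]
    if not in_bucket:
        return None
    wins = [d for d in in_bucket if d["outcome"] == "win"]
    losses = [d for d in in_bucket if d["outcome"] == "loss"]
    return {
        "deals": len(in_bucket),
        "win_rate": len(wins) / len(in_bucket),
        "avg_win_size": mean(d["amount"] for d in wins) if wins else 0,
        "avg_loss_size": mean(d["amount"] for d in losses) if losses else 0,
        "avg_win_cycle": mean(d["cycle_days"] for d in wins) if wins else 0,
        "avg_loss_cycle": mean(d["cycle_days"] for d in losses) if losses else 0,
    }

for lo, hi in buckets:
    print(f"${lo / 1e6:.0f}-{hi / 1e6:.0f}M:", analyze(deals, lo, hi))
```

Comparing average win size to average loss size, and win cycle time to loss cycle time, within each band is what surfaces the patterns that a single headline win ratio hides.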
Recently, I did an analysis of several thousand deals an organization had competed in. Their win ratio for their largest category of deals was OK. As a consultant, I’m supposed to always say you can do better, but it was OK. But when I looked deeper, the average win in this largest category was roughly half the size of the average loss in the same category. And when I looked at sales cycle time, it took them twice as long to lose a large deal as to win one.
Where they thought they were reasonably good at winning large deals, based on win ratio alone, they were actually very good at winning the smallest of the large deals and extremely bad at the biggest ones. We went on to discover a number of other things about those types of deals.
It was only after understanding the complete picture that we were able to put our finger on the problems they had, then start to fix them.
So are you doing win/loss analysis? Are you really doing it, not only understanding things deal by deal, but seeing the patterns of where you are strong and where you are weak? Are you really working to understand why, not just from your own opinion, but from the customer’s?
Win/loss reviews and analysis are very powerful. Done correctly, they can correct terrible misperceptions we may have about our business, and they will always surface ways to improve performance.