When the Fat Tuesday Sings

As published in the CEO Refresher

For a great many years, the majority of discussions I’ve heard about the Super Bowl focused on the ads. This year, of course, was different. Sure, there was plenty of speculation about the ads, but most of the discussion had to do with the New Orleans Saints finally qualifying. It’s not easy to have a reputation for losing that outdoes even the pre-2004 Boston Red Sox. At least Red Sox fans knew that their team had won the World Series once upon a time, albeit so long ago that the event was very nearly mythical. Indeed, the Sox qualified many times, only to snatch defeat from the very jaws of victory.

The Saints never got that far. They just lost. Until this year, when suddenly the big news was that they were playing in the Super Bowl.

Naturally, the pundits were out in force in the days leading up to the game: detailed explanations for why New Orleans couldn’t possibly win, how the Colts were simply too strong, too well prepared, too skilled a team to be beaten, and so forth. The opinions were logical, well thought-out, and seemed to make perfect sense.

The reality, however, was something just a tiny bit different. On the Sunday before Mardi Gras, the Saints won the Super Bowl.

How could so many experts have been so wrong? Frankly, outside of people who are extremely serious about football or people who bet large sums of money on the Colts, probably no one actually cares. In a business environment, however, having the experts be dramatically wrong can be expensive for more than just a few people. It can harm not just the people who made the mistake, but the rest of the organization as well. So perhaps the real question is what can be done to improve decision-making accuracy and expert predictions within an organization?

The fact is, all those experts who were predicting victory for the Colts were relying on, well, expert opinion and “previous experience.” In this case, their “previous experience” with the Saints was that the Saints were not a particularly good team. The Colts, on the other hand, were well known to be a strong team. The pundits thus made the mistake of comparing the Colts of today to the Saints of yesterday. What they missed was that something had changed. The very fact of the Saints making it to the Super Bowl was a signal that something was different this time around: either everyone else was playing a lot worse, or the Saints were playing a lot better.

In a business, the tendency is to apply expert opinion and previous experience to many situations. When the business is facing a difficult or intractable problem, potential solutions are often evaluated according to opinions about how each one should work out, based on its perceived similarity to some other situation. If the previous situation and the current situation are sufficiently similar, then you can make some reasonable predictions based on the past; indeed, the past is generally one of the most powerful tools available for predicting the future. The ability of an expert to correctly recognize points of similarity and draw valid conclusions from them is a very valuable one.

A break in similarity, however, is a clue that something major may have changed. It is a clue that the previous situation, and therefore the opinions and judgments based on it, may no longer apply. When that happens, it’s critical to recognize the change and be willing to set aside our expert judgments in favor of a slower, more careful evaluation.

Of course, even if the pundits had recognized that the situation was too different to make a meaningful prediction, there wouldn’t have been much they could have done: at some point, only actually doing the experiment, that is, playing the game, will give you an answer. In football, or most other sports, that’s part of the fun: if we always knew in advance who would win, it would be awfully boring.

In a business, though, boring can be good. So what do you do when you’re evaluating a potential solution to a problem?

It helps to look at the points of similarity between your situation and the situations you view as comparable. What is the same? What is different? Do those differences represent a fundamental incongruity between the two situations? Or perhaps you can only see a small piece of the other situation. This is not all that unusual when one business looks at how another business is solving a problem: I worked with one small software company that decided to adopt the Microsoft Way, whatever that was. It didn’t matter, though: they were going to price like Microsoft, develop like Microsoft, act like Microsoft. Unfortunately, they weren’t Microsoft, and it didn’t work for them. It may have worked for Microsoft, but Microsoft had resources that this company did not. Pointing out that Microsoft hadn’t done things that way when it was small gained no traction.

In cases like this, it can help to study other companies that look like your company to see how they are addressing similar problems. The greater the similarity, the more likely you are to get valuable information. Sometimes the present, rather than the past, is the best predictor of the future!

Sometimes, of course, none of the above will do: personal experience, expert opinion, even a study of similar situations and companies may not provide enough valid data to evaluate your solution in the present. In that case, you might have to actually play the game: you need to figure out how you’ll know whether your solution is succeeding in both the short term and the long term. You need to know not just where you want to go, but also how you’ll know whether you’re on track to get there.

In the short run, this is the most difficult approach. It involves taking some risks. It may also yield the biggest return.

Or you can settle for predicting the results of the game.