In a society that prides itself on grand plans and historic achievements, human beings have an equal, and arguably greater, ability to commit awe-inspiring failures. An optimist would say we are constantly improving. A pessimist would argue that we’re better at failure than we are at success.
Professor of Management and Entrepreneurship Matthew Semadeni, a Dean’s Council Distinguished Scholar, would tell you that we’re a million times better at failing than we are at succeeding, but that’s why we’re so successful.
The difference between failure and success
“Why don’t we learn from success?” Semadeni asks. “We don’t have the motivation to find out what worked and what didn’t.” As the mantra goes: If it ain’t broke, don’t fix it. The result of that, he notes, is that people never learn from their successes, something not true of failure.
When someone fails, it is easier to step back and see what went wrong and what needs to be changed.
“How can we break this pattern of not being able to learn from success?” he asks. “We have to treat success like a failure, so that we can learn from it the way we can learn from failure.”
Prediction is always fiction
Beyond the black swan circumstances that could never be anticipated, risk and uncertainty are the two most common and important drivers of failure.
The economist Frank Knight explained the difference well: Risk is like rolling dice; the toss has a known probability distribution, so you can make informed predictions. Uncertainty is rolling dice when you don’t know what numbers are on the faces; there is no way to know how they will fall.
Humans are notoriously bad at prediction because, of course, a prediction is only a prediction: There is always a chance of being wrong, whether it’s who will win a baseball game or a toss-up election.
“Prediction is always fiction” is a statement Semadeni lives by.
Prediction is often faulty because people think in binary terms: It will rain or it won’t rain, rather than there being a 60% chance of rain. Although probability is the more accurate and reliable way to interpret a situation and decide, people default to the binary, which clouds their judgment.
People make a litany of other poor choices when making predictions: Relying on personal experience and hoping for good outcomes are two big ones (both of which Las Vegas has thrived on). Another poor choice is consulting experts.
Expert advice on experts
Don’t rely on them.
Why? Because generally there is no accountability structure. Semadeni argues that most of the time, an expert is no more accurate than a person pulled off the street. He cites Philip Tetlock, one of the authors of Superforecasting: The Art and Science of Prediction, who found that a mere 5% of experts in their field are consistently correct (not even always correct, just consistently).
But that 5% do things differently. They formulate theories using data, then continue to update the theories based on new data.
“It is an iterative process used to improve their prediction,” Semadeni says. “By doing so, they consistently get better at it, rather than just trying to treat each one as an individual event.”
Success through failure
Because not all of us can be in that 5%, there are things people can do to adapt to the faults they know they have. Dan Ariely, known for his insights on predictable irrationality, has done research showing that we can account for the mistakes we know we are apt to make, even though we will never stop making them.
“You observe, you begin to describe, then you look for interrelationships, and categorize,” Semadeni says. “Then we go from there to formulate a theory about causal relationships. From that, we make our prediction, and then once we observe and collect the data, we go back. If our prediction is confirmed, we now have confirmation for our theory,” he points out. All of it comes through observing behavior and failure.
Semadeni suggests a simple exercise to put this into practice. “During the project, meet and imagine that the worst-case scenario for the project has occurred,” he begins. “Then ask the project team the hypothetical, ‘Why did it fail?’”
This gives people who see something that might go wrong an avenue to tell the group what they are thinking, surfacing risks that can be mitigated before they play out in real life. And this is precisely how people can succeed through failure.