In the past few years I have taken a keen interest in the literature of uncertainty and randomness. James Gleick's classic Chaos first sparked that interest twenty years ago, but the work of philosopher and polymath Nassim Nicholas Taleb has rekindled my curiosity in recent years. The popular books of mathematician Leonard Mlodinow, gambler and political forecaster Nate Silver, and behavioral psychologist Philip Tetlock have also proved fascinating.
Once you start to understand uncertainty and randomness, you begin to see their effects in every aspect of life. We each play a role in so many complex systems that we are often blind to the effects of random chance. But it is constantly present.
The crux of what I've learned, and how it relates to working life, is that humans are terrible at predicting how complex systems will behave. This is a huge problem, given that we constantly rely on predictions and expectations of what the future holds. It also confirms what I had long suspected: that our ability to evaluate and manage risk is fundamentally compromised.
The concept of risk management has a long history in many industries and professions. Even if it is not specifically identified and documented in risk management plans, we all spend time managing risk.
I've been involved in identifying, documenting and managing project risk for most of my career, and I've had many opportunities to watch others perform these tasks. My observations lead me to agree with the literature: we are hopeless at it. I'm not saying our efforts are totally in vain. But when I consider the wildly inaccurate probabilities assigned to identified risks, the risks we missed that seem obvious in retrospect, and the problems that weren't, and couldn't have been, anticipated, I think we need to throw out current methods and start again.
Taleb offers a new way, one that seems simple at first but will be difficult to implement. Essentially, instead of concentrating on identifying, evaluating, anticipating and mitigating specific risks, we should concentrate our efforts on making business systems "antifragile". He divides the world into three simple categories: fragile, robust and antifragile.
Fragile systems are those that are vulnerable to events almost no one anticipates, and that break when such an event arrives. Wall Street's vulnerability to bundled mortgage securities erroneously rated AAA is a prominent example.
Then there are robust systems, which successfully mitigate identifiable risks and are resilient to shocks, but remain vulnerable to Taleb's "black swan" events: events that are entirely unexpected because of misplaced confidence that patterns which have always held in the past will continue. His example is the Christmas turkey. The farmer houses and feeds the turkey from its birth. The turkey thinks, "This is great, everything is provided for me for no work. I'm on easy street."
The turkey relies on its expectation that every day will be like the one before. The day the butcher arrives is the turkey's black swan. Nothing in its previous life had prepared it for this possibility.
Antifragile systems are designed to benefit from unlikely, uncertain or unpredictable "black swan" events. An example is his investment portfolio strategy, in which he holds options that mitigate the effects of economic crises. These options are undervalued by the market because of the same biases that cause humans to massively over- or underestimate risk. Holding the options does not greatly affect the normal returns of the orthodox investments. However, if a disaster occurs, the portfolio actually benefits from the uncertainty: returns are larger.
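To make that asymmetry concrete, here is a minimal sketch in Python of a toy portfolio along these lines. The allocations, strike and premium are invented for illustration, and the payoff model ignores option pricing and time value entirely; it is a caricature of the idea, not Taleb's actual strategy.

```python
# Toy illustration (not Taleb's actual strategy): a mostly conventional
# portfolio plus a small sleeve of cheap, far out-of-the-money put options.
# Every number below is hypothetical.

def put_payoff(strike: float, price: float) -> float:
    """Payoff of a put at expiry: grows as the market falls below the strike."""
    return max(strike - price, 0.0)

def portfolio_value(market_level: float,
                    equity_weight: float = 0.98,
                    option_weight: float = 0.02,
                    strike: float = 70.0,
                    premium: float = 0.5) -> float:
    """Value at expiry of a portfolio that started at 100, after the market
    (which also started at 100) moves to market_level.

    98% of the portfolio simply tracks the market; the remaining 2% buys
    as many puts as the premium allows.
    """
    equity = 100.0 * equity_weight * (market_level / 100.0)
    contracts = (100.0 * option_weight) / premium   # puts the sleeve can afford
    options = contracts * put_payoff(strike, market_level)
    return equity + options

for level in (105.0, 100.0, 95.0, 50.0):   # ordinary moves, then a crash
    print(f"market at {level:5.1f} -> portfolio {portfolio_value(level):6.2f}")
```

In the ordinary scenarios the puts expire worthless and cost the portfolio a couple of percent. In the crash scenario their payoff more than offsets the equity loss, so the portfolio ends up ahead of where it started. That convex payoff, a small bounded cost against a large potential gain, is the shape of the "optionality" discussed next.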
Taleb identifies real-world examples of "optionality": actions that strengthen systems in a way analogous to his use of financial options. Over the coming months I intend to identify some of the ways we can apply this thinking to business projects and systems.