In the past few years I have taken a keen interest in the literature of uncertainty and randomness. James Gleick's classic Chaos sparked that interest 20 years ago, but the work of philosopher and polymath Nassim Nicholas Taleb has rekindled my curiosity in recent years. The popular books of mathematician Leonard Mlodinow, gambler and political forecaster Nate Silver and behavioral psychologist Philip Tetlock have also proved fascinating.
Once you start to understand uncertainty and randomness, you begin to see their effects everywhere, in every aspect of life. We each play a role in so many complex systems that we are often blind to the effects of random chance. But it is constantly present.
The crux of what I've learned, and how it relates to working life, is that humans are terrible at predicting how complex systems will behave in the future. This is a huge problem, given how heavily we rely on predictions and expectations of what the future holds. It also confirms what I had long suspected: our ability to evaluate and manage risk is fundamentally compromised.
The concept of risk management has a long history in many industries and professions. Even if it is not specifically identified and documented in risk management plans, we all spend time managing risk.
I've been involved in identifying, documenting and managing project risk for most of my career, and I've had plenty of opportunities to watch others do the same. My observation and reflection lead me to agree with the literature: we are hopeless at it. I'm not saying our efforts are totally in vain. But when I consider the wildly inaccurate probabilities assigned to identified risks, the risks we missed that seem obvious in retrospect, and the problems that weren't and couldn't be anticipated, I think we need to throw out the current methods and start again.
Taleb offers a new way, one that seems simple at first but will be difficult to implement. Essentially, instead of concentrating on identifying, evaluating, anticipating and mitigating specific risks, we should concentrate our efforts on making business systems "antifragile". He divides the world into three simple categories: fragile, robust and antifragile.
Fragile systems are those vulnerable to events that almost no one anticipates. Wall Street's vulnerability to bundled mortgage securities erroneously rated AAA is an important example.
Then there are robust systems, which successfully mitigate identifiable risks and are resilient to shocks, but remain vulnerable to Taleb's "black swan" events: events that go entirely unanticipated because of our confidence that patterns and behavior which have always held in the past will continue to hold. His example is the Christmas turkey. The farmer houses and feeds the turkey from its birth. The turkey thinks, "This is great, everything is provided for me for no work. I'm on easy street."
The turkey relies on its expectation that every day will be like the one before. The day the butcher arrives is the turkey's black swan. Nothing in its previous life had prepared it for this possibility.
Antifragile systems are designed to benefit from unlikely, uncertain or unpredictable "black swan" events. An example is his investment portfolio strategy, in which he holds options that mitigate the effects of economic crises. These options are undervalued by the market because of the same factors that cause humans to massively over- or underestimate risk. Holding the options does not greatly affect the portfolio's returns in normal times, but if a disaster occurs the portfolio actually benefits from the uncertainty: returns are larger.
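To make the asymmetry concrete, here is a toy sketch with invented numbers. It is my own illustration of the idea, not Taleb's actual strategy: keep most of the portfolio in conventional assets and spend a small slice each year on cheap, far out-of-the-money options that expire worthless in normal years but pay off many times over in a crash.

```python
# Toy "barbell" style portfolio: all figures are invented for illustration only.
def year_end_value(start, crash=False,
                   option_weight=0.05,   # fraction spent on cheap tail-risk options
                   normal_return=0.07,   # conventional assets in a normal year
                   crash_return=-0.40,   # conventional assets in a crash year
                   option_payoff=15.0):  # crash payoff per dollar of option premium
    core = start * (1 - option_weight)
    premium = start * option_weight
    if crash:
        return core * (1 + crash_return) + premium * option_payoff
    return core * (1 + normal_return)  # the options simply expire worthless

print(year_end_value(100.0))              # ~101.65: a small drag in a normal year
print(year_end_value(100.0, crash=True))  # ~132: the crash year is where the gains come from
```

The exact numbers don't matter; the point is the shape of the payoff. The cost of the insurance is small and bounded, while the benefit in a black swan year is large.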
Taleb identifies real-world examples of "optionality": actions that strengthen systems in a way analogous to his use of financial derivative options. Over the coming months I intend to identify some of the ways we can apply this thinking to business projects and systems.
21 July 2014
08 July 2014
Plain English please (First part of many)
Using buzz-words appropriated from other domains incorrectly is about the worst thing that a manager or public figure can do. This is guaranteed to make smart people think you are stupid. Two examples I often hear are the word “quantum” and the phrase “sovereign risk”.
“Quantum” does not mean “total”. A quantum is the most basic unit a thing consists of; the plural is “quanta”. Outside physics, its most common use is in law, applied to compensation amounts derived from the application of several different legal principles. In that context, the word refers to some or all of those individual amounts, not to their total. This is an important distinction.
When a CEO stands up at an AGM and starts their address with, “We confidently forecast the quantum of revenue next year will be 20% higher than this year”, without any context of the units that make up the revenue, that man or woman is using a big word to try and impress people. And failing dismally.
“Sovereign risk” is appropriated from the finance industry, where it means the probability of a state defaulting on the interest payments on its treasury bonds. The phrase entered common parlance during the 2011 European debt crisis, when Greece and a few other nations came close to default. How many Australian pundits, politicians and executives now use the phrase to mean any uncertainty in proposed government action, or simply to criticise a policy they don't like? I've heard lots of 'em on ABC's “The Business” alone. And, consciously or not, the purpose is to allude to the chaos in the streets of Athens we all witnessed on TV.
How does this sound to people for whom the phrase still has its specific finance-domain meaning? I think it's likely to lower confidence in Australia's economic stability. That can't be a good thing.
When you use technical words incorrectly and appropriate phrases out of context, you can do a lot of damage to your own reputation and produce all sorts of unintended and unpredictable effects. And smart people will think you're an idiot.
Labels: Australian politics, business, language, plain English, wrongheadedness