Black swan theory
The black swan theory or theory of black swan events is a metaphor that describes an event that comes as a surprise, has a major effect, and is often inappropriately rationalized after the fact with the benefit of hindsight.
The theory was developed by Nassim Nicholas Taleb to explain:
- The disproportionate role of high-profile, hard-to-predict, and rare events that are beyond the realm of normal expectations in history, science, finance, and technology.
- The non-computability of the probability of the consequential rare events using scientific methods (owing to the very nature of small probabilities).
- The psychological biases that blind people, individually and collectively, to uncertainty and to a rare event's massive role in historical affairs.
Unlike the earlier philosophical "black swan problem", the "black swan theory" refers only to unexpected events of large magnitude and consequence and their dominant role in history. Such events, considered extreme outliers, collectively play vastly larger roles than regular occurrences.[1] More technically, in the scientific monograph Silent Risk, Taleb mathematically defines the black swan problem as "stemming from the use of degenerate metaprobability".[2]
Background
Black swan events were introduced by Nassim Nicholas Taleb in his 2001 book Fooled by Randomness, which concerned financial events. His 2007 book The Black Swan extended the metaphor to events outside of financial markets. Taleb regards almost all major scientific discoveries, historical events, and artistic accomplishments as "black swans"—undirected and unpredicted. He gives the rise of the Internet, the personal computer, World War I, the dissolution of the Soviet Union, and the September 2001 attacks as examples of black swan events.[3]
The phrase "black swan" derives from a Latin expression; its oldest known occurrence is the poet Juvenal's characterization of something being "rara avis in terris nigroque simillima cygno" ("a rare bird in the lands and very much like a black swan"; 6.165).[4] When the phrase was coined, the black swan was presumed not to exist. The importance of the metaphor lies in its analogy to the fragility of any system of thought. A set of conclusions is potentially undone once any of its fundamental postulates is disproved. In this case, the observation of a single black swan would be the undoing of the logic of any system of thought, as well as any reasoning that followed from that underlying logic.
Juvenal's phrase was a common expression in 16th century London as a statement of impossibility. The London expression derives from the Old World presumption that all swans must be white because all historical records of swans reported that they had white feathers.[5] In that context, a black swan was impossible or at least nonexistent. After Dutch explorer Willem de Vlamingh discovered black swans in Western Australia in 1697,[6] the term metamorphosed to connote that a perceived impossibility might later be disproven. Taleb notes that in the 19th century John Stuart Mill used the black swan logical fallacy as a new term to identify falsification.[7]
Taleb asserts:[8]
What we call here a Black Swan (and capitalize it) is an event with the following three attributes. First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme 'impact'. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.
I stop and summarize the triplet: rarity, extreme 'impact', and retrospective (though not prospective) predictability. A small number of Black Swans explains almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.
Identifying a black swan event
Based on the author's criteria:
- The event is a surprise (to the observer).
- The event has a major effect.
- After the first recorded instance of the event, it is rationalized by hindsight, as if it could have been expected; that is, the relevant data were available but unaccounted for in risk mitigation programs. The same is true for the personal perception by individuals.
Coping with black swan events
The main idea in Taleb's book is not to attempt to predict black swan events, but to build robustness against negative ones that occur and be able to exploit positive ones. Taleb contends that banks and trading firms are very vulnerable to hazardous black swan events and are exposed to unpredictable losses. On the subject of business in particular, Taleb is highly critical of the widespread use of the normal distribution model as the basis for calculating risk. For example, a paper produced by academics from Oxford University and based on data from 1,471 IT projects showed that although the average cost overrun was only 27%, one in six of the projects had a cost overrun of 200% and a schedule overrun of almost 70%.[9]
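To see why an average figure like 27% can be misleading on its own, consider the following simulation sketch. It is purely illustrative and not drawn from the cited study: the choice of a lognormal for the fat-tailed case and every parameter in it are assumptions, picked so that both models share roughly the same mean overrun while differing sharply in the tail.

```python
# Illustrative only: how often an extreme (>= 200%) cost overrun appears
# under a thin-tailed (normal) model versus a fat-tailed (lognormal) model
# with roughly the same ~27% mean overrun. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Thin-tailed model: normal with mean 27% and an assumed 50% standard deviation.
normal_overruns = rng.normal(loc=0.27, scale=0.50, size=n)

# Fat-tailed model: lognormal calibrated to the same ~27% mean.
sigma = 1.5
mu = np.log(0.27) - sigma**2 / 2
lognormal_overruns = rng.lognormal(mean=mu, sigma=sigma, size=n)

for name, sample in (("normal", normal_overruns),
                     ("lognormal", lognormal_overruns)):
    tail = np.mean(sample >= 2.0)  # share of projects overrunning by >= 200%
    print(f"{name:>9}: mean overrun = {sample.mean():.2f}, "
          f"P(overrun >= 200%) = {tail:.5f}")
```

Both samples report a similar average overrun, but the thin-tailed model makes a 200% overrun look negligible, while the fat-tailed model assigns it far more probability; that gap is the kind of risk Taleb argues the normal-distribution habit hides.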
In the second edition of The Black Swan, Taleb provides "Ten Principles for a Black-Swan-Robust Society".[10]
Taleb states that a black swan event depends on the observer. For example, what may be a black swan surprise for a turkey is not a black swan surprise to its butcher; hence the objective should be to "avoid being the turkey" by identifying areas of vulnerability in order to "turn the Black Swans white".[11]
Epistemological approach
Taleb's black swan is different from the earlier philosophical versions of the problem, specifically in epistemology, as it concerns a phenomenon with specific empirical and statistical properties which he calls "the fourth quadrant".[12]
Taleb's problem is about epistemic limitations in some parts of the areas covered in decision making. These limitations are twofold: philosophical (mathematical) and empirical (known human epistemic biases). The philosophical problem is about the decrease in knowledge when it comes to rare events, as these are not visible in past samples and therefore require a strong a priori, or extrapolating, theory; accordingly, predictions of events depend more and more on theories when their probability is small. In the fourth quadrant, knowledge is uncertain and consequences are large, requiring more robustness.
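In the essay "The Fourth Quadrant: A Map of the Limits of Statistics" (listed in the bibliography below), Taleb presents this as a two-by-two map: thin-tailed versus fat-tailed randomness on one axis, and simple versus complex (exposure-to-extremes) payoffs on the other, with the fourth quadrant being the fat-tailed, complex-payoff cell. The sketch below is only a paraphrase of that map; the function name and label wording are invented here for illustration.

```python
# Hypothetical paraphrase of Taleb's "fourth quadrant" map; the function
# name and the label wording are invented for illustration.
def quadrant(fat_tailed: bool, complex_payoff: bool) -> str:
    """Classify a decision domain by tail behaviour and payoff type."""
    if not fat_tailed and not complex_payoff:
        return "First quadrant: thin tails, simple payoffs - statistics reliable"
    if not fat_tailed and complex_payoff:
        return "Second quadrant: thin tails, complex payoffs - statistics usable"
    if fat_tailed and not complex_payoff:
        return "Third quadrant: fat tails, simple payoffs - usable with care"
    return "Fourth quadrant: fat tails, complex payoffs - limits of statistics"

# Example: a leveraged position exposed to rare market crashes.
print(quadrant(fat_tailed=True, complex_payoff=True))
```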
According to Taleb,[13] thinkers who came before him who dealt with the notion of the improbable, such as Hume, Mill, and Popper, focused on the problem of induction in logic, specifically that of drawing general conclusions from specific observations. The central and unique attribute of Taleb's black swan event is that it is high-profile. His claim is that almost all consequential events in history come from the unexpected, yet humans later convince themselves that these events are explainable in hindsight.
One problem, labeled the ludic fallacy by Taleb, is the belief that the unstructured randomness found in life resembles the structured randomness found in games. This stems from the assumption that the unexpected may be predicted by extrapolating from variations in statistics based on past observations, especially when these statistics are presumed to represent samples from a normal distribution. These concerns often are highly relevant in financial markets, where major players sometimes assume normal distributions when using value at risk models, although market returns typically have fat tail distributions.
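The contrast can be made concrete with a small calculation. The sketch below is not a description of any firm's actual model: the volatility, confidence levels, and degrees of freedom are assumptions. It compares the one-day value-at-risk quantile implied by a normal model with the same quantile of a fat-tailed Student's t distribution scaled to the same standard deviation.

```python
# Illustrative sketch: the same value-at-risk (VaR) quantile under a normal
# model versus a fat-tailed Student's t model with identical standard
# deviation. All parameters are assumptions.
import numpy as np
from scipy import stats

daily_vol = 0.01                              # assumed 1% daily volatility
nu = 3                                        # assumed degrees of freedom (heavy tails)
t_scale = daily_vol / np.sqrt(nu / (nu - 2))  # std of t(nu) is sqrt(nu / (nu - 2))

for confidence in (0.99, 0.999):
    q = 1 - confidence
    var_normal = -stats.norm.ppf(q, loc=0.0, scale=daily_vol)
    var_t = -stats.t.ppf(q, df=nu, loc=0.0, scale=t_scale)
    print(f"{confidence:.1%} one-day VaR:  normal {var_normal:.4f}   "
          f"Student-t {var_t:.4f}")
```

At the 99% level the two estimates are close, but at 99.9% the fat-tailed figure is roughly twice the normal one, and the gap keeps widening further out in the tail.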
Taleb said: "I don't particularly care about the usual. If you want to get an idea of a friend's temperament, ethics, and personal elegance, you need to look at him under the tests of severe circumstances, not under the regular rosy glow of daily life. Can you assess the danger a criminal poses by examining only what he does on an ordinary day? Can we understand health without considering wild diseases and epidemics? Indeed the normal is often irrelevant. Almost everything in social life is produced by rare but consequential shocks and jumps; all the while almost everything studied about social life focuses on the 'normal,' particularly with 'bell curve' methods of inference that tell you close to nothing. Why? Because the bell curve ignores large deviations, cannot handle them, yet makes us confident that we have tamed uncertainty. Its nickname in this book is GIF, Great Intellectual Fraud."
More generally, decision theory, which is based on a fixed universe or model of possible outcomes, ignores or minimizes the effect of events that are "outside the model". For instance, a simple model of daily stock market returns may include extreme moves such as Black Monday (1987), but might not model the breakdown of markets following the 9/11 attacks. A fixed model considers the "known unknowns" but ignores the "unknown unknowns". The famous statement by Donald Rumsfeld[14] in a 2002 DoD press briefing is said to have been inspired by a presentation Taleb gave at the DoD shortly before. Taleb's 2001 book Fooled by Randomness was about financial events, but had already introduced the black swan concept.[15][16][17]
Taleb notes that other distributions, such as fractal, power-law, or scalable distributions, cannot be used with precision but are often more descriptive, and that awareness of them might help to temper expectations.[18]
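To give a rough sense of what a scalable (power-law) tail implies compared with a Gaussian one, the sketch below prints survival probabilities P(X > x) for both; the Pareto exponent and the thresholds are assumptions chosen for illustration, not values taken from Taleb.

```python
# Illustrative sketch: survival probabilities P(X > x) for a Gaussian
# distribution versus a Pareto (power-law) distribution. The Pareto tail
# exponent and the thresholds below are assumptions.
from scipy import stats

alpha = 2.0                      # assumed Pareto tail exponent
gaussian = stats.norm(loc=0.0, scale=1.0)
pareto = stats.pareto(b=alpha)   # support starts at 1; P(X > x) = x**(-alpha)

for x in (2, 5, 10, 20):
    p_gauss = gaussian.sf(x)     # survival function: P(X > x)
    p_pareto = pareto.sf(x)
    print(f"x = {x:>2}:  Gaussian tail = {p_gauss:.2e}   "
          f"Pareto tail = {p_pareto:.2e}")
```

The Gaussian tail collapses toward zero almost immediately, while the power law keeps assigning non-negligible probability to very large deviations; this qualitative difference, rather than any claim to precision, is what makes scalable models more descriptive of extremes.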
Beyond this, he emphasizes that many events simply are without precedent, undercutting the basis of this type of reasoning altogether.
Taleb also argues for the use of counterfactual reasoning when considering risk.[19][20]
See also
- Butterfly effect
- Deus ex machina
- Elephant in the room
- Extreme risk
- Hindsight bias
- Holy grail distribution
- Kurtosis risk
- List of cognitive biases
- Miracle
- Normal accidents
- Normalcy bias
- Outside Context Problem
- Global catastrophic risks
- Quasi-empiricism in mathematics
- Randomness
- Rare events
- Taleb distribution
- The Long Tail
- Uncertainty
- Technological singularity
- Wild card (foresight)
Books by Taleb
- Antifragile: Things That Gain from Disorder
- Fooled by Randomness
- The Black Swan
- Dynamic Hedging: Managing Vanilla and Exotic Options
- The Bed of Procrustes: Philosophical and Practical Aphorisms
References
- ↑ Taleb 2010, p. xxi.
- ↑ Taleb, Nassim Nicholas, Silent Risk, http://www.fooledbyrandomness.com/FatTails.html
- ↑ Taleb 2010.
- ↑ JSTOR 294875
- ↑ "Opacity". Fooled by randomness. Retrieved 2011-10-17.
- ↑ "Black Swan Unique to Western Australia", Parliament, AU: Curriculum, archived from the original on 2011-03-01.
- ↑ Hammond, Peter (October 2009), WERI Bulletin (1), UK: Warwick.
- ↑ "The Black Swan: The Impact of the Highly Improbable". The New York Times. 22 April 2007.
- ↑ Flyvbjerg, Bent; Budzier, Alexander (September 2011), "Why Your IT Project May Be Riskier Than You Think", Harvard Business Review, vol. 89, no. 9, pp. 601–603.
- ↑ Taleb 2010, pp. 374–78.
- ↑ Webb, Allen (December 2008). "Taking improbable events seriously: An interview with the author of The Black Swan (Corporate Finance)" (INTERVIEW). McKinsey Quarterly. McKinsey. p. 3. Retrieved 23 May 2012.
Taleb: In fact, I tried in The Black Swan to turn a lot of black swans white! That’s why I kept going on and on against financial theories, financial-risk managers, and people who do quantitative finance.
- ↑ Taleb 2008.
- ↑ Taleb, Nassim Nicholas (April 2007). The Black Swan: The Impact of the Highly Improbable (1st ed.). London: Penguin. p. 400. ISBN 1-84614045-5. Retrieved 23 May 2012.
- ↑ DoD News Briefing - Secretary Rumsfeld and Gen. Myers, February 12, 2002 11:30 AM EDT
- ↑ Days that shook the world, Oliver Burkeman, book review in The Guardian 2007
- ↑ A Point of View: See no evil 10 January 2014
- ↑ Nassehi, Armin; Felixberger, Peter (2 December 2014), Kursbuch 180: Nicht wissen ("Not Knowing"), Murmann Verlag DE.
- ↑ Gelman, Andrew (April 2007). "Nassim Taleb’s "The Black Swan"". Statistical Modeling, Causal Inference, and Social Science. Columbia University. Retrieved 23 May 2012.
- ↑ Taleb, Nassim Nicholas (22 April 2007), "The Black Swan", The New York Times.
- ↑ Gangahar, Anuj (16 April 2008). "Market Risk: Mispriced risk tests market faith in a prized formula". The Financial Times. New York. Archived from the original on 20 April 2008. Retrieved 23 May 2012.
Bibliography
- Taleb, Nassim Nicholas (2010) [2007], The Black Swan: the impact of the highly improbable (2nd ed.), London: Penguin, ISBN 978-0-14103459-1, retrieved 23 May 2012.
- ——— (September 2008), "The Fourth Quadrant: A Map of the Limits of Statistics", Third Culture (The Edge Foundation), retrieved 23 May 2012.
- ———; Denev, Alexander (January 2014), Portfolio Management under Stress, Cambridge University Press, retrieved 1 January 2014.
External links
- Ten Principles for a Black Swan Robust World (PDF), Fooled by randomness.
- David, Dr. Gil, Black Swans in the Cyber Domain, Israel defense.
- Black Swan Stocks Could Make Your Portfolio a Turkey, CNBC.