By Nassim Nicholas Taleb
Simply wow! ... a masterpiece.
Nassim Nicholas Taleb, although insufferable and pretentious in his writing, and despite making The Black Swan a difficult kind of read, is still full of interesting, provocative, and mind-engaging thoughts that are best read twice.
Simply stated, a Black Swan is a random and improbable event with the power to cause massive consequences. These events can occur in history, society, economics, finance, and many other fields. Taleb goes on to explain that forecasters, analysts, and the general public are blind to these events because of a set of fallacies that impair our judgment.
The Black Swan is proving challenging to digest and understand, yet I added it to my personal library since it is a favorite of one of my all-time heroes, Jeff Bezos, and there is a reason he made it mandatory reading for his Amazon executives.
Amazon Page for details
My Rating: 9/10
Click Here to Read My Notes
So I stayed in the quant and trading businesses (I’m still there), but organized myself to do minimal but intense (and entertaining) work, focus only on the most technical aspects, never attend business “meetings,” avoid the company of “achievers” and people in suits who don’t read books, and take a sabbatical year for every three on average to fill up gaps in my scientific and philosophical culture. To slowly distill my single idea, I wanted to become a flâneur, a professional meditator, sit in cafés, lounge, unglued to desks and organization structures, sleep as long as I needed, read voraciously, and not owe any explanation to anybody. I wanted to be left alone in order to build, small steps at a time, an entire system of thought based on my Black Swan idea.
When people at cocktail parties asked me what I did for a living, I was tempted to answer, “I am a skeptical empiricist and a flâneur-reader, someone committed to getting very deep into an idea,” but I made things simple by saying that I was a limousine driver.
If I myself had to give advice, I would recommend picking a profession that is not scalable! Scalable professions are good only if you are successful; they are more competitive, produce monstrous inequalities, and are far more random, with huge disparities between effort and reward—a few can take a large share of the pie, leaving the others out entirely through no fault of their own.
The success of movies depends severely on contagions. Such contagions do not just apply to the movies: they seem to affect a wide range of cultural products. It is hard for us to accept that people do not fall in love with works of art only for their own sake, but also in order to feel that they belong to a community. By imitating, we get closer to others—that is, other imitators. It fights solitude.
The American economy has leveraged itself heavily on idea generation, which explains why losing manufacturing jobs can be coupled with a rising standard of living.
I can state the supreme law of Mediocristan as follows: When your sample is large, no single instance will significantly change the aggregate or the total. The largest observation will remain impressive, but eventually insignificant, to the sum.
In Extremistan, inequalities are such that one single observation can disproportionately impact the aggregate, or the total.
So while weight, height, and calorie consumption are from Mediocristan, wealth is not. Almost all social matters are from Extremistan.
Look at the implication for the Black Swan. Extremistan can produce Black Swans, and does, since a few occurrences have had huge influences on history. This is the main idea of this book.
In Extremistan, one unit can easily affect the total in a disproportionate way. In this world, you should always be suspicious of the knowledge you derive from data.
Mediocristan is where we must endure the tyranny of the collective, the routine, the obvious, and the predicted; Extremistan is where we are subjected to the tyranny of the singular, the accidental, the unseen, and the unpredicted.
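The Mediocristan/Extremistan contrast is easy to see numerically. Here is a small sketch of my own (not Taleb's; the distributions and their parameters are assumptions) comparing how much the single largest observation contributes to the total in a Gaussian world versus a heavy-tailed one:

```python
import random

random.seed(42)
N = 100_000

# Mediocristan: human height (Gaussian, mean 170 cm, sd 10 cm -- assumed).
# No single observation meaningfully moves the total.
heights = [random.gauss(170, 10) for _ in range(N)]
height_share = max(heights) / sum(heights)

# Extremistan: wealth (Pareto with tail index 1.1 -- assumed).
# One observation can claim a visible slice of the aggregate.
wealth = [random.paretovariate(1.1) for _ in range(N)]
wealth_share = max(wealth) / sum(wealth)

print(f"tallest person's share of total height: {height_share:.6%}")
print(f"richest person's share of total wealth: {wealth_share:.2%}")
```

In the Gaussian sample the record-holder is a rounding error against the total; in the Pareto sample one individual can hold a visible slice of everything, which is exactly the "one unit affects the total" property above.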
From the standpoint of the turkey, the nonfeeding of the one thousand and first day is a Black Swan. For the butcher, it is not, since its occurrence is not unexpected. So you can see here that the Black Swan is a sucker’s problem.
You realize that you can eliminate a Black Swan by science (if you’re able), or by keeping an open mind.
In general, positive Black Swans take time to show their effect while negative ones happen very quickly—it is much easier and much faster to destroy than to build.
Huet, who lived into his nineties, had a servant follow him with a book to read aloud to him during meals and breaks and thus avoid lost time.
Sometimes a lot of data can be meaningless; at other times one single piece of information can be very meaningful.
Among them figure chess grand masters, who, it has been shown, actually do focus on where a speculative move might be weak; rookies, by comparison, look for confirmatory instances instead of falsifying ones.
Scientists believe that it is the search for their own weaknesses that makes them good chess players, not the practice of chess that turns them into skeptics.
Similarly, the speculator George Soros, when making a financial bet, keeps looking for instances that would prove his initial theory wrong.
Both the artistic and scientific enterprises are the product of our need to reduce dimensions and inflict some order on things.
A novel, a story, a myth, or a tale, all have the same function: they spare us from the complexity of the world and shield us from its randomness.
Indeed, many severe psychological disorders accompany the feeling of loss of control of—being able to “make sense” of—one’s environment.
As Stalin, who knew something about the business of mortality, supposedly said, “One death is a tragedy; a million is a statistic.” Statistics stay silent in us.
System 1, the experiential one, is effortless, automatic, fast, opaque (we do not know that we are using it), parallel-processed, and can lend itself to errors. It is what we call “intuition,” and performs these quick acts of prowess that became popular under the name blink, after the title of Malcolm Gladwell’s bestselling book.
System 2, the cogitative one, is what we normally call thinking. It is what you use in a classroom, as it is effortful (even for Frenchmen), reasoned, slow, logical, serial, progressive, and self-aware (you can follow the steps in your reasoning).
Emotions are assumed to be the weapon System 1 uses to direct us and force us to act quickly.
Much of the trouble with human nature resides in our inability to use much of System 2, or to use it in a prolonged way without having to take a long beach vacation.
I’ll conclude by saying that our misunderstanding of the Black Swan can be largely attributed to our using System 1, i.e., narratives, and the sensational—as well as the emotional—which imposes on us a wrong map of the likelihood of events.
The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories.
We favor the sensational and the extremely visible. This affects the way we judge heroes. There is little room in our consciousness for heroes who do not deliver visible results—or those heroes who focus on process rather than results.
The problem of lumpy payoffs is not so much in the lack of income they entail, but the pecking order, the loss of dignity, the subtle humiliations near the watercooler.
So from a narrowly defined accounting point of view, which I may call here “hedonic calculus,” it does not pay to shoot for one large win. Mother Nature destined us to derive enjoyment from a steady flow of pleasant small, but frequent, rewards.
I presented the Black Swan as the outlier, the important event that is not expected to happen. But consider the opposite: the unexpected event that you very badly want to happen.
If you engage in a Black Swan–dependent activity, it is better to be part of a group.
In some strategies and life situations, you gamble dollars to win a succession of pennies while appearing to be winning all the time. In others, you risk a succession of pennies to win dollars. In other words, you bet either that the Black Swan will happen or that it will never happen, two strategies that require completely different mind-sets.
I will rapidly present Nero’s idea. His premise was the following trivial point: some business bets in which one wins big but infrequently, yet loses small but frequently, are worth making if others are suckers for them and if you have the personal and intellectual stamina.
People often accept that a financial strategy with a small chance of success is not necessarily a bad one as long as the success is large enough to justify it.
For a spate of psychological reasons, however, people have difficulty carrying out such a strategy, simply because it requires a combination of belief, a capacity for delayed gratification, and the willingness to be spat upon by clients without blinking.
Against that background of potential blowup disguised as skills, Nero engaged in a strategy that he called “bleed.” You lose steadily, daily, for a long time, except when some event takes place for which you get paid disproportionately well.
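Nero's "bleed" can be sketched as a toy simulation (my own illustration, not from the book; the daily cost, jump probability, and payoff are made-up parameters chosen so the expected daily edge works out positive):

```python
import random

random.seed(1)

def bleed_pnl(days, daily_cost=1.0, jump_prob=0.005, jump_payoff=400.0):
    """Lose a little almost every day; collect a large payoff on rare days.

    All parameters are made up for illustration; the expected daily edge
    here is -daily_cost + jump_prob * jump_payoff = +1.0.
    """
    pnl, path = 0.0, []
    for _ in range(days):
        pnl -= daily_cost
        if random.random() < jump_prob:
            pnl += jump_payoff
        path.append(pnl)
    return path

path = bleed_pnl(2520)  # roughly ten years of trading days
losing_days = sum(
    1 for i, p in enumerate(path) if p < (path[i - 1] if i else 0.0)
)
print(f"final P&L: {path[-1]:.0f}, losing days: {losing_days}/{len(path)}")
```

The point of the exercise: the strategy is profitable in expectation, yet you spend almost every single day looking like a loser, which is why it takes the stamina (and the tolerance for humiliation) Taleb describes.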
Another fallacy in the way we understand events is that of silent evidence. History hides both Black Swans and its Black Swan–generating ability from us.
One Diagoras, a nonbeliever in the gods, was shown painted tablets bearing the portraits of some worshippers who prayed, then survived a subsequent shipwreck. The implication was that praying protects you from drowning. Diagoras asked, “Where were the pictures of those who prayed, then drowned?”
We shall call this distortion a bias, i.e., the difference between what you see and what is there.
As drowned worshippers do not write histories of their experiences (it is better to be alive for that), so it is with the losers in history, whether people or ideas.
The entire notion of biography is grounded in the arbitrary ascription of a causal relation between specified traits and subsequent events.
I said that taking a “scalable” profession is not a good idea, simply because there are far too few winners in these professions. Well, these professions produce a large cemetery: the pool of starving actors is larger than that of starving accountants, even if you assume that, on average, they earn the same income.
Recall the confirmation fallacy: governments are great at telling you what they did, but not what they did not do. In fact, they engage in what could be labeled as phony “philanthropy,” the activity of helping people in a visible and sensational way without taking into account the unseen cemetery of invisible consequences.
Bastiat goes a bit deeper. If both the positive and the negative consequences of an action fell on its author, our learning would be fast. But often an action’s positive consequences benefit only its author, since they are visible, while the negative consequences, being invisible, apply to others, with a net cost to society.
Have the guts to consider the silent consequences when standing in front of the next snake-oil humanitarian.
We have enough evidence to confirm that, indeed, we humans are an extremely lucky species, and that we got the genes of the risk takers. The foolish risk takers, that is. In fact, the Casanovas who survived.
Once again, I am not dismissing the idea of risk taking, having been involved in it myself. I am only critical of the encouragement of uninformed risk taking.
The überpsychologist Danny Kahneman has given us evidence that we generally take risks not out of bravado but out of ignorance and blindness to probability!
But I insist on the following: that we got here by accident does not mean that we should continue to take the same risks
We have been playing Russian roulette; now let’s stop and get a real job.
I repeat that we are explanation-seeking animals who tend to think that everything has an identifiable cause and grab the most apparent one as the explanation.
The main identifiable reason for our survival of such diseases might simply be inaccessible to us: we are here since, Casanova-style, the “rosy” scenario played out, and if it seems too hard to understand it is because we are too brainwashed by notions of causality and we think that it is smarter to say because than to accept randomness.
My biggest problem with the educational system lies precisely in that it forces students to squeeze explanations out of subject matters and shames them for withholding judgment, for uttering the “I don’t know.”
Note here that I am not saying causes do not exist; do not use this argument to avoid trying to learn from history. All I am saying is that it is not so simple; be suspicious of the “because” and handle it with care—particularly in situations where you suspect silent evidence.
While the problem is very general, one of its nastiest illusions is what I call the ludic fallacy—the attributes of the uncertainty we face in real life have little connection to the sterilized ones we encounter in exams and games.
And it is why we have Black Swans and never learn from their occurrence, because the ones that did not happen were too abstract.
Alas, we are not manufactured, in our current edition of the human race, to understand abstract matters—we need context. Randomness and uncertainty are abstractions.
We respect what has happened, ignoring what could have happened.
I propose that if you want a simple step to a higher form of life, as distant from the animal as you can get, then you may have to denarrate, that is, shut down the television set, minimize time spent reading newspapers, ignore the blogs. Train your reasoning abilities to control your decisions; nudge System 1 (the heuristic or experiential system) out of the important ones. Train yourself to spot the difference between the sensational and the empirical. This insulation from the toxicity of the world will have an additional benefit: it will improve your well-being.
Also, bear in mind how shallow we are with probability, the mother of all abstract notions. You do not have to do much more in order to gain a deeper understanding of things around you. Above all, learn to avoid “
Prediction, not narration, is the real test of our understanding of the world.
I find it scandalous that in spite of the empirical record we continue to project into the future as if we were good at it, using tools and methods that exclude rare events.
The larger the role of the Black Swan, the harder it will be for us to predict. Sorry.
True, our knowledge does grow, but it is threatened by greater increases in confidence, which make our increase in knowledge at the same time an increase in confusion, ignorance, and conceit.
Epistemic arrogance bears a double effect: we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states (i.e., by reducing the space of the unknown).
I remind the reader that I am not testing how much people know, but assessing the difference between what people actually know and how much they think they know.
The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.
The problem is that our ideas are sticky: once we produce a theory, we are not likely to change our minds—so those who delay developing their theories are better off.
Two mechanisms are at play here: the confirmation bias that we saw in Chapter 5, and belief perseverance, the tendency not to reverse opinions you already have. Remember that we treat ideas like possessions, and it will be hard for us to part with them.
Remember that we are swayed by the sensational. Listening to the news on the radio every hour is far worse for you than reading a weekly magazine, because the longer interval allows information to be filtered a bit.
I’ve struggled much of my life with the common middlebrow belief that “more is better”—more is sometimes, but not always, better.
No matter what anyone tells you, it is a good idea to question the error rate of an expert’s procedure.
Do not question his procedure, only his confidence.
There are some professions in which you know more than the experts, who are, alas, people for whose opinions you are paying—instead of them paying you to listen to them. Which ones?
On one hand, we are shown by a class of expert-busting researchers such as Paul Meehl and Robyn Dawes that the “expert” is the closest thing to a fraud, performing no better than a computer using a single metric, their intuition getting in the way and blinding them.
On the other hand, there is abundant literature showing that many people can beat computers thanks to their intuition. Which one is correct?
Would you rather have your upcoming brain surgery performed by a newspaper’s science reporter or by a certified brain surgeon? On the other hand, would you prefer to listen to an economic forecast by someone with a PhD in finance from some “prominent” institution such as the Wharton School, or by a newspaper’s business writer? While the answer to the first question is empirically obvious, the answer to the second one isn’t at all. We can already see the difference between “know-how” and “know-what.”
The Greeks made a distinction between technē and epistēmē. The empirical school of medicine of Menodotus of Nicomedia and Heraclites of Tarentum wanted its practitioners to stay closest to technē (i.e., “craft”), and away from epistēmē (i.e., “knowledge,” “
Simply, things that move, and therefore require knowledge, do not usually have experts, while things that don’t move seem to have some experts.
In other words, professions that deal with the future and base their studies on the nonrepeatable past have an expert problem.
Another way to see it is that things that move are often Black Swan–prone.
Our predictors may be good at predicting the ordinary, but not the irregular, and this is where they ultimately fail.
Contrary to what people might expect, I am not recommending that anyone become a hedgehog—rather, be a fox with an open mind.
Makridakis and Hibon were to find out that the strong empirical evidence of their studies has been ignored by theoretical statisticians. Furthermore, they encountered shocking hostility toward their empirical verifications. “Instead [statisticians] have concentrated their efforts in building more sophisticated models without regard to the ability of such models to more accurately predict real-life data,” Makridakis and Hibon write.
Economics is perhaps the subject that currently has the highest number of philistine scholars—scholarship without erudition and natural curiosity can close your mind and lead to the fragmentation of disciplines.
In order to survive, institutions may need to give themselves and others the appearance of having a “
The unexpected has a one-sided effect with projects. Consider the track records of builders, paper writers, and contractors. The unexpected almost always pushes in a single direction: higher costs and a longer time to completion.
On very rare occasions, as with the Empire State Building, you get the opposite: shorter completion and lower costs—these occasions are becoming truly exceptional nowadays.
With projects of great novelty, such as a military invasion, an all-out war, or something entirely new, errors explode upward.
As you see, the longer you wait, the longer you will be expected to wait.
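This "the longer you wait, the longer you should expect to wait" property is exactly what scale-free (power-law) waiting times produce, and what memoryless ones do not. A small simulation of my own (the Pareto and exponential parameters are assumptions, not from the book) shows the expected remaining wait growing with the time already waited under a power law, while staying flat under an exponential:

```python
import random

random.seed(7)
N = 400_000

# Project durations: a power law (Pareto, tail index 1.5) versus a
# memoryless exponential. Both parameter choices are assumptions.
pareto_waits = [random.paretovariate(1.5) for _ in range(N)]
expo_waits = [random.expovariate(1.0) for _ in range(N)]

def expected_remaining(samples, t):
    """Average extra wait among the jobs still unfinished at time t."""
    survivors = [x - t for x in samples if x > t]
    return sum(survivors) / len(survivors)

for t in (1.0, 2.0, 4.0):
    print(f"waited {t}: expect {expected_remaining(pareto_waits, t):.1f} more "
          f"(power law) vs {expected_remaining(expo_waits, t):.2f} (exponential)")
```

Under the exponential, having waited tells you nothing (the expected remaining wait stays near one unit); under the power law, every day a project survives pushes its expected completion further out.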
The Arab-Israeli conflict is sixty years old, and counting—yet it was considered “a simple problem” sixty years ago. (Always remember that, in a modern environment, wars last longer and kill more people than is typically planned.)
There are those people who produce forecasts uncritically. When asked why they forecast, they answer, “Well, that’s what we’re paid to do here.” My suggestion: get another job.
People who are trapped in their jobs who forecast simply because “that’s my job,” knowing pretty well that their forecast is ineffectual, are not what I would call ethical. What they do is no different from repeating lies simply because “it’s my job.”
Anyone who causes harm by forecasting should be treated as either a fool or a liar. Some forecasters cause more damage to society than criminals. Please, don’t drive a school bus blindfolded.
I have said that the Black Swan has three attributes: unpredictability, consequences, and retrospective explainability.
We’ve seen that a) we tend to both tunnel and think “narrowly” (epistemic arrogance), and b) our prediction record is highly overestimated—many people who think they can predict actually can’t.
The managers sat down to brainstorm during these meetings, about, of course, the medium-term future—they wanted to have “vision.” But then an event occurred that was not in the previous five-year plan: the Black Swan of the Russian financial default of 1998 and the accompanying meltdown of the values of Latin American debt markets. It had such an effect on the firm that, although the institution had a sticky employment policy of retaining managers, none of the five was still employed there a month after the sketch of the 1998 five-year plan. Yet I am confident that today their replacements are still meeting to work on the next “five-year plan.” We never learn.
The classical model of discovery is as follows: you search for what you know (say, a new way to reach India) and find something you didn’t know was there (America).
If you think that the inventions we see around us came from someone sitting in a cubicle and concocting them according to a timetable, think again: almost everything of the moment is the product of serendipity.
In other words, you find something you are not looking for and it changes the world, while wondering after its discovery why it “took so long” to arrive at something so obvious.
Sir Francis Bacon commented that the most important advances are the least predictable ones, those “lying out of the path of the imagination.”
We forget about unpredictability when it is our turn to predict. This is why people can read this chapter and similar accounts, agree entirely with them, yet fail to heed their arguments when thinking about the future.
True, Fleming was looking for “something,” but the actual discovery was simply serendipitous.
Viagra, which changed the mental outlook and social mores of retired men, was meant to be a hypertension drug. Another hypertension drug led to a hair-growth medication. My friend Bruce Goldberg, who understands randomness, calls these unintended side applications “corners.” While many worry about unintended consequences, technology adventurers thrive on them.
Louis Pasteur’s adage about creating luck by sheer exposure. “Luck favors the prepared,” Pasteur said, and, like all great discoverers, he knew something about accidental discoveries.
The best way to get maximal exposure is to keep researching.
Popper’s central argument is that in order to predict historical events you need to predict technological innovation, itself fundamentally unpredictable.
I’ll summarize my argument here: Prediction requires knowing about technologies that will be discovered in the future. But that very knowledge would almost automatically allow us to start developing those technologies right away. Ergo, we do not know what we will know.
To see how our intuitions about these nonlinear multiplicative effects are rather weak, consider this story about the chessboard. The inventor of the chessboard requested the following compensation: one grain of rice for the first square, two for the second, four for the third, eight, then sixteen, and so on, doubling every time, sixty-four times. The king granted this request, thinking that the inventor was asking for a pittance—but he soon realized that he was outsmarted. The amount of rice exceeded all possible grain reserves!
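The chessboard arithmetic checks out in a few lines. The per-grain weight below is my own assumed figure (roughly 25 mg), used only to put the total on a human scale:

```python
# One grain on the first square, doubling across all 64 squares:
total_grains = sum(2**k for k in range(64))
assert total_grains == 2**64 - 1  # about 1.8 * 10**19 grains

# Put it on a human scale with an assumed weight of ~25 mg per grain.
kg_per_grain = 25e-6
total_tonnes = total_grains * kg_per_grain / 1000
print(f"{total_grains:,} grains, roughly {total_tonnes:.1e} tonnes of rice")
```

Under that assumed grain weight the total is on the order of hundreds of billions of tonnes, far beyond any grain reserve, which is the point of the story: doubling looks like a pittance until it isn't.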
Forecasting the motion of a billiard ball on a pool table requires knowledge of the dynamics of the entire universe, down to every single atom!
We are not predisposed to respect humble people, those who try to suspend judgment.
Now contemplate epistemic humility. Think of someone heavily introspective, tortured by the awareness of his own ignorance. He lacks the courage of the idiot, yet has the rare guts to say “I don’t know.” He does not mind looking like a fool or, worse, an ignoramus. He hesitates, he will not commit, and he agonizes over the consequences of being wrong. He introspects, introspects, and introspects until he reaches physical and nervous exhaustion.
The bottom line: be prepared! Narrow-minded prediction has an analgesic or therapeutic effect. Be aware of the numbing effect of magic numbers. Be prepared for all relevant eventualities.
So the second lesson is more aggressive: you can actually take advantage of the problem of prediction and epistemic arrogance! As a matter of fact, I suspect that the most successful businesses are precisely those that know how to work around inherent unpredictability and even exploit it.
Aside from the movies, examples of positive–Black Swan businesses are: some segments of publishing, scientific research, and venture capital.
Diplomats understand that very well: casual chance discussions at cocktail parties usually lead to big breakthroughs—not dry correspondence or telephone conversations.
The Achilles’ heel of capitalism is that if you make corporations compete, it is sometimes the one that is most exposed to the negative Black Swan that will appear to be the most fit for survival.
Indeed, the notion of asymmetric outcomes is the central idea of this book: I will never get to know the unknown since, by definition, it is unknown. However, I can always guess how it might affect me, and I should base my decisions around that.
I do not know whether God exists, but I know that I have nothing to gain from being an atheist if he does not exist, whereas I have plenty to lose if he does. Hence, this justifies my belief in God.
I don’t know the odds of an earthquake, but I can imagine how San Francisco might be affected by one. This idea that in order to make a decision you need to focus on the consequences (which you can know) rather than the probability (which you can’t know) is the central idea of uncertainty. Much of my life is based on it.
As I said, if my portfolio is exposed to a market crash, the odds of which I can’t compute, all I have to do is buy insurance, or get out and invest the amounts I am not willing to ever lose in less risky securities.
The more I think about my subject, the more I see evidence that the world we have in our minds is different from the one playing outside.
Every morning the world appears to me more random than it did the day before, and humans seem to be even more fooled by it than they were the previous day. It is becoming unbearable. I find writing these lines painful; I find the world revolting.
In sociology, Matthew effects bear the less literary name “cumulative advantage.” This theory can easily apply to companies, businessmen, actors, writers, and anyone else who benefits from past success. If you get published in The New Yorker because the color of your letterhead attracted the attention of the editor, who was daydreaming of daisies, the resultant reward can follow you for life. More significantly, it will follow others for life. Failure is also cumulative; losers are likely to also lose in the future, even if we don’t take into account the mechanism of demoralization that might exacerbate it and cause additional failure.
The increased concentration among banks seems to have the effect of making financial crisis less likely, but when they happen they are more global in scale and hit us very hard.
I rephrase here: we will have fewer but more severe crises.
Measures of uncertainty that are based on the bell curve simply disregard the possibility, and the impact, of sharp jumps or discontinuities and are, therefore, inapplicable in Extremistan. Using them is like focusing on the grass and missing out on the (gigantic) trees.
Although unpredictable large deviations are rare, they cannot be dismissed as outliers because, cumulatively, their impact is so dramatic.
The traditional Gaussian way of looking at the world begins by focusing on the ordinary, and then deals with exceptions or so-called outliers as ancillaries. But there is a second way, which takes the exceptional as a starting point and treats the ordinary as subordinate.
We can make good use of the Gaussian approach in variables for which there is a rational reason for the largest not to be too far away from the average.
If there is gravity pulling numbers down, or if there are physical limitations preventing very large observations, we end up in Mediocristan.
This is why much of economics is based on the notion of equilibrium: among other benefits, it allows you to treat economic phenomena as Gaussian.
Let me show you how the Gaussian bell curve sucks randomness out of life—which is why it is popular. We like it because it allows certainties! How? Through averaging, as I will discuss next.
When you have plenty of gamblers, no single gambler will impact the total more than minutely. The consequence of this is that variations around the average of the Gaussian, also called “errors,” are not truly worrisome. They are small and they wash out. They are domesticated fluctuations around the mean.
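The "errors wash out" claim is the law of large numbers, and it only holds in Mediocristan. A quick sketch of my own (the choice of distributions and sample sizes is assumed) contrasts a Gaussian, whose running average settles down, with a standard Cauchy, a fat-tailed distribution whose running average never does:

```python
import math
import random

random.seed(3)

def running_mean_wander(draw, n=20_000, burn=1_000):
    """Range the running sample mean still covers after a burn-in period."""
    total, means = 0.0, []
    for i in range(1, n + 1):
        total += draw()
        if i > burn:
            means.append(total / i)
    return max(means) - min(means)

# Mediocristan: Gaussian errors average out.
gauss_wander = running_mean_wander(lambda: random.gauss(0.0, 1.0))

# Extremistan stand-in: a standard Cauchy (fat tails), generated by
# inverse-transform sampling; its sample mean never converges.
cauchy_wander = running_mean_wander(
    lambda: math.tan(math.pi * (random.random() - 0.5))
)

print(f"Gaussian running-mean wander: {gauss_wander:.3f}")
print(f"Cauchy running-mean wander:   {cauchy_wander:.3f}")
```

After the first thousand draws the Gaussian average barely budges; the Cauchy average keeps being yanked around by single enormous observations, which is the Extremistan failure mode the bell curve hides.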
Mountains are not triangles or pyramids; trees are not circles; straight lines are almost never seen anywhere. Mother Nature did not attend high school geometry courses or read the books of Euclid of Alexandria. Her geometry is jagged, but with a logic of its own and one that is easy to understand.
What does fractal geometry have to do with the distribution of wealth, the size of cities, returns in the financial markets, the number of casualties in war, or the size of planets?
The key here is that the fractal has numerical or statistical measures that are (somewhat) preserved across scales
Fractals should be the default, the approximation, the framework. They do not solve the Black Swan problem and do not turn all Black Swans into predictable events, but they significantly mitigate the Black Swan problem by making such large events conceivable.
It is no wonder that we run the biggest risk of all: we handle matters that belong to Extremistan, but treated as if they belonged to Mediocristan, as an “
I hope I’ve sufficiently drilled home the notion that, as a practitioner, my thinking is rooted in the belief that you cannot go from books to problems, but the reverse, from problems to books.
I worry far more about the “promising” stock market, particularly the “safe” blue chip stocks, than I do about speculative ventures—the former present invisible risks, the latter offer no surprises since you know how volatile they are and can limit your downside by investing smaller amounts.
I worry less about embarrassment than about missing an opportunity.
When people at cocktail parties asked me what I did for a living, I was tempted to answer, “I am a skeptical empiricist and a flâneur-reader, someone committed to getting very deep into an idea,” but I made things simple by saying that I was a limousine driver.
If I myself had to give advice, I would recommend someone pick a profession that is not scalable! A scalable profession is good only if you are successful; they are more competitive, produce monstrous inequalities, and are far more random, with huge disparities between efforts and rewards—a few can take a large share of the pie, leaving others out entirely through no fault of their own.
The success of movies depends severely on contagions. Such contagions do not just apply to the movies: they seem to affect a wide range of cultural products. It is hard for us to accept that people do not fall in love with works of art only for their own sake, but also in order to feel that they belong to a community. By imitating, we get closer to others—that is, other imitators. It fights solitude.
The American economy has leveraged itself heavily on idea generation, which explains why losing manufacturing jobs can be coupled with a rising standard of living.
I can state the supreme law of Mediocristan as follows: When your sample is large, no single instance will significantly change the aggregate or the total. The largest observation will remain impressive, but it will eventually be insignificant to the sum.
In Extremistan, inequalities are such that one single observation can disproportionately impact the aggregate, or the total.
So while weight, height, and calorie consumption are from Mediocristan, wealth is not. Almost all social matters are from Extremistan.
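The Mediocristan/Extremistan contrast can be sketched numerically: compare how much the single largest observation contributes to the total in a thin-tailed sample (heights) versus a fat-tailed one (wealth). The distributions and parameters below are illustrative choices of mine, not from the book.

```python
import random

random.seed(42)
N = 10_000

# Mediocristan: human heights in cm, roughly Gaussian.
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan: wealth, sketched as a Pareto draw (alpha = 1.2, an
# assumed value); fat tails mean one draw can dominate.
wealths = [random.paretovariate(1.2) for _ in range(N)]

def max_share(xs):
    """Fraction of the total contributed by the single largest observation."""
    return max(xs) / sum(xs)

print(f"tallest person's share of total height: {max_share(heights):.4%}")
print(f"richest person's share of total wealth: {max_share(wealths):.2%}")
```

The tallest person is a rounding error in the total; the richest draw can be a sizable slice of all the wealth in the sample.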
Look at the implication for the Black Swan. Extremistan can produce Black Swans, and does, since a few occurrences have had huge influences on history. This is the main idea of this book.
In Extremistan, one unit can easily affect the total in a disproportionate way. In this world, you should always be suspicious of the knowledge you derive from data.
Mediocristan is where we must endure the tyranny of the collective, the routine, the obvious, and the predicted; Extremistan is where we are subjected to the tyranny of the singular, the accidental, the unseen, and the unpredicted.
From the standpoint of the turkey, the nonfeeding of the one thousand and first day is a Black Swan. For the butcher, it is not, since its occurrence is not unexpected. So you can see here that the Black Swan is a sucker’s problem.
You realize that you can eliminate a Black Swan by science (if you’re able), or by keeping an open mind.
In general, positive Black Swans take time to show their effect while negative ones happen very quickly—it is much easier and much faster to destroy than to build.
Huet, who lived into his nineties, had a servant follow him with a book to read aloud to him during meals and breaks and thus avoid lost time.
Sometimes a lot of data can be meaningless; at other times one single piece of information can be very meaningful.
Among them figure chess grand masters, who, it has been shown, actually do focus on where a speculative move might be weak; rookies, by comparison, look for confirmatory instances instead of falsifying ones.
Scientists believe that it is the search for their own weaknesses that makes them good chess players, not the practice of chess that turns them into skeptics.
Similarly, the speculator George Soros, when making a financial bet, keeps looking for instances that would prove his initial theory wrong.
Both the artistic and scientific enterprises are the product of our need to reduce dimensions and inflict some order on things.
A novel, a story, a myth, or a tale, all have the same function: they spare us from the complexity of the world and shield us from its randomness.
Indeed, many severe psychological disorders accompany the feeling of loss of control of—being able to “make sense” of—one’s environment.
As Stalin, who knew something about the business of mortality, supposedly said, “One death is a tragedy; a million is a statistic.” Statistics stay silent in us.
System 1, the experiential one, is effortless, automatic, fast, opaque (we do not know that we are using it), parallel-processed, and can lend itself to errors. It is what we call “intuition,” and performs these quick acts of prowess that became popular under the name blink, after the title of Malcolm Gladwell’s bestselling book.
System 2, the cogitative one, is what we normally call thinking. It is what you use in a classroom, as it is effortful (even for Frenchmen), reasoned, slow, logical, serial, progressive, and self-aware (you can follow the steps in your reasoning).
Emotions are assumed to be the weapon System 1 uses to direct us and force us to act quickly.
Much of the trouble with human nature resides in our inability to use much of System 2, or to use it in a prolonged way without having to take a long beach vacation.
I’ll conclude by saying that our misunderstanding of the Black Swan can be largely attributed to our using System 1, i.e., narratives, and the sensational—as well as the emotional—which imposes on us a wrong map of the likelihood of events.
The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories.
We favor the sensational and the extremely visible. This affects the way we judge heroes. There is little room in our consciousness for heroes who do not deliver visible results—or those heroes who focus on process rather than results.
The problem of lumpy payoffs is not so much in the lack of income they entail, but the pecking order, the loss of dignity, the subtle humiliations near the watercooler.
So from a narrowly defined accounting point of view, which I may call here “hedonic calculus,” it does not pay to shoot for one large win. Mother Nature destined us to derive enjoyment from a steady flow of pleasant small, but frequent, rewards.
I presented the Black Swan as the outlier, the important event that is not expected to happen. But consider the opposite: the unexpected event that you very badly want to happen.
If you engage in a Black Swan–dependent activity, it is better to be part of a group.
In some strategies and life situations, you gamble dollars to win a succession of pennies while appearing to be winning all the time. In others, you risk a succession of pennies to win dollars. In other words, you bet either that the Black Swan will happen or that it will never happen, two strategies that require completely different mind-sets.
I will rapidly present Nero’s idea. His premise was the following trivial point: some business bets in which one wins big but infrequently, yet loses small but frequently, are worth making if others are suckers for them and if you have the personal and intellectual stamina.
People often accept that a financial strategy with a small chance of success is not necessarily a bad one as long as the success is large enough to justify it.
For a spate of psychological reasons, however, people have difficulty carrying out such a strategy, simply because it requires a combination of belief, a capacity for delayed gratification, and the willingness to be spat upon by clients without blinking.
Against that background of potential blowup disguised as skills, Nero engaged in a strategy that he called “bleed.” You lose steadily, daily, for a long time, except when some event takes place for which you get paid disproportionately well.
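Nero's "bleed" profile can be sketched as a toy simulation. Every number here (daily cost, payoff size, jump frequency) is invented for illustration; the point is the shape of the P&L, not the magnitudes.

```python
import random

random.seed(7)

# A stylized "bleed" P&L, not Nero's actual trades: pay a small premium
# every day, collect a large payoff on the rare jump day.
DAILY_COST = 1.0        # steady daily loss (the "bleed")
JUMP_PAYOFF = 800.0     # payoff when the rare event hits (assumed)
JUMP_PROB = 1 / 500     # roughly one jump every two trading years (assumed)

pnl, worst_streak, streak = 0.0, 0, 0
for day in range(5_000):            # about twenty trading years
    if random.random() < JUMP_PROB:
        pnl += JUMP_PAYOFF
        streak = 0
    else:
        pnl -= DAILY_COST
        streak += 1
        worst_streak = max(worst_streak, streak)

print(f"total P&L: {pnl:+.0f}")
print(f"longest losing streak: {worst_streak} days")
```

The losing streaks run for months or years, which is exactly the psychological burden described above: the strategy can have positive expectancy while looking like failure almost every single day.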
Another fallacy in the way we understand events is that of silent evidence. History hides both Black Swans and its Black Swan–generating ability from us.
One Diagoras, a nonbeliever in the gods, was shown painted tablets bearing the portraits of some worshippers who prayed, then survived a subsequent shipwreck. The implication was that praying protects you from drowning. Diagoras asked, “Where were the pictures of those who prayed, then drowned?”
We shall call this distortion a bias, i.e., the difference between what you see and what is there.
As drowned worshippers do not write histories of their experiences (it is better to be alive for that), so it is with the losers in history, whether people or ideas.
The entire notion of biography is grounded in the arbitrary ascription of a causal relation between specified traits and subsequent events.
I said that taking a “scalable” profession is not a good idea, simply because there are far too few winners in these professions. Well, these professions produce a large cemetery: the pool of starving actors is larger than the one of starving accountants, even if you assume that, on average, they earn the same income.
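The cemetery effect is easy to demonstrate: compute an average over everyone versus an average over only the visible winners. The numbers below are invented; only the direction of the bias matters.

```python
import random

random.seed(11)

# A toy model of the actors' cemetery: 10,000 aspiring actors draw
# career earnings from a winner-take-most (Pareto) distribution; the
# public only ever hears about the top 100. All figures are made up.
earnings = sorted((random.paretovariate(1.1) * 10_000
                   for _ in range(10_000)), reverse=True)

visible = earnings[:100]                 # the stars we read about
true_avg = sum(earnings) / len(earnings)
visible_avg = sum(visible) / len(visible)
print(f"average over everyone:      ${true_avg:,.0f}")
print(f"average over visible stars: ${visible_avg:,.0f}")
```

Judging the profession by its visible members inflates its apparent rewards by orders of magnitude, because the silent evidence—the starving majority—never makes it into the sample.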
Recall the confirmation fallacy: governments are great at telling you what they did, but not what they did not do. In fact, they engage in what could be labeled as phony “philanthropy,” the activity of helping people in a visible and sensational way without taking into account the unseen cemetery of invisible consequences.
Bastiat goes a bit deeper. If both the positive and the negative consequences of an action fell on its author, our learning would be fast. But often an action’s positive consequences benefit only its author, since they are visible, while the negative consequences, being invisible, apply to others, with a net cost to society.
Have the guts to consider the silent consequences when standing in front of the next snake-oil humanitarian.
We have enough evidence to confirm that, indeed, we humans are an extremely lucky species, and that we got the genes of the risk takers. The foolish risk takers, that is. In fact, the Casanovas who survived.
Once again, I am not dismissing the idea of risk taking, having been involved in it myself. I am only critical of the encouragement of uninformed risk taking.
The überpsychologist Danny Kahneman has given us evidence that we generally take risks not out of bravado but out of ignorance and blindness to probability!
But I insist on the following: that we got here by accident does not mean that we should continue to take the same risks
We have been playing Russian roulette; now let’s stop and get a real job.
I repeat that we are explanation-seeking animals who tend to think that everything has an identifiable cause and grab the most apparent one as the explanation.
The main identifiable reason for our survival of such diseases might simply be inaccessible to us: we are here since, Casanova-style, the “rosy” scenario played out, and if it seems too hard to understand it is because we are too brainwashed by notions of causality and we think that it is smarter to say because than to accept randomness.
My biggest problem with the educational system lies precisely in that it forces students to squeeze explanations out of subject matters and shames them for withholding judgment, for uttering the “I don’t know.”
Note here that I am not saying causes do not exist; do not use this argument to avoid trying to learn from history. All I am saying is that it is not so simple; be suspicious of the “because” and handle it with care—particularly in situations where you suspect silent evidence.
While the problem is very general, one of its nastiest illusions is what I call the ludic fallacy—the attributes of the uncertainty we face in real life have little connection to the sterilized ones we encounter in exams and games.
And it is why we have Black Swans and never learn from their occurrence, because the ones that did not happen were too abstract.
Alas, we are not manufactured, in our current edition of the human race, to understand abstract matters—we need context. Randomness and uncertainty are abstractions.
We respect what has happened, ignoring what could have happened.
I propose that if you want a simple step to a higher form of life, as distant from the animal as you can get, then you may have to denarrate, that is, shut down the television set, minimize time spent reading newspapers, ignore the blogs. Train your reasoning abilities to control your decisions; nudge System 1 (the heuristic or experiential system) out of the important ones. Train yourself to spot the difference between the sensational and the empirical. This insulation from the toxicity of the world will have an additional benefit: it will improve your well-being.
Also, bear in mind how shallow we are with probability, the mother of all abstract notions. You do not have to do much more in order to gain a deeper understanding of things around you. Above all, learn to avoid “tunneling.”
Prediction, not narration, is the real test of our understanding of the world.
I find it scandalous that in spite of the empirical record we continue to project into the future as if we were good at it, using tools and methods that exclude rare events.
The larger the role of the Black Swan, the harder it will be for us to predict. Sorry.
True, our knowledge does grow, but it is threatened by greater increases in confidence, which make our increase in knowledge at the same time an increase in confusion, ignorance, and conceit.
Epistemic arrogance bears a double effect: we overestimate what we know, and underestimate uncertainty, by compressing the range of possible uncertain states (i.e., by reducing the space of the unknown).
I remind the reader that I am not testing how much people know, but assessing the difference between what people actually know and how much they think they know.
The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.
The problem is that our ideas are sticky: once we produce a theory, we are not likely to change our minds—so those who delay developing their theories are better off.
Two mechanisms are at play here: the confirmation bias that we saw in Chapter 5, and belief perseverance, the tendency not to reverse opinions you already have. Remember that we treat ideas like possessions, and it will be hard for us to part with them.
Remember that we are swayed by the sensational. Listening to the news on the radio every hour is far worse for you than reading a weekly magazine, because the longer interval allows information to be filtered a bit.
I’ve struggled much of my life with the common middlebrow belief that “more is better”—more is sometimes, but not always, better.
No matter what anyone tells you, it is a good idea to question the error rate of an expert’s procedure.
Do not question his procedure, only his confidence.
There are some professions in which you know more than the experts, who are, alas, people for whose opinions you are paying—instead of them paying you to listen to them. Which ones?
On one hand, we are shown by a class of expert-busting researchers such as Paul Meehl and Robyn Dawes that the “expert” is the closest thing to a fraud, performing no better than a computer using a single metric, their intuition getting in the way and blinding them.
On the other hand, there is abundant literature showing that many people can beat computers thanks to their intuition. Which one is correct?
Would you rather have your upcoming brain surgery performed by a newspaper’s science reporter or by a certified brain surgeon? On the other hand, would you prefer to listen to an economic forecast by someone with a PhD in finance from some “prominent” institution such as the Wharton School, or by a newspaper’s business writer? While the answer to the first question is empirically obvious, the answer to the second one isn’t at all. We can already see the difference between “know-how” and “know-what.”
The Greeks made a distinction between technē and epistēmē. The empirical school of medicine of Menodotus of Nicomedia and Heraclites of Tarentum wanted its practitioners to stay closest to technē (i.e., “craft”), and away from epistēmē (i.e., “knowledge,” “science”).
Simply, things that move, and therefore require knowledge, do not usually have experts, while things that don’t move seem to have some experts.
In other words, professions that deal with the future and base their studies on the nonrepeatable past have an expert problem.
Another way to see it is that things that move are often Black Swan–prone.
Our predictors may be good at predicting the ordinary, but not the irregular, and this is where they ultimately fail.
Contrary to what people might expect, I am not recommending that anyone become a hedgehog—rather, be a fox with an open mind.
Makridakis and Hibon were to find out that the strong empirical evidence of their studies has been ignored by theoretical statisticians. Furthermore, they encountered shocking hostility toward their empirical verifications. “Instead [statisticians] have concentrated their efforts in building more sophisticated models without regard to the ability of such models to more accurately predict real-life data,” Makridakis and Hibon write.
Economics is perhaps the subject that currently has the highest number of philistine scholars—scholarship without erudition and natural curiosity can close your mind and lead to the fragmentation of disciplines.
In order to survive, institutions may need to give themselves and others the appearance of having a “vision.”
The unexpected has a one-sided effect with projects. Consider the track records of builders, paper writers, and contractors. The unexpected almost always pushes in a single direction: higher costs and a longer time to completion.
On very rare occasions, as with the Empire State Building, you get the opposite: shorter completion and lower costs—these occasions are becoming truly exceptional nowadays.
With projects of great novelty, such as a military invasion, an all-out war, or something entirely new, errors explode upward.
As you see, the longer you wait, the longer you will be expected to wait.
The Arab-Israeli conflict is sixty years old, and counting—yet it was considered “a simple problem” sixty years ago. (Always remember that, in a modern environment, wars last longer and kill more people than is typically planned.)
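The claim that the longer you wait, the longer you should expect to keep waiting separates memoryless waiting times from scalable ones. A quick Monte Carlo sketch makes the point; both distributions and their parameters are illustrative picks of mine, not from the book.

```python
import random

random.seed(1)
N = 200_000

# Mediocristan-style wait: exponential with mean 10 (memoryless).
expo = [random.expovariate(1 / 10) for _ in range(N)]
# Extremistan-style wait: Pareto with alpha = 1.5, so the mean exists
# but the tail is fat.
pareto = [random.paretovariate(1.5) for _ in range(N)]

def extra_wait(samples, t):
    """Average remaining wait, given you have already waited past t."""
    survivors = [x - t for x in samples if x > t]
    return sum(survivors) / len(survivors)

for t in (2, 20):
    print(f"already waited {t:>2}: exponential expects "
          f"{extra_wait(expo, t):6.1f} more, Pareto expects "
          f"{extra_wait(pareto, t):6.1f} more")
```

For the exponential, the expected extra wait stays flat no matter how long you have waited; for the fat-tailed wait, it grows roughly in proportion to the time already elapsed—the longer a war has lasted, the longer it should be expected to last.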
There are those people who produce forecasts uncritically. When asked why they forecast, they answer, “Well, that’s what we’re paid to do here.” My suggestion: get another job.
People who are trapped in their jobs who forecast simply because “that’s my job,” knowing pretty well that their forecast is ineffectual, are not what I would call ethical. What they do is no different from repeating lies simply because “it’s my job.”
Anyone who causes harm by forecasting should be treated as either a fool or a liar. Some forecasters cause more damage to society than criminals. Please, don’t drive a school bus blindfolded.
I have said that the Black Swan has three attributes: unpredictability, consequences, and retrospective explainability.
We’ve seen that a) we tend to both tunnel and think “narrowly” (epistemic arrogance), and b) our prediction record is highly overestimated—many people who think they can predict actually can’t.
The managers sat down to brainstorm during these meetings, about, of course, the medium-term future—they wanted to have “vision.” But then an event occurred that was not in the previous five-year plan: the Black Swan of the Russian financial default of 1998 and the accompanying meltdown of the values of Latin American debt markets. It had such an effect on the firm that, although the institution had a sticky employment policy of retaining managers, none of the five was still employed there a month after the sketch of the 1998 five-year plan. Yet I am confident that today their replacements are still meeting to work on the next “five-year plan.” We never learn.
The classical model of discovery is as follows: you search for what you know (say, a new way to reach India) and find something you didn’t know was there (America).
If you think that the inventions we see around us came from someone sitting in a cubicle and concocting them according to a timetable, think again: almost everything of the moment is the product of serendipity.
In other words, you find something you are not looking for and it changes the world, while wondering after its discovery why it “took so long” to arrive at something so obvious.
Sir Francis Bacon commented that the most important advances are the least predictable ones, those “lying out of the path of the imagination.
We forget about unpredictability when it is our turn to predict. This is why people can read this chapter and similar accounts, agree entirely with them, yet fail to heed their arguments when thinking about the future.
True, Fleming was looking for “something,” but the actual discovery was simply serendipitous.
Viagra, which changed the mental outlook and social mores of retired men, was meant to be a hypertension drug. Another hypertension drug led to a hair-growth medication. My friend Bruce Goldberg, who understands randomness, calls these unintended side applications “corners.” While many worry about unintended consequences, technology adventurers thrive on them.
Louis Pasteur’s adage about creating luck by sheer exposure. “Luck favors the prepared,” Pasteur said, and, like all great discoverers, he knew something about accidental discoveries.
The best way to get maximal exposure is to keep researching.
Popper’s central argument is that in order to predict historical events you need to predict technological innovation, itself fundamentally unpredictable.
I’ll summarize my argument here: Prediction requires knowing about technologies that will be discovered in the future. But that very knowledge would almost automatically allow us to start developing those technologies right away. Ergo, we do not know what we will know.
To see how our intuitions about these nonlinear multiplicative effects are rather weak, consider this story about the chessboard. The inventor of the chessboard requested the following compensation: one grain of rice for the first square, two for the second, four for the third, eight, then sixteen, and so on, doubling every time, sixty-four times. The king granted this request, thinking that the inventor was asking for a pittance—but he soon realized that he was outsmarted. The amount of rice exceeded all possible grain reserves!
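The chessboard arithmetic is easy to verify in a couple of lines; the per-grain weight is an assumed round figure for scale only.

```python
# The chessboard request: 1 + 2 + 4 + ... + 2**63 grains of rice.
total_grains = sum(2**k for k in range(64))   # = 2**64 - 1
print(f"{total_grains:,} grains")

# At roughly 25 mg per grain of rice (an assumed figure):
tonnes = total_grains * 25e-6 / 1e6           # grams -> tonnes
print(f"≈ {tonnes:.2e} tonnes of rice")
```

That works out to several hundred million tonnes—on the order of a full year of modern world rice production—which is why the king found himself outsmarted.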
Forecasting the motion of a billiard ball on a pool table requires knowledge of the dynamics of the entire universe, down to every single atom!
We are not predisposed to respect humble people, those who try to suspend judgment.
Now contemplate epistemic humility. Think of someone heavily introspective, tortured by the awareness of his own ignorance. He lacks the courage of the idiot, yet has the rare guts to say “I don’t know.” He does not mind looking like a fool or, worse, an ignoramus. He hesitates, he will not commit, and he agonizes over the consequences of being wrong. He introspects, introspects, and introspects until he reaches physical and nervous exhaustion.
The bottom line: be prepared! Narrow-minded prediction has an analgesic or therapeutic effect. Be aware of the numbing effect of magic numbers. Be prepared for all relevant eventualities.
So the second lesson is more aggressive: you can actually take advantage of the problem of prediction and epistemic arrogance! As a matter of fact, I suspect that the most successful businesses are precisely those that know how to work around inherent unpredictability and even exploit it.
Aside from the movies, examples of positive–Black Swan businesses are: some segments of publishing, scientific research, and venture capital.
Diplomats understand that very well: casual chance discussions at cocktail parties usually lead to big breakthroughs—not dry correspondence or telephone conversations.
The Achilles’ heel of capitalism is that if you make corporations compete, it is sometimes the one that is most exposed to the negative Black Swan that will appear to be the most fit for survival.
Indeed, the notion of asymmetric outcomes is the central idea of this book: I will never get to know the unknown since, by definition, it is unknown. However, I can always guess how it might affect me, and I should base my decisions around that.
I do not know whether God exists, but I know that I have nothing to gain from being an atheist if he does not exist, whereas I have plenty to lose if he does. Hence, this justifies my belief in God.
I don’t know the odds of an earthquake, but I can imagine how San Francisco might be affected by one. This idea that in order to make a decision you need to focus on the consequences (which you can know) rather than the probability (which you can’t know) is the central idea of uncertainty. Much of my life is based on it.
As I said, if my portfolio is exposed to a market crash, the odds of which I can’t compute, all I have to do is buy insurance, or get out and invest the amounts I am not willing to ever lose in less risky securities.
The more I think about my subject, the more I see evidence that the world we have in our minds is different from the one playing outside.
Every morning the world appears to me more random than it did the day before, and humans seem to be even more fooled by it than they were the previous day. It is becoming unbearable. I find writing these lines painful; I find the world revolting.
In sociology, Matthew effects bear the less literary name “cumulative advantage.” This theory can easily apply to companies, businessmen, actors, writers, and anyone else who benefits from past success. If you get published in The New Yorker because the color of your letterhead attracted the attention of the editor, who was daydreaming of daisies, the resultant reward can follow you for life. More significantly, it will follow others for life. Failure is also cumulative; losers are likely to also lose in the future, even if we don’t take into account the mechanism of demoralization that might exacerbate it and cause additional failure.
The increased concentration among banks seems to have the effect of making financial crises less likely, but when they happen they are more global in scale and hit us very hard.
I rephrase here: we will have fewer but more severe crises.
Measures of uncertainty that are based on the bell curve simply disregard the possibility, and the impact, of sharp jumps or discontinuities and are, therefore, inapplicable in Extremistan. Using them is like focusing on the grass and missing out on the (gigantic) trees.
Although unpredictable large deviations are rare, they cannot be dismissed as outliers because, cumulatively, their impact is so dramatic.
The traditional Gaussian way of looking at the world begins by focusing on the ordinary, and then deals with exceptions or so-called outliers as ancillaries. But there is a second way, which takes the exceptional as a starting point and treats the ordinary as subordinate.
We can make good use of the Gaussian approach in variables for which there is a rational reason for the largest not to be too far away from the average.
If there is gravity pulling numbers down, or if there are physical limitations preventing very large observations, we end up in Mediocristan.
This is why much of economics is based on the notion of equilibrium: among other benefits, it allows you to treat economic phenomena as Gaussian.
Let me show you how the Gaussian bell curve sucks randomness out of life—which is why it is popular. We like it because it allows certainties! How? Through averaging, as I will discuss next.
When you have plenty of gamblers, no single gambler will impact the total more than minutely. The consequence of this is that variations around the average of the Gaussian, also called “errors,” are not truly worrisome. They are small and they wash out. They are domesticated fluctuations around the mean.
Mountains are not triangles or pyramids; trees are not circles; straight lines are almost never seen anywhere. Mother Nature did not attend high school geometry courses or read the books of Euclid of Alexandria. Her geometry is jagged, but with a logic of its own and one that is easy to understand.
What does fractal geometry have to do with the distribution of wealth, the size of cities, returns in the financial markets, the number of casualties in war, or the size of planets?
The key here is that the fractal has numerical or statistical measures that are (somewhat) preserved across scales.
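That scale invariance can be made concrete with a pure power-law tail (alpha = 1.5 here is just an illustrative value): the ratio of exceedance probabilities when you double the threshold is the same no matter what scale you start from.

```python
# Survival function P(X > t) for a Pareto tail with minimum 1.
ALPHA = 1.5

def survival(t, alpha=ALPHA):
    return t ** -alpha

# Doubling the threshold divides the tail probability by the same
# factor (2**ALPHA) at every scale — no preferred scale, i.e. fractal.
for t in (10, 100, 1000):
    print(f"P(X>{2*t}) / P(X>{t}) = {survival(2*t) / survival(t):.3f}")
```

A Gaussian has no such property: doubling the threshold out in its tail crushes the probability ever more violently, which is why it rules out large deviations that power laws leave conceivable.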
Fractals should be the default, the approximation, the framework. They do not solve the Black Swan problem and do not turn all Black Swans into predictable events, but they significantly mitigate the Black Swan problem by making such large events conceivable.
It is no wonder that we run the biggest risk of all: we handle matters that belong to Extremistan, but treat them as if they belonged to Mediocristan, as an “approximation.”
I hope I’ve sufficiently drilled home the notion that, as a practitioner, my thinking is rooted in the belief that you cannot go from books to problems, but the reverse, from problems to books.
I worry far more about the “promising” stock market, particularly the “safe” blue chip stocks, than I do about speculative ventures—the former present invisible risks, the latter offer no surprises since you know how volatile they are and can limit your downside by investing smaller amounts.
I worry less about embarrassment than about missing an opportunity.