Okay, this book is one that I believe every person should read. If you want to read it, and you don't have access to a copy from your library, in paper, digital, or audiobook format, and I know you in some way, I will loan you my copy or buy you one. If you were my older brother, I would express ship this book to you, as I believe you would benefit greatly from it.
Taleb talks about how statistics lie, but specifically how events so far outside of the normal, or our experience, cannot be predicted. These Black Swan events, the rare occurrences that can't be foreseen, are precisely the ones with the hugest influence on our lives.
And he goes into a number of logical fallacies that everyone should know, but most people don't. He shows how even when we think we're aware of them, we often aren't. Which really means we're human. And fallible.
One of the features of this book that I found annoying was the self-references to "this book." I'm not a fan of the "In this book, I am going to describe" style of writing, or the "hey, I'm going to mention this thing, but not talk about it until later" way of introducing related topics. It's how this book is written, and while I find it annoying, once I accepted it (after the second occurrence), it was fine.
Again, I strongly recommend this book. Let me buy you a copy.
The central idea of this book concerns our blindness with respect to randomness, particularly the large deviations: Why do we, scientists or nonscientists, hotshots or regular Joes, tend to see the pennies instead of the dollars? Why do we keep focusing on the minutiae, not the possible significant large events, in spite of the obvious evidence of their huge influence?
See? Even before the prologue is fully underway, we have "this book." I am giggling at it already.
Isn’t it strange to see an event happening precisely because it was not supposed to happen?
Think about the “secret recipe” to making a killing in the restaurant business. If it were known and obvious, then someone next door would have already come up with the idea and it would have become generic.
Taleb is talking about how we justify things looking backward, a hindsight fallacy of sorts.
Consider the Indian Ocean tsunami of December 2004. Had it been expected, it would not have caused the damage it did— the areas affected would have been less populated, an early warning system would have been put in place. What you know cannot really hurt you.
What is surprising is not the magnitude of our forecast errors, but our absence of awareness of it. This is all the more worrisome when we engage in deadly conflicts: wars are fundamentally unpredictable (and we do not know it).
And BOOM. This.
We will see that, contrary to social-science wisdom, almost no discovery, no technologies of note, came from design and planning— they were just Black Swans. The strategy for the discoverers and entrepreneurs is to rely less on top-down planning and focus on maximum tinkering and recognizing opportunities when they present themselves.
Who gets rewarded, the central banker who avoids a recession or the one who comes to “correct” his predecessors’ faults and happens to be there during some economic recovery? Who is more valuable, the politician who avoids a war or the one who starts a new one (and is lucky enough to win)?
This question is illustrative of a fundamental problem of incentives. One isn't incentivized to prevent ills, one is incentivized to fix them.
What I call Platonicity, after the ideas (and personality) of the philosopher Plato, is our tendency to mistake the map for the territory, to focus on pure and well-defined “forms,” whether objects, like triangles, or social notions, like utopias (societies built according to some blueprint of what “makes sense”), even nationalities. When these ideas and crisp constructs inhabit our minds, we privilege them over other less elegant objects, those with messier and less tractable structures (an idea that I will elaborate progressively throughout this book).
Note that I am not relying in this book on the beastly method of collecting selective “corroborating evidence.” For reasons I explain in Chapter 5, I call this overload of examples naïve empiricism— successions of anecdotes selected to fit a story do not constitute evidence. Anyone looking for confirmation will find enough of it to deceive himself— and no doubt his peers.* The Black Swan idea is based on the structure of randomness in empirical reality.
O. M. G. A book NOT in Gladwell's style? SIGN. ME. UP.
Also, likely part of the reason I like this book so much. It doesn't rely on anecdotes to prove things. It uses stories to move the book along and tie different elements together, but not as "proof" of things.
The writer Umberto Eco belongs to that small class of scholars who are encyclopedic, insightful, and nondull. He is the owner of a large personal library (containing thirty thousand books), and separates visitors into two categories: those who react with “Wow! Signore professore dottore Eco, what a library you have! How many of these books have you read?” and the others— a very small minority— who get the point that a private library is not an ego-boosting appendage but a research tool. Read books are far less valuable than unread ones. The library should contain as much of what you do not know as your financial means, mortgage rates, and the currently tight real-estate market allow you to put there. You will accumulate more knowledge and more books as you grow older, and the growing number of unread books on the shelves will look at you menacingly. Indeed, the more you know, the larger the rows of unread books. Let us call this collection of unread books an antilibrary.
Um... yes, that's what I'm building.
We thus managed to live in peace for more than a millennium almost devoid of bloodshed: our last true problem was the later troublemaking crusaders, not the Moslem Arabs. The Arabs, who seemed interested only in warfare (and poetry) and, later, the Ottoman Turks, who seemed only concerned with warfare (and pleasure), left to us the uninteresting pursuit of commerce and the less dangerous one of scholarship (like the translation of Aramaic and Greek texts).
I recall being at the center of the riot, and feeling a huge satisfaction upon my capture while my friends were scared of both prison and their parents. We frightened the government so much that we were granted amnesty.
Had I concealed my participation in the riot (as many friends did) and been discovered, instead of being openly defiant, I am certain that I would have been treated as a black sheep. It is one thing to be cosmetically defiant of authority by wearing unconventional clothes— what social scientists and economists call “cheap signaling”— and another to prove willingness to translate belief into action.
It may be that the invention of gunfire and powerful weapons turned what, in the age of the sword, would have been just tense conditions into a spiral of uncontrollable tit-for-tat warfare.
This duration blindness in the middle-aged exile is quite a widespread disease. Later, when I decided to avoid the exile’s obsession with his roots (exiles’ roots penetrate their personalities a bit too deeply), I studied exile literature precisely to avoid the traps of a consuming and obsessive nostalgia. These exiles seemed to have become prisoners of their memory of idyllic origin— they sat together with other prisoners of the past and spoke about the old country, and ate their traditional food while some of their folk music played in the background. They continuously ran counterfactuals in their minds, generating alternative scenarios that could have happened and prevented these historical ruptures, such as “if the Shah had not named this incompetent man as prime minister, we would still be there.” It was as if the historical rupture had a specific cause, and that the catastrophe could have been averted by removing that specific cause. So I pumped every displaced person I could find for information on their behavior during exile. Almost all act in the same way.
I felt closer to my roots during times of trouble and experienced the urge to come back and show support to those left behind who were often demoralized by the departures— and envious of the fair-weather friends who could seek economic and personal safety only to return for vacations during these occasional lulls in the conflict. I was unable to work or read when I was outside Lebanon while people were dying, but, paradoxically, I was less concerned by the events and able to pursue my intellectual interests guilt-free when I was inside Lebanon.
Categorizing is necessary for humans, but it becomes pathological when the category is seen as definitive, preventing people from considering the fuzziness of boundaries, let alone revising their categories.
The problem of overcausation does not lie with the journalist, but with the public. Nobody would pay one dollar to buy a series of abstract statistics reminiscent of a boring college lecture. We want to be told stories, and there is nothing wrong with that— except that we should check more thoroughly whether the story provides consequential distortions of reality.
We learn from repetition — at the expense of events that have not happened before. Events that are nonrepeatable are ignored before their occurrence, and overestimated after (for a while). After a Black Swan, such as September 11, 2001, people expect it to recur when in fact the odds of that happening have arguably been lowered. We like to think about specific and known Black Swans when in fact the very nature of randomness lies in its abstraction.
Update: this is what is happening to my older brother. CLEAR TEXTBOOK CASE.
We are often told that we humans have an optimistic bent, and that it is supposed to be good for us. This argument appears to justify general risk taking as a positive enterprise, and one that is glorified in the common culture. Hey, look, our ancestors took the challenges— while you, NNT, are encouraging us to do nothing (I am not). We have enough evidence to confirm that, indeed, we humans are an extremely lucky species, and that we got the genes of the risk takers. The foolish risk takers, that is. In fact, the Casanovas who survived.
The reference point argument is as follows: do not compute odds from the vantage point of the winning gambler (or the lucky Casanova, or the endlessly bouncing back New York City, or the invincible Carthage), but from all those who started in the cohort.
I wish people would understand this about statistics, luck, and experiments. You can't look only at the successes; you have to look at the failures, too.
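To make the cohort point concrete, here's a tiny simulation of my own (a sketch, not from the book): give a cohort of "traders" nothing but coin flips, then look only at the ones who never had a losing year. Their track records look like pure skill, yet it's luck all the way down.

```python
import random

random.seed(42)

# Hypothetical sketch: 10,000 "traders" whose yearly result is a fair
# coin flip. After 10 years, count the survivors who won every single
# year -- the lucky Casanovas of the cohort.
COHORT, YEARS = 10_000, 10

survivors = sum(
    1 for _ in range(COHORT)
    if all(random.random() < 0.5 for _ in range(YEARS))
)

# By chance alone we expect roughly COHORT / 2**YEARS ~ 10 flawless records.
print(f"{survivors} of {COHORT} traders won every year on luck alone")
```

Judge those ten from the vantage point of the ten, and they look like geniuses. Judge them from the vantage point of the full ten thousand who started, and they look like what they are: the tail of a coin-flip distribution.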
We have to accept the fuzziness of the familiar “because” no matter how queasy it makes us feel (and it does make us queasy to remove the analgesic illusion of causality). I repeat that we are explanation-seeking animals who tend to think that everything has an identifiable cause and grab the most apparent one as the explanation. Yet there may not be a visible because; to the contrary, frequently there is nothing, not even a spectrum of possible explanations. But silent evidence masks this fact. Whenever our survival is in play, the very notion of because is severely weakened. The condition of survival drowns all possible explanations.
And it is why we have Black Swans and never learn from their occurrence, because the ones that did not happen were too abstract.
We love the tangible, the confirmation, the palpable, the real, the visible, the concrete, the known, the seen, the vivid, the visual, the social, the embedded, the emotionally laden, the salient, the stereotypical, the moving, the theatrical, the romanced, the cosmetic, the official, the scholarly-sounding verbiage (b******t), the pompous Gaussian economist, the mathematicized crap, the pomp, the Académie Française, Harvard Business School, the Nobel Prize, dark business suits with white shirts and Ferragamo ties, the moving discourse, and the lurid. Most of all we favor the narrated.
Alas, we are not manufactured, in our current edition of the human race, to understand abstract matters— we need context. Randomness and uncertainty are abstractions. We respect what has happened, ignoring what could have happened. In other words, we are naturally shallow and superficial— and we do not know it. This is not a psychological problem; it comes from the main property of information. The dark side of the moon is harder to see; beaming light on it costs energy. In the same way, beaming light on the unseen is costly in both computational and mental effort.
First, we are demonstrably arrogant about what we think we know. We certainly know a lot, but we have a built-in tendency to think that we know a little bit more than we actually do, enough of that little bit to occasionally get into serious trouble. We shall see how you can verify, even measure, such arrogance in your own living room.
I remind the reader that I am not testing how much people know, but assessing the difference between what people actually know and how much they think they know.
The simple test above suggests the presence of an ingrained tendency in humans to underestimate outliers— or Black Swans. Left to our own devices, we tend to think that what happens every decade in fact only happens once every century, and, furthermore, that we know what’s going on.
… additional knowledge of the minutiae of daily business can be useless, even actually toxic …
Show two groups of people a blurry image of a fire hydrant, blurry enough for them not to recognize what it is. For one group, increase the resolution slowly, in ten steps. For the second, do it faster, in five steps. Stop at a point where both groups have been presented an identical image and ask each of them to identify what they see. The members of the group that saw fewer intermediate steps are likely to recognize the hydrant much faster. Moral? The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.
Our forecast errors have traditionally been enormous, and there may be no reasons for us to believe that we are suddenly in a more privileged position to see into the future compared to our blind predecessors. Forecasting by bureaucrats tends to be used for anxiety relief rather than for adequate policy making.
Even if you agree with a given forecast, you have to worry about the real possibility of significant divergence from it.
We build toys. Some of those toys change the world.
What we call here soft historical sciences are narrative dependent studies. Popper’s central argument is that in order to predict historical events you need to predict technological innovation, itself fundamentally unpredictable. “Fundamentally” unpredictable? I will explain what he means using a modern framework. Consider the following property of knowledge: If you expect that you will know tomorrow with certainty that your boyfriend has been cheating on you all this time, then you know today with certainty that your boyfriend is cheating on you and will take action today, say, by grabbing a pair of scissors and angrily cutting all his Ferragamo ties in half. You won’t tell yourself, This is what I will figure out tomorrow, but today is different so I will ignore the information and have a pleasant dinner. This point can be generalized to all forms of knowledge. There is actually a law in statistics called the law of iterated expectations, which I outline here in its strong form: if I expect to expect something at some date in the future, then I already expect that something at present.
To me utopia is an epistemocracy, a society in which anyone of rank is an epistemocrat, and where epistemocrats manage to be elected. It would be a society governed from the basis of the awareness of ignorance, not knowledge. Alas, one cannot assert authority by accepting one’s own fallibility. Simply, people need to be blinded by knowledge— we are made to follow leaders who can gather people together because the advantages of being in groups trump the disadvantages of being alone. It has been more profitable for us to bind together in the wrong direction than to be alone in the right one.
But if you are dealing with aggregates, where magnitudes do matter, such as income, your wealth, return on a portfolio, or book sales, then you will have a problem and get the wrong distribution if you use the Gaussian, as it does not belong there. One single number can disrupt all your averages; one single loss can eradicate a century of profits.
We have moved from a simple bet to something completely abstract. We have moved from observations into the realm of mathematics. In mathematics things have a purity to them.
Likewise, the Gaussian bell curve is set so that 68.2 percent of the observations fall between minus one and plus one standard deviations away from the average. I repeat: do not even try to understand whether standard deviation is average deviation— it is not, and a large (too large) number of people using the word standard deviation do not understand this point. Standard deviation is just a number that you scale things to, a matter of mere correspondence if phenomena were Gaussian.
Recall our discussions in Chapter 14 on preferential attachment and cumulative advantage. Both theories assert that winning today makes you more likely to win in the future. Therefore, probabilities are dependent on history, and the first central assumption leading to the Gaussian bell curve fails in reality. In games, of course, past winnings are not supposed to translate into an increased probability of future gains— but not so in real life, which is why I worry about teaching probability from games. But when winning leads to more winning, you are far more likely to see forty wins in a row than with a proto-Gaussian.
Being on the receiving end of angry insults is not that bad; you can get quickly used to it and focus on what is not said. Pit traders are trained to handle angry rants. If you work in the chaotic pits, someone in a particularly bad mood from losing money might start cursing at you until he injures his vocal cords, then forget about it and, an hour later, invite you to his Christmas party. So you become numb to insults, particularly if you teach yourself to imagine that the person uttering them is a variant of a noisy ape with little personal control. Just keep your composure, smile, focus on analyzing the speaker not the message, and you’ll win the argument. An ad hominem attack against an intellectual, not against an idea, is highly flattering. It indicates that the person does not have anything intelligent to say about your message.
The only comment I found unacceptable was, “You are right; we need you to remind us of the weakness of these methods, but you cannot throw the baby out with the bath water,” meaning that I needed to accept their reductive Gaussian distribution while also accepting that large deviations could occur— they didn’t realize the incompatibility of the two approaches. It was as if one could be half dead. Not one of these users of portfolio theory, in twenty years of debates, explained how they could accept the Gaussian framework as well as large deviations. Not one.
Along the way I saw enough of the confirmation error to make Karl Popper stand up with rage. People would find data in which there were no jumps or extreme events, and show me a “proof” that one could use the Gaussian.
The entire statistical business confused absence of proof with proof of absence.
Furthermore, people did not understand the elementary asymmetry involved: you need one single observation to reject the Gaussian, but millions of observations will not fully confirm the validity of its application. Why? Because the Gaussian bell curve disallows large deviations, but tools of Extremistan, the alternative, do not disallow long quiet stretches.
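The asymmetry is worth seeing in numbers. Here's a small sketch of my own showing why a single extreme observation is fatal to the Gaussian: under the bell curve, the probability of a very large deviation isn't just small, it's absurdly small, so observing even one such event is devastating evidence against the model.

```python
from math import erfc, sqrt

def gaussian_tail(k: float) -> float:
    """Probability that a standard normal lands k or more sd's above the mean.

    Uses erfc rather than 1 - erf to avoid floating-point underflow
    for large k.
    """
    return 0.5 * erfc(k / sqrt(2))

# Large one-day market moves (10+ standard deviations) have actually
# happened, yet the bell curve calls a single such event essentially
# impossible:
for k in (3, 5, 10):
    print(f"P(move >= {k} sd) = {gaussian_tail(k):.2e}")
```

A 3-sigma event has odds around one in seven hundred; a 10-sigma event has odds on the order of one in 10^23. See one of those, and the Gaussian is dead. But see millions of quiet days in a row, and you've confirmed nothing, because the fat-tailed alternatives allow long quiet stretches too.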
Now, elegant mathematics has this property: it is perfectly right, not 99 percent so. This property appeals to mechanistic minds who do not want to deal with ambiguities. Unfortunately you have to cheat somewhere to make the world fit perfect mathematics; and you have to fudge your assumptions somewhere.
This is where you learn from the minds of military people and those who have responsibilities in security. They do not care about “perfect” ludic reasoning; they want realistic ecological assumptions. In the end, they care about lives.
I am most often irritated by those who attack the bishop but somehow fall for the securities analyst— those who exercise their skepticism against religion but not against economists, social scientists, and phony statisticians. Using the confirmation bias, these people will tell you that religion was horrible for mankind by counting deaths from the Inquisition and various religious wars. But they will not show you how many people were killed by nationalism, social science, and political theory under Stalinism or during the Vietnam War. Even priests don’t go to bishops when they feel ill: their first stop is the doctor’s.
Half the time I hate Black Swans, the other half I love them. I like the randomness that produces the texture of life, the positive accidents, the success of Apelles the painter, the potential gifts you do not have to pay for. Few understand the beauty in the story of Apelles; in fact, most people exercise their error avoidance by repressing the Apelles in them.
I worry less about small failures, more about large, potentially terminal ones.
I worry less about advertised and sensational risks, more about the more vicious hidden ones. I worry less about terrorism than about diabetes, less about matters people usually worry about because they are obvious worries, and more about matters that lie outside our consciousness and common discourse.
I worry less about embarrassment than about missing an opportunity.
Snub your destiny. I have taught myself to resist running to keep on schedule. This may seem a very small piece of advice, but it registered. In refusing to run to catch trains, I have felt the true value of elegance and aesthetics in behavior, a sense of being in control of my time, my schedule, and my life. Missing a train is only painful if you run after it! Likewise, not matching the idea of success others expect from you is only painful if that’s what you are seeking.
You stand above the rat race and the pecking order, not outside of it, if you do so by choice.
Quitting a high-paying position, if it is your decision, will seem a better payoff than the utility of the money involved (this may seem crazy, but I’ve tried it and it works). This is the first step toward the stoic’s throwing a four-letter word at fate. You have far more control over your life if you decide on your criterion by yourself.
Mother Nature has given us some defense mechanisms: as in Aesop’s fable, one of these is our ability to consider that the grapes we cannot (or did not) reach are sour. But an aggressively stoic prior disdain and rejection of the grapes is even more rewarding. Be aggressive; be the one to resign, if you have the guts.
It is more difficult to be a loser in a game you set up yourself. In Black Swan terms, this means that you are exposed to the improbable only if you let it control you. You always control what you do; so make this your end.