I was recommended this book some time ago by a friend, and after checking the summary I added it to my list of books to read right away. This is the first book I have tried to read using the “How to Read a Book” method, so take my opinion with an extra grain of salt: my experience would probably have been very different if I had read it the usual way.

So, the executive summary would be that the ideas in the book are quite interesting, but it’s way too long and often, in my opinion, annoying to read due to the author’s arrogance (you can probably imagine what I mean by looking at his website “Fooled By Randomness”).

The rest of this post is my random notes, which sort of serve as a summary. They’re meant mostly for myself (or at least for someone who has actually read the book) and are probably fairly bad, but hey, it’s the first book I’ve read like this, so bear with me. If you haven’t read the book and want to read them anyway, you should at least know what a “Black Swan” is: an event that is basically unpredictable and changes the world in a substantial way. Just go to Wikipedia and check it out.

  • From page 8: History is opaque. Reasons:
    • Illusion of understanding: the world is more complex and random than everyone thinks.
    • Retrospective distortion: we assess matters after the fact and look for tidy, regular explanations.
    • Overvaluation of factual information and experts: we “Platonify” the world.
  • Page 12 (about the second point above): history makes jumps, not small increments of change. Yet we believe in nice, tidy, incremental changes.
  • Page 30: about scalable vs. non-scalable jobs (writer vs. nurse; whether you get paid for your time or not): “talent” comes from success, not the other way around.
  • Page 49: the book is not about avoiding risks, but about knowing which ones to take and knowing what we don’t know.
  • Page 50: Black Swan blindness, related themes:
    • We focus on preselected segments and generalise from them: confirmation error.
    • We believe in tidy explanations: the narrative fallacy.
    • We behave as if Black Swans don’t exist.
    • We don’t see all that’s there: we hide Black Swans under other explanations.
    • We “tunnel”: we focus on well-defined sources of information.
  • Page 58, about the “confirmation error” above. Experiment: given 2, 4 and 6, people are asked to guess the rule these numbers follow. Each person can propose any number of three-number sequences, and the experimenters will say whether each one follows the rule. People tend to first build a theory and then only propose sequences that confirm it, so most never discover that the rule is simply “ascending numbers” (a toy simulation of this is sketched after these notes).
  • Page 71, about people behaving as if Black Swans didn’t exist: every time we remember something, we change the story a little. We renarrate the past to make it “more logical”.
  • Page 114, about hiding Black Swans under other explanations: We don’t hear the stories of the non-successful, so the information we have comes mostly from the lucky ones.
  • Page 120, still about the same topic: When survival is in play, we look for cause and effect. We believe in the “because” and not in randomness. It may have been just luck, but we always try to find a cause.
  • Page 138 has a summary of chapter 10 (all notes up to and including page 158 belong to this chapter). There are two main topics in this chapter: (a) we are arrogant about what we think we know, and (b) that has implications for prediction. Why do we predict so much, even though we know we make so many mistakes?
  • Page 144. Ideas are sticky: once we have a theory, it’s hard to change our minds, and we have trouble interpreting information that contradicts our opinions. Experiment with horse-race predictions: people made predictions knowing the 10 most useful variables; when given extra variables, the accuracy of their predictions didn’t increase, but their confidence in them did.
  • Page 151: When you predict wrong, you tend to think you couldn’t have known because the failure came from an aspect you don’t know that well (e.g. someone with an excellent knowledge of Soviet political workings who failed to predict the fall of the Soviet Union can tell themselves it happened for economic reasons, so it couldn’t have been predicted).
  • Page 158: We anchor: when we see a number before making a prediction, even if it’s random and we know it, we make predictions “close” to that number. This, by the way, I had read before, I think in “Predictably Irrational”.
  • Page 203: Advice: be human, admit your arrogance and ignorance. Avoid large scale, harmful predictions.
  • Page 205: Advice: put 85-90% of your resources in something very low risk, and 10-15% in something very high risk. Avoid “medium risks” (a rough illustration of why is sketched after these notes).
  • Page 207: Closing tricks:
    • Distinguish between positive and negative contingencies. When your loss is limited, be as aggressive, speculative and “unreasonable” as you can.
    • Don’t look for the precise and local. Don’t be narrow-minded. Do not try to predict precise Black Swans. Invest in preparedness, not prediction. Infinite vigilance is not possible.
    • Seize any opportunity, or anything that looks like one. They’re much rarer than people think.
    • Beware of precise plans by governments.
    • Do not waste time trying to fight forecasters.
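
Two of the notes above lend themselves to a quick illustration in code. First, the 2-4-6 experiment from page 58. This is just a minimal sketch of the setup as I understood it; the hidden rule and the two “subjects” are my own toy reconstruction, not anything from the book:

```python
# Toy reconstruction of the 2-4-6 experiment (page 58 note above).
# The rule and the two "subjects" are my own illustration.

def hidden_rule(seq):
    """The experimenters' secret rule: strictly ascending numbers."""
    a, b, c = seq
    return a < b < c

# A "confirming" subject theorises "numbers increasing by 2" and only
# proposes sequences that fit that theory, so every answer is "yes".
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# A "disconfirming" subject also proposes sequences that would break the
# theory, which is the only way to find out the theory is wrong.
disconfirming_tests = [(2, 4, 6), (1, 2, 3), (3, 2, 1), (5, 10, 100)]

for name, tests in [("confirming subject", confirming_tests),
                    ("disconfirming subject", disconfirming_tests)]:
    print(name)
    for seq in tests:
        print(f"  {seq} follows the rule: {hidden_rule(seq)}")
```

The confirming subject only ever hears “yes”, so nothing forces them to drop the “increasing by 2” theory; the disconfirming subject learns that (1, 2, 3) and (5, 10, 100) also pass while (3, 2, 1) fails, which points straight at “ascending numbers”.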

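Second, the 85-90% / 10-15% advice from page 205. Again a back-of-the-envelope sketch where all the numbers (allocation, returns, payoffs) are made up by me, just to show why the downside is bounded while the upside is open:

```python
# Back-of-the-envelope illustration of the page 205 advice.
# All numbers (allocation, returns, payoffs) are made up for the example.

capital = 100_000
safe_fraction, risky_fraction = 0.90, 0.10  # 90% very low risk, 10% very high risk

safe_return = 0.02  # assumed treasury-like yield on the safe part
risky_outcomes = {
    "speculative bets go to zero": -1.00,
    "a Black Swan pays off": 10.00,
}

for label, risky_return in risky_outcomes.items():
    total = (capital * safe_fraction * (1 + safe_return) +
             capital * risky_fraction * (1 + risky_return))
    print(f"{label}: {total:,.0f}")
```

In the worst case you still keep about 92% of your capital; in the best case the high-risk part multiplies it. A portfolio of “medium risks” has no such floor, which seems to be the point of avoiding them.
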
In summary, I liked the ideas in the book, even if sometimes I wasn’t very convinced by the arguments or the evidence provided… and it was sort of boring to read at times.