
Black swans are a sucker’s problem


MUMBAI: “Before the discovery of Australia, people in the Old World were convinced that all swans were white, an unassailable belief as it seemed completely confirmed by empirical evidence. The sighting of the first black swan might have been an interesting surprise for a few ornithologists (and others extremely concerned with the colouring of birds), but that is not where the significance of the story lies. It illustrates a severe limitation to our learning from observations or experience and the fragility of our knowledge. Our single observation can invalidate a general statement derived from millennia of confirmatory sightings of millions of white swans. All you need is one (and I am told, quite ugly) black bird.”

That’s the opening paragraph from Nassim Nicholas Taleb’s book The Black Swan: The Impact of the Highly Improbable. “The central idea of this book concerns our blindness with respect to randomness, particularly the large deviations,” the author writes.

It is these large deviations from the normal that Taleb calls the “black swans.” Take the attack on the twin towers of the World Trade Center in New York on September 11, 2001. Or the war in the author’s native country, Lebanon, which people felt would end in a matter of days, but which went on for seventeen years.

And what makes these black swans particularly dangerous is that most of the times, they are unexpected. “Consider the turkey that is fed every day. Every single feeding will firm up the bird’s belief that it is the general rule of life to be fed every day by friendly members of the human race “looking out for its best interests,” as a politician would say. On the afternoon of the Wednesday before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief.”
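Taleb’s turkey is, at its core, a warning about naive induction: every uneventful day makes the bird more confident that tomorrow will resemble yesterday, right up to the day it does not. A minimal illustrative sketch in Python of that rising confidence (the day counts and the rule-of-succession estimate are assumptions chosen for illustration, not Taleb’s own numbers):

```python
# Illustrative sketch of the turkey's naive induction (assumed numbers).
# After n consecutive feedings, Laplace's rule of succession estimates the
# probability of being fed tomorrow as (n + 1) / (n + 2).

def confidence_after(n_fed_days: int) -> float:
    """Estimated probability of being fed tomorrow, given n uneventful days."""
    return (n_fed_days + 1) / (n_fed_days + 2)

for day in (1, 10, 100, 1000):
    print(f"after {day:>4} feedings: {confidence_after(day):.4f}")

# The estimate climbs toward certainty (0.6667, 0.9167, 0.9902, 0.9990) --
# and says nothing at all about the Wednesday before Thanksgiving.
```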

Given this, “what we don’t know” becomes more important than “what we know.” That, however, does not stop individuals from coming up with explanations for everything, even when the events being explained are, more often than not, unexplainable. As Taleb writes, “as I formulated my ideas on the perception of random events, I developed the governing impression that our minds are wonderful explanation machines, capable of making sense out of almost anything, capable of mounting explanations for all manner of phenomena, and generally incapable of accepting the idea of unpredictability. These events were unexplainable, but intelligent people thought they were capable of providing convincing explanations for them - after the fact. Furthermore, the more intelligent the person, the better sounding the explanation.”

Taleb offers an interesting example of such after-the-fact explanations, using two Bloomberg News headlines that appeared within half an hour of each other in December 2003, on the day Saddam Hussein was captured.

“Bloomberg News flashed the following headline at 13:01: US Treasuries Rise; Hussein Capture May Not Curb Terrorism.”

“At 13:31 they issued the next bulletin: US Treasuries Fall; Hussein Capture Boosts Allure of Risky Assets.”

As Taleb writes, “It happens all the time: a cause is proposed to make you swallow the news and make news more concrete.”

Now, that doesn’t mean that things happened because of the reasons being offered. “The problem of overcausation does not lie with the journalist, but with the public. Nobody would pay one dollar to buy a series of abstract statistics reminiscent of a boring college lecture. We want to be told stories, and there is nothing wrong with that - except that we should check more thoroughly whether the story provides consequential distortions of reality… Just consider that newspapers try to get impeccable facts, but weave them into a narrative in such a way as to convey the impression of causality (and knowledge). There are fact checkers, not intellect-checkers. Alas.”

As far as explanations are concerned, the jury is still out on why the Dow Jones Industrial Average crashed on October 19, 1987. And once it had crashed, traders expected the markets to crash again every October. “After the stock market crash of 1987, half of America’s traders braced for another one every October - not taking into account that there was no antecedent to the first one. We worry too late - ex post. Mistaking a naïve observation of the past as something definitive or representative is the one and only cause of our inability to understand the Black Swan,” writes Taleb.

This ex post reasoning takes a particular toll on those who work in professions steeped in randomness. “People in professions of high randomness (such as in the markets) can suffer more than their share of the toxic effect of look-back stings: I should have sold my portfolio at the top; I could have bought that stock years ago for pennies and I would now be driving a pink convertible; etcetera.”

The way out of this constant worry is to keep a daily diary. “If you work in a randomness-laden profession, as we see, you are likely to suffer burnout effects from that constant second-guessing of your past actions in terms of what played out subsequently. Keeping a diary is the least you can do in these circumstances.”

Because we worry about “black swans” only after they have occurred, we are rarely prepared for them before they strike. Bankers are a case in point. As Taleb writes, “In the summer of 1982, large American banks lost close to all their past earnings (cumulatively), about everything that they ever made in the history of American banking - everything. They had been lending to South and Central American countries that all defaulted at the same time - “an event of an exceptional nature.” So it took just one summer to figure out that this was a sucker’s business and that all their earnings came from a very risky game. All the while the bankers led everyone, especially themselves, into believing that they were “conservative.” ...They are not conservative; just phenomenally skilled at self-deception by burying the possibility of a large, devastating loss under the rug.”

Taleb makes the same point with the turkey: “From the standpoint of the turkey, the non-feeding of the one thousand and first day is a Black Swan. For the butcher, it is not, since its occurrence is not unexpected. So you can see here that the Black Swan is a sucker’s problem.”

Is there a way out of these ‘black swan’ events? As Taleb writes, “The probabilities of rare events are not computable; the effect of an event on us is considerably easier to ascertain (the rarer the event, the fuzzier the odds). We can have a clear idea of the consequences of an event even if we do not know how likely it is to occur. I don’t know the odds of an earthquake, but I can imagine how San Francisco might be affected by one. This idea that in order to make a decision you need to focus on the consequences (which you can know) rather than the probability (which you can’t know) is the central idea of uncertainty… All you have to do is mitigate the consequences. As I said, if my portfolio is exposed to the market crash, the odds of which I cannot compute, all I have to do is buy insurance, or get out and invest the amounts I am not willing to ever lose in the less risky securities.”
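A rough way to make that last idea concrete: since the odds of a crash cannot be computed, the only number the investor actually controls is how much of the portfolio is exposed to it. Below is a minimal Python sketch of that consequence-first logic; the figures, the 60% crash drawdown and the function name are all assumptions for illustration, not anything prescribed in the book.

```python
# Minimal sketch of consequence-first thinking (all figures are assumed):
# ignore the uncomputable odds of a crash and bound the damage instead.

def max_risky_allocation(portfolio: float,
                         acceptable_loss: float,
                         crash_drawdown: float = 0.60) -> float:
    """Largest sum that can sit in risky assets if a crash of the assumed
    severity must not cost more than acceptable_loss."""
    return min(portfolio, acceptable_loss / crash_drawdown)

portfolio = 1_000_000        # total savings (assumed)
acceptable_loss = 150_000    # the most the investor is willing to ever lose (assumed)

risky = max_risky_allocation(portfolio, acceptable_loss)
safe = portfolio - risky

print(f"risky assets: {risky:,.0f}")   # 250,000
print(f"safer assets: {safe:,.0f}")    # 750,000
# Even if the risky bucket falls 60%, the loss is capped at 150,000: a
# consequence chosen in advance, with no estimate of the crash's probability.
```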
