When the Deepwater Horizon rig sank into the Gulf of Mexico on April 22, something remarkable happened: Millions of Americans became retrospective experts on deepwater oil exploration. Of course, an accident like this was just around the corner. Of course it couldn't be fixed with a containment dome (those darn frozen hydrocarbons). Of course a top kill and a junk shot were doomed to fail! Golf balls? Golf balls!?
And when the spill is eventually sealed off with an underground nuclear explosion, and when that (inevitably) wakes Godzilla, who proceeds to demolish what's left of New Orleans before heading up the Mississippi River to spawn, no doubt there will be one man, one working-class hero, standing atop a stack of nuclear-lizard eggs, turning to the newly minted radiological expert next to him, holding forth on how he just knew something like this was bound to happen.
We don't like to be surprised. Our lives are built around the idea that the world is a fairly orderly place and that we have a pretty good idea of what's coming next. So, when something goes wrong, we tell ourselves a convenient lie: We saw it coming.
The phenomenon is called hindsight bias, and since psychologists started studying it in the mid-1970s, it's been found to be pervasive in our minds and in our lives.
One of the first studies on hindsight bias was conducted just as President Nixon was about to leave for his historic trip to China. Social-science researcher Baruch Fischhoff asked people to rate the probabilities of various outcomes of Nixon's trip (e.g., "President Nixon will meet Chairman Mao"; "President Nixon will announce that the trip was a success").
Fischhoff and his team found that when people were asked about their predictions after the president's trip had concluded, the subjects remembered assigning higher probabilities than they actually had to events that they thought had occurred, and they remembered assigning lower probabilities to events that they thought hadn't occurred. (I say "thought had occurred" and "thought hadn't occurred" because people's knowledge of what actually happened was a bit sketchy, as the public's knowledge of current events often is.)
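The logic of that design can be illustrated with a toy computation. Everything below is invented for illustration, not data from the study: the idea is simply that each subject states a probability beforehand, later recalls it, and the recalled numbers drift toward the outcome the subject believes happened.

```python
# Toy sketch of Fischhoff's before/after design (all numbers invented).
# Each subject assigns a probability to an outcome before the event, then
# recalls that probability afterward, knowing whether it "occurred".

def hindsight_shift(original, recalled, occurred):
    """Average recalled-minus-original shift, split by believed outcome."""
    hits = [r - o for o, r, h in zip(original, recalled, occurred) if h]
    misses = [r - o for o, r, h in zip(original, recalled, occurred) if not h]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(hits), avg(misses)

original = [0.6, 0.3, 0.8, 0.4]          # probabilities assigned beforehand
recalled = [0.8, 0.2, 0.9, 0.3]          # probabilities remembered afterward
occurred = [True, False, True, False]    # did the subject think it happened?

up, down = hindsight_shift(original, recalled, occurred)
print(up, down)  # positive shift for "occurred" events, negative otherwise
```

With these made-up numbers, recalled probabilities rise by 0.15 on average for events believed to have occurred and fall by 0.10 for events believed not to have occurred, which is the signature pattern Fischhoff reported.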
Results like this have been replicated over and over. How likely did you think it was that Barack Obama would be elected president of the United States in 2008? It seems today like it was certain all along, doesn't it?
And the effect seems to get stronger with time. A study of students' predictions about the O.J. Simpson verdict in 1995 found that their estimates of how likely they had thought an acquittal was (asked two hours before the verdict was announced and again at various points afterward) rose after he was, in fact, acquitted, but the effect did not take hold significantly until a week after the verdict.
The question, of course, is how dangerous this effect is. One problem with seeing what happened as what was inevitably always going to happen comes when it's time to assign blame for an accident. In a case like BP's, the public is left baying for blood: BP knew it was going to cause an environmental catastrophe, or at least knew one was extremely likely, and thus its top executives are wanton criminals. Of course, the executives, subject to their own cognitive biases, probably thought an accident (certainly one of this magnitude) was highly unlikely.
Likewise, in liability cases such as car accidents or medical malpractice, juries can only look at the evidence in front of them through the filter of hindsight bias. How likely was a person's driving to cause a terrible accident? Well, it did cause the accident, so it must have been rather likely, and the driver must have known this. How likely was the risky surgery to cause an infection and eventually death? Again, since that's what ultimately happened, the doctor must have known that he was taking an unacceptable risk.
And, of course, hindsight bias presents major problems for investors. When you misremember how well you evaluated risk in the past, you make bigger mistakes evaluating risk going forward.
For instance, a study published in 2009 in Management Science found that the greater a person's hindsight bias, the more he or she underestimates volatility in the stock market. Looking at real-world performance, furthermore, the researchers found that the earnings of bankers in Frankfurt and London varied with the extent to which they exhibited hindsight bias: those with more of the bias performed worse, and those with less of the bias performed better.
Of course, that s the result the researchers expected all along.
Ryan Sager writes the blog Neuroworld at TrueSlant.com.