heuristics that help us deal with day-to-day problems can also frequently lead us astray. I had long understood that people often deceive themselves, but even I was surprised by the depth and breadth of the ways we misperceive the world (myself included). I learned the lessons of decision-making and rationality and tried to become more rational myself. I have since noticed that many of my friends did fine understanding the concepts but lacked the humility or insight to see an application in their own lives.
My friend I mentioned earlier is a good example. He is so afraid of making a bad decision that he avoids making them until they are made for him. Either that, or he waits until fear and panic over the decision cause him to take action, any action at all, but all in a fog of willful ignorance -- pretending that certain facts don't exist or (intentionally?) misrepresenting probabilistic outcomes in his mind. All of this is done in an attempt to shield himself from self-hatred or from acknowledging certain basic truths about the world that he would rather ignore. I see this sort of ex post self-justification happen in the comments section of this blog, from people who are doomed to repeat past mistakes because they refuse to take responsibility for their own destiny.
It's a good example of seeing what you want to see and the harm that can come from it. I actually advise people not to form a belief at all if they can avoid it -- the temptation to anchor their future assessments on it, or to see all new information through the distorted lens of whether or not it confirms that belief, is simply too high.
Even if people get the probability right, they often don't understand what it means. It's one thing to say there's a 1% chance of getting caught stealing a mobile phone, but many people have trouble understanding that this means if you steal 100 phones you will, statistically, get caught once. Instead they act as if anything less than 5% means it will never happen, no matter how often they repeat the behavior. Which is why I liked this recent article in the NY Times about understanding low-probability risks. It's worth reading in its entirety; here's a teaser story:
I first became aware of the New Guineans’ attitude toward risk on a trip into a forest when I proposed pitching our tents under a tall and beautiful tree. To my surprise, my New Guinea friends absolutely refused. They explained that the tree was dead and might fall on us.
Yes, I had to agree, it was indeed dead. But I objected that it was so solid that it would be standing for many years. The New Guineans were unswayed, opting instead to sleep in the open without a tent.
I thought that their fears were greatly exaggerated, verging on paranoia. In the following years, though, I came to realize that every night that I camped in a New Guinea forest, I heard a tree falling. And when I did a frequency/risk calculation, I understood their point of view.
Consider: If you’re a New Guinean living in the forest, and if you adopt the bad habit of sleeping under dead trees whose odds of falling on you that particular night are only 1 in 1,000, you’ll be dead within a few years. In fact, my wife was nearly killed by a falling tree last year, and I’ve survived numerous nearly fatal situations in New Guinea.
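The same arithmetic underlies both the phone example above and Diamond's frequency/risk calculation: a small per-trial risk, repeated many times, compounds toward certainty. Here's a quick sketch (the 1% and 1-in-1,000 figures are from the text; the helper name and the 365-nights-per-year assumption are mine):

```python
def risk_at_least_once(p, n):
    """Cumulative probability of at least one occurrence of a
    per-trial risk p across n independent trials: 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

# 1% chance of getting caught per stolen phone, repeated 100 times:
print(f"100 phones: {risk_at_least_once(0.01, 100):.1%} chance of being caught")

# 1-in-1,000 nightly risk of a dead tree falling on you, sleeping
# under one every night (assuming 365 nights per year):
for years in (1, 3, 5, 10):
    p = risk_at_least_once(1 / 1000, 365 * years)
    print(f"{years:2d} years of nights: {p:.1%} cumulative risk")
```

For small p, 1 - (1 - p)^n is roughly 1 - e^(-pn), so the cumulative risk climbs steeply once the number of trials approaches 1/p -- which is exactly why "only 1 in 1,000 per night" adds up to "dead within a few years."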