By far one of the greatest books I’ve had the pleasure of reading. Daniel Kahneman goes into great detail on the availability bias, on how we substitute hard questions with how we feel about them, and on how we try to find causes where there are none.
There is a plethora of riveting, academic, and simply phenomenal information to gain from this reading. Not only might a reader worry less about abnormal events such as terrorism, but they may even gain a better understanding of probability and statistics.
Kahneman’s book is divided into five parts, ranging from cognitive biases such as heuristics, to the biases of the stories we tell ourselves, to our narrow focus on and overestimation of abnormal events without a proper understanding of probability, to the striking difference between our actual experiences and how we remember them.
Examples include the overestimation of terrorist attacks because they are far outside the norm compared to statistically likelier dangers such as car crashes; how we choose activities based on how pleasurably or painfully we remember them ending rather than on how long they lasted; and how we make financial decisions based on reference points from our previous socioeconomic status.
I probably sound like a fanboy, but I think it might be accurate to say that I am a fanboy of this book. I highly recommend it to anyone. It is lengthy, but the information is worthwhile; I cannot truly do justice to the depth of what this book covers. Try a free sample if you have a Kindle.
Listed below is my understanding of certain aspects of cognitive biases that the book covers; understanding the issue is the first step to preparing defenses against it:
System 1: fast, automatic, and intuitive.
System 2: slow, effortful, and contemplative, but also lazy.
Base-rate fallacy: ignoring the underlying statistics (the base rate) because a story sounds consistent with our automatic biases.
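The base-rate fallacy can be made concrete with Bayes’ rule. A version of the classic cab problem appears in the book: 85% of a city’s cabs are Green and 15% are Blue, and a witness who is right 80% of the time says a cab involved in an accident was Blue. A quick sketch of the arithmetic (my own illustration, not code from the book):

```python
# Cab problem: how likely is the cab actually Blue, given the testimony?
p_blue = 0.15               # base rate of Blue cabs
p_green = 0.85              # base rate of Green cabs
p_say_blue_if_blue = 0.80   # witness accuracy
p_say_blue_if_green = 0.20  # witness error rate

# Bayes' rule: P(Blue | says Blue)
numerator = p_blue * p_say_blue_if_blue
posterior = numerator / (numerator + p_green * p_say_blue_if_green)
print(round(posterior, 2))  # 0.41
```

Despite the witness being 80% reliable, the cab is more likely Green than Blue, because the low base rate of Blue cabs drags the answer down; ignoring that base rate is exactly the fallacy.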
Anchoring: adjusting only slightly above or below an initial reference number. For example, realtors given an asking price of $70,000 for a $90,000 house tend to estimate slightly above or slightly below that starting price, even though it is far below the house’s actual value. Anchoring fails when the anchor is immediately recognized as unreasonable, such as asking $10,000 for a $90,000 house.
Framing: how we set up questions to get the answers we want. For example, food described as 99% fat-free versus 1% fat.
Understanding talent: talent comes from consistently performing an action, receiving constant feedback on it, and learning mini-skills that eventually raise one’s ability to a higher level. A “talent” is a set of several learned mini-skills.
Cognitive ease: filling in the gaps of what we don’t know with what we expect to be reasonable. For example, believing that police officers are legally bound to protect the public, despite this lacking a legal basis. (More on that here: http://www.nytimes.com/2005/06/28/politics/justices-rule-police-do-not-have-a-constitutional-duty-to-protect-someone.html)
Cognitive ease usually consists of the following:
People are unable to reconstruct how they defended an argument before they changed their minds.
Theory-induced blindness: Being unable to see the flaws of a theory until after one’s mind has been changed.
We try to find causes where there are none; for example, sparsely populated rural counties show both extremely low and extremely high cancer rates, not because of any underlying cause but because small samples produce extreme results. Outcomes fluctuate, and regression to the mean shows up in just about every circumstance in life.
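The small-county effect is easy to see in a quick simulation (my own sketch, not from the book): give every county the same true cancer rate, and the small counties still land at the top and bottom of the rankings purely by chance.

```python
import random

random.seed(0)
TRUE_RATE = 0.01  # identical underlying cancer rate everywhere

def observed_rate(population):
    # Count simulated cases and return the observed rate for one county.
    cases = sum(random.random() < TRUE_RATE for _ in range(population))
    return cases / population

small = [observed_rate(100) for _ in range(1000)]     # small counties
large = [observed_rate(10_000) for _ in range(1000)]  # large counties

# Small counties swing to extremes; large counties cluster near 1%.
print("small county range:", min(small), "to", max(small))
print("large county range:", min(large), "to", max(large))
```

No causal story is needed to explain why the extremes are all small counties; the spread of observed rates shrinks as the sample grows.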
We have trouble distinguishing rumors from factual evidence because System 1 absorbs both as if they were equal.
We fall prey to the planning fallacy: we ignore how many others have attempted the same thing we are attempting, and we ignore how base-rate statistics apply to our own behavior and the behavior of people we know.
Overall, 5/5 stars. It has been a great pleasure to read. I cannot recommend it enough!