One thing that has always fascinated me is the way people think. What drives people to decide A instead of B? Why the hell do they make bad decisions even when there’s a rational, obvious answer to the problem?
Growing up in a family that loves to read, I often had books recommended to me. The one that caught my attention the most was Thinking, Fast and Slow by Daniel Kahneman – who won the Nobel Memorial Prize in Economic Sciences in 2002.
The two systems
The main idea of this book and the research behind it concerns the two different ways human beings think and make decisions:
“System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control”;
“System 2 allocates attention to the effortful mental activities that demand it, including complex computations”.
Despite being essential in some situations, such as survival, System 1 often relies on heuristics that can work against us. One example is the framing effect: we interpret the same information differently depending on how the situation is presented.
As Kahneman describes in his book, if your doctor tells you that the operation you’re about to have gives you a 90% chance of survival, you’ll think that’s good, right? But if he tells you that you have a 10% chance of dying, maybe you’ll think twice. The situation is the same; it all depends on how it’s framed.
There are many other examples like these that show how people are often betrayed by context, by judgments and by their own reasoning.
Sunk costs are an interesting case too. People tend to keep investing in projects with bad prospects simply because they have already put money into them – even knowing that the money already spent is gone and will never be recovered.
All this research shows that humans don’t act rationally all the time. Emotions also play a very important role – which is not necessarily bad. Intuition can save our lives, but it can lead us to foolish decisions as well.
I was amazed that all of this has been studied and scientifically demonstrated, which should make us more aware – yet, at the same time, struck by how few resources we have to fight these reasoning biases.