Sunday, 28 July 2013

Book review: Thinking, Fast and Slow

Title: Thinking, Fast and Slow (2011)
Author: Daniel Kahneman
Publisher: Penguin Books

This book gives us insight into how our brains work: how they are hard-wired for shortcuts that require near-instantaneous processing of large amounts of information. Often these ‘decisions’ (more like reactions) are right, but they can also be wrong, and the biases inherent in our decision making and our memories often give us a perception that does not match reality.

Our initial reactions may be wrong, but they didn’t evolve to be scientifically sound; they evolved to be life-saving, and therefore to play it safe. For example, when confronted with a strange animal or food, it is in our interests to avoid it. We could sit around evaluating the available evidence, taking measurements, asking around, and perhaps even changing our minds, but often we won’t bother, and will just go with our first impression.

Two systems
The author describes our thinking as having two systems. The fast System 1 is automatic, efficient, uses shortcuts and is gullible and biased to believe. System 2 is the doubting and fact-checking type of thinking that is not automatic: it requires effort and it is naturally lazy. System 2, however, often rationalises or simply endorses the impression gained by System 1.

Two selves
The author describes two selves: the remembering self and the experiencing self. Biases exist such that we often remember things differently from how we experienced them. A good example is provided about how patients remember painful experiences, and the discrepancies between the memory and exactly how much pain they were in at the time (the experience).

Two people
Relating to economics, much economic theory is based on the rational person (an “Econ”). The reality is, however, that the assumptions of economic theories relating to human behaviour are often wrong, as “Humans” don’t always behave rationally and they don’t always act purely out of self-interest.

The book is well written and filled with examples from the author’s career as a researcher in psychology that are both entertaining and relevant. The biases that lead to our errors in judgement and decision making are discussed in detail. The author makes it clear that the two ‘systems’ that he describes do not exist – they are just a tool to describe what happens.

How does this relate to medicine?
The book is about how we (doctors and patients) make decisions and how we remember things. The ease with which we jump to conclusions, particularly about cause and effect, is a recurring theme of this blog, as I see this and the many biases that affect our decisions and our memories as a major cause for the overestimation of benefit and underestimation of harm in medicine.

Though many of his explanations relate to economic theories, he often refers to medical topics, such as pain perception and patient-rated outcomes and the biases that plague these areas. In particular, the author acknowledges the problems with basing policy on (inherently biased) patient perception, but also acknowledges problems with ignoring patient perception. He states:

“A theory of well being that ignores what people want cannot be sustained. On the other hand, a theory that ignores what actually happens in people’s lives and focuses exclusively on what they think about their life is not tenable either. The remembering self and the experiencing self must both be considered, because their interests do not always coincide.”

The following two quotes are applicable to the message of this blog:

“[There is an argument] that we see causality, just as directly as we see color.”

“… we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify.”

The bottom line
For anyone interested in how we think and behave, this book is recommended. It is relevant to this blog because it explains many of the biases that lead us to believe that medicine is more effective and less harmful than it really is. 
