An individual has been described by a neighbor as follows: “Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality.  A meek and tidy soul, he has a need for order and structure.”
Is Steve more likely to be a librarian or a farmer?

‘System 1’ in our minds, seeking patterns and coherence, leads us intuitively to match Steve to the stereotype of a librarian.  ‘System 2’ (slower, more analytical, harder work) may invite us to consider that there are twenty times as many men farming as working in libraries in America.  Our comfortable reliance on System 1 was the first insight I gained from Daniel Kahneman’s Thinking, Fast and Slow.  Reading the book, I noticed its lessons arising everywhere: in school, in supermarkets, in what I read.  Not having seen the book reviewed online, however, I thought it might be useful to do so here, although I can do little more than cherry-pick some of its most memorable features and implications.
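
To see why the base rate should win out, a rough calculation helps (the figures are my own, chosen for easy arithmetic; Kahneman supplies only the ratio): suppose the description fits one librarian in two, but only one farmer in twenty.  With twenty farmers for every librarian, two hundred farmers yield ten matches, while the corresponding ten librarians yield only five.  However much better the description suits librarians, a man fitting it is still twice as likely to be a farmer.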

Kahneman seeks to show that the equipment on which we rely as thinkers is unreliable.

  • Most of the time, we rely on the coherent picture of the world offered by System 1.  Its automatic, involuntary reasoning guides us: we judge facial expressions and tone of voice, and instinctively build the narratives on which we base our actions.
  • System 2 can help us deal with more challenging questions.  When asked to multiply 17 by 24, or to count the commas on a page, System 2 is set to work.  It is lazy, however: difficult to engage, and prone simply to validate the conclusions System 1 has already reached (it may seek out data to justify our first impressions, for example).  Nor could we possibly engage System 2 permanently (imagine deliberating over every step on the way to work).
  • We judge situations poorly – but are confident in our judgements.  A few high-profile, memorable instances of an event (a crime, for example) outweigh any amount of statistical reasoning in our minds.  Our responses may also be anchored: being asked whether Gandhi was 144 when he died, then asked his age at death, will increase our estimate in answer to the second question, despite the obvious implausibility of the first.  And we overlook the weaknesses of our own reasoning and of the data available to us, most arrestingly through ‘hindsight bias’: rewriting our own beliefs, such that we sincerely believe we ‘knew’ the financial crisis of 2008 would occur, for example.
  • We are deceived in our choices; we are deceived by our memories.  The idealised models according to which we supposedly choose ‘rationally’ are nonsense: we treat the loss of a pound with far more concern than the chance to gain one, for example (a rough illustration follows this list).  Moreover, we express different preferences during an event and after it: people prefer ninety seconds with a hand in painfully cold water, the final thirty at a very slightly higher temperature, to sixty seconds in the same cold water with no rise in temperature.
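
To give a sense of the scale of that asymmetry (the figures are rough estimates Kahneman reports, not exact laws): offered a coin toss on which they stand to lose £100, most people refuse unless they stand to win around £200; experimental estimates of the ratio by which losses outweigh gains typically fall between 1.5 and 2.5.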

I’ve isolated one striking example to illustrate each contention; Kahneman draws on a lifetime’s work to offer surprising and convincing psychological experiments as evidence.

Daniel Kahneman

So much for Kahneman; what about us?

How could a teacher use these insights?

  • Peak-end rule: our memory of an event combines the feeling at its most intense moment and at its end.  When a lesson has not gone to plan, finishing with the promise of a solution, or by highlighting a bright spot (no matter how small), will leave students with a more positive feeling than the lesson as a whole may deserve, and will lay the foundations for greater success next time.
  • Personalising statistics: ‘four in ten’ has far greater impact than ‘40%’.  In describing the effect of the Black Death on Britain’s population, I now make sure I employ the former.
  • Switching on System 2: Visiting Keith Bedson, a great history teacher in Withernsea, I heard him ask students to identify three areas of disagreement between Stalin and the Allies in a text.  There were only two, but this switched on System 2, leading me (and presumably them) to reread the whole page thoroughly looking for what I’d missed.

This is just the beginning: I’m sure Kahneman’s work can be put to far greater use – I’m just not yet sure how.  To discover how, I suspect we will have to reconcile our desire to get students thinking hard using System 2 with the recognition that this is exhausting, and that we also want them to attain a degree of fluency and intuition (in essay structuring, for example) redolent of System 1.  This also takes us into ethically ambiguous territory: an easy way to rig any satisfaction survey occurred to me midway through the book, for example.  I’ve not used it, but teachers will have to choose what they take from this work carefully and critically.

Can we catch ourselves thinking poorly?

Kahneman argues that it is hard for us to modify our thinking; even his own has proved resistant, despite a lifetime’s work (we remain over-confident despite our awareness of this flaw).  “As I know from experience, System 1 is not readily educable…  my intuitive thinking is just as prone to overconfidence, extreme predictions and the planning fallacy as it was before I made a study of these issues.”

Reflecting on this principle, I was able to identify times I’d slipped up.  Reading Laura McInerney’s work on the ‘predictable failings of free schools’, for example, I had fallen straight into the belief that ‘base rates’ (the average success of a venture) do not apply to oneself, concluding with insufficient reflection that none of those failings applied to us.

Conversely, however, I scored a couple of minor (very minor) successes while reading the book.  Standing exhausted in a supermarket, I caught System 1’s attraction to an easy answer (a bright yellow price label offering two toothbrushes for £3) and noticed that, individually, they cost £1.50 each: no saving at all (I concluded I was too tired to decide on dental products that evening).  More potently, I mentioned to a colleague before a round of job interviews that “We’re likely to think Candidate B is better than Candidate A, because he is teaching Class Y.”  I wanted us to avoid succumbing to the ‘halo effect’, which would make Candidate B appear the better teacher because of that class’s consistently greater effort and interest in the subject (we hired Candidate B anyway; rightly so, I think).

In short, it’s not impossible to catch ourselves reasoning poorly.  This said, in noting these successes, I’m probably reinforcing false confidence in my judgement once again.

Can we catch others thinking poorly?

Having noted how hard it is to catch ourselves thinking poorly, especially under pressure, Kahneman draws an interesting corollary: “The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so.”

My immediate thought was that this seems perfectly reasonable, but that I shouldn’t expect to be thanked for my pains in pointing out the flaws in another’s reasoning.  To ease us in, Kahneman offers brief quotations at the end of each chapter (‘Speaking of…’), in which he voices how an individual might raise its findings in conversation.  “Speaking of judges vs. formulas”, for example, one might say: “Let’s decide in advance what weight to give to the data we have on candidates’ past performance.  Otherwise we will give too much weight to our impression from the interviews.”

In the midst of ‘improving’ as a school (our Ofsted monitoring visit is this Monday), knowledge of the halo effect, awareness that ‘What You See Is All There Is’, or that the first thing we notice influences us most powerfully, are apt to be brushed aside.

On the other hand, a combination of Kahneman’s work and some words of Dylan Wiliam’s offers a liberating conclusion.  Hindsight bias convinces highly successful individuals and organisations that they are the authors of their own success – making bold and infallible decisions and providing compelling narratives.  Google’s inexorable rise provides one such example; as Kahneman writes:

“Unfortunately, there is good reason to believe that your sense of understanding and learning from the Google story is largely illusory.  The ultimate test of an explanation is whether it would have made the event predictable in advance.  No story of Google’s unlikely success will meet that test, because no story can include the myriad of events that would have caused a different outcome…  The fact that many of the important events that did occur involve choices further tempts you to exaggerate the role of skill and underestimate the part that luck played in the outcome…  The halo effect adds the final touches, lending an aura of invincibility to the heroes of the story.”

In reality, it is our capacity to rewrite our beliefs to fit the results which is at work here.  Those who seek to follow in the footsteps of ‘the great’ through imitation are unlikely to receive similar rewards: even if they identify the ingredients of success correctly, the cards will almost certainly fall differently for them.

This may look depressing, but I believe it’s liberating.  If one copies today’s outstanding winners, there is no guarantee it will do any good at all.  So an individual or a leader may as well try to do the right thing and follow their conscience: it may work (one can rarely predict); even if it doesn’t, one could have chosen little better, and is at least left with a clear conscience.*

Until everyone has read Thinking, Fast and Slow, I can see little of this catching on.  The sooner everyone reads the book, the better.

Further reading
The book itself.
An excellent review from the London Review of Books.

* Many of the insights in this conclusion are Dylan Wiliam’s.
