I’m not sure of much in education. But My Teaching Partner seems to work. It offers teachers coaching through video. It’s been rigorously tested: in primary and secondary schools, in rural and urban areas, with fortnightly and monthly meetings. Every time, students learn more – and it helps teachers make lasting changes (Gregory et al., 2017).*

So where do I sign up? First, make sure you’re American. Be sure to teach either maths or English. Then convince your whole school district to sign up. That done, you can join a research study. You may get a coach – or you may be randomly allocated not to.

I shouldn’t be too flippant: rigorous research is vital. But currently, we can’t access the MTP programme. Can we learn from it instead?

Four colleagues meet. Each has read about MTP. Each thinks it should influence their school’s professional development programme.

  • For Monica, effective teaching is all about relationships. She notes that MTP focuses on improving teacher/student interactions.
  • Josh loves instructional coaching. He notes that MTP offers instructional coaching.
  • Chris has heard that professional development should be sustained, collaborative, practice-based, and should include external expertise. (He hasn’t heard how weak the evidence for these claims is.) He notes that MTP is sustained, collaborative, practice-based, and includes external expertise.
  • Andrea is obsessed with feedback. She notes that MTP gives carefully designed feedback.

When we examine success, the successful thing becomes a magic mirror. We look into it, and all we see are our own practices, priorities and prejudices.

No one disputes that MTP works. But Monica, Josh, Chris and Andrea’s attempts to learn from it – and so design a better professional development programme – are doomed.

How can we do better?

How do things work in Hamburg?

Imagine if we could describe precisely how MTP helped change teaching practice. Instead of looking at success and guessing what caused it, we could pinpoint the mechanisms at work.

Robert Koch did just this. Not for professional development: for something more urgent. Why was cholera spreading in Hamburg and Altona in 1892? Hamburg suffered a big outbreak. Altona (mostly) didn’t. Why?

Problems can be magic mirrors too, revealing our own practices, priorities and prejudices. Max von Pettenkofer (professor of medical chemistry, Munich) insisted cholera was spread by bad air.

But Koch had isolated the cholera bacterium. He knew precisely how the disease spread. That meant he could explain events – and learn from them. Altona filtered its water through sand: Koch could show that this removed cholera bacteria. Altona had a small outbreak: Koch could identify the polluted well causing it.

Pinpointing the mechanism solved the problem. Hamburg encouraged residents to boil their water. Altona’s authorities closed the well. Max von P asked Koch for a sample of cholera, drank it(!), and got a very upset stomach (really; for the full story and its implications for understanding causes, see Clarke et al., 2013).

If we can identify how things work – the mechanisms – we can shatter the magic mirror and take meaningful action. Is this possible for professional development?

What mechanisms might influence teachers?

Let’s go back to the MTP coaching programme. It:

  • Builds teacher knowledge
  • Coaches teachers
  • Creates a supportive environment

These are all great things. But if we ask coaches to ‘build knowledge’ and ‘create a supportive environment,’ we give them goals, not tools.

So, like Koch seeking causes, we need a more precise way to describe how professional development worked. We began with a list of 93 possible mechanisms to change behaviour, developed by health psychologists (Michie et al., 2013). These include:

  • Setting goals
  • Providing evidence from a credible source
  • Rehearsing: practising a new skill before trying to use it in the classroom

Hopefully, the contrast is clear. If we know a coach ‘built knowledge,’ we still have to work out how they did it. If we know they ‘set goals’ and ‘provided evidence from a credible source,’ we’re much clearer on what’s making a difference for teachers. (This still leaves plenty of scope for the coach’s skill: there are ways and ways to set goals.)

How do we know that setting goals and providing evidence are mechanisms? We used tarot cards. Also, for each possible mechanism, we looked for an evidence review showing that it consistently makes a difference to people’s behaviour. For example, a review of over a hundred studies finds that, when people have a goal, they’re more likely to act (Epton et al., 2017).

We whittled the list of 93 possible behaviour change mechanisms down to 14 that were plausible (listed below). That meant we had evidence for them, and we thought they could actually work as part of a professional development programme. For example, we couldn’t find evidence that getting people to consider the gap between their current behaviour and their desired goal makes a big difference. And we couldn’t imagine even the worst professional development providers giving drugs or punishments – or advising teachers to do something undesirable in the hope they get sick of it (if you know different, let me know). With this list in hand, we could see what difference mechanisms made to teachers.

Do these mechanisms influence teachers?

Next, we did three things:

  1. Collect up all the randomised controlled trials of professional development interventions we could find (over 100)
  2. Find out how much they increased student learning (if at all)
  3. Read the description of the intervention to identify the mechanisms they used

(We also did another 100+ pages of work, but I have a deadline.)

The graph below captures how mechanisms relate to impact. Each bubble is a study (bigger studies = bigger bubbles). The higher they are, the more students learned when their teachers participated in the programme. The further right they are, the more mechanisms the programmes used to help teachers improve.

The line of best fit makes the case pretty obvious. With no mechanisms, you expect no impact; with all 14 mechanisms, you expect a big impact (an effect size of 0.17).
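As a toy illustration of that line of best fit: the sketch below fits a weighted line (bigger studies carry more weight, like the bigger bubbles) and reads off the predicted impact at zero and at all 14 mechanisms. The numbers are invented for illustration only, chosen to echo the pattern described; the real review used over 100 trials and fuller meta-analytic methods.

```python
import numpy as np

# Hypothetical study data (NOT from the review): mechanisms used (0-14),
# effect size on student learning, and sample size, used as a weight.
mechanisms = np.array([0, 2, 3, 5, 6, 8, 10, 12, 14])
effect_size = np.array([0.00, 0.02, 0.05, 0.06, 0.09, 0.10, 0.13, 0.15, 0.17])
sample_size = np.array([120, 300, 80, 450, 200, 600, 150, 350, 500])

# Weighted line of best fit: larger studies pull the line towards them,
# a simplified stand-in for a meta-regression.
slope, intercept = np.polyfit(mechanisms, effect_size, 1, w=sample_size)

print(f"Predicted effect with 0 mechanisms:  {intercept:.2f}")
print(f"Predicted effect with 14 mechanisms: {intercept + 14 * slope:.2f}")
```

With data like these, the fitted line starts near zero and rises with each additional mechanism; the point is the shape of the relationship, not the particular numbers.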

But surely content matters? Yes. But – in two out of three cases – we found that mechanisms made a difference to programmes with similar content:

A programme about formative assessment with lots of mechanisms is more likely to have an impact than a programme about formative assessment without. (We only found seven studies of data-driven instruction, which makes firm conclusions harder.)

What does this mean?

Professional development is about helping teachers change. That doesn’t preclude being sustained, attending to context and encouraging collaboration, or even providing a good lunch. But a programme that doesn’t help teachers change cannot improve student learning.

The behaviour change mechanisms we collected have been tested in many contexts: they consistently help people change. Perhaps unsurprisingly therefore, when they are tried in professional development interventions, they consistently help teachers change – and so help students learn more.

So if you want to help teachers change, behaviour change mechanisms are a very good bet.

What does this not mean?

When we examine success, the successful thing becomes a magic mirror. We look into it, and all we see are our own practices, priorities and prejudices.

If your first thought on checking the list of mechanisms is “This is obvious, we’d already worked this out, we already do all those things, we just haven’t called them mechanisms, come back when you’ve got some really interesting findings” I’m not going to argue with you. I am going to wonder why you didn’t publish your thoughts, which would have saved us a year of work. And I am going to wonder whether the magic mirror has got to you.

If your first action is to review your existing programme and call everything you already do a ‘mechanism,’ I’m also not going to argue with you. But I am going to ask you to let me know: it means I haven’t explained our research very well. We didn’t just pick a list of mechanisms at random: we worked from a list that dozens of experts spent years developing. Then we whittled it down by identifying evidence that these mechanisms matter. You may be proud of your ‘evidence-based curriculum.’ Your Fairtrade coffee and biscuits may get rave reviews. But that doesn’t mean they are now mechanisms (again, the full list of extant mechanisms is below).

Conclusion

If we want to learn from effective professional development programmes like MTP, we need to know precisely how they worked.

We can do this by identifying the behaviour change mechanisms they used.

We developed a list of mechanisms (below), found evidence they change behaviour, and showed that, when used in professional development, they lead to better student learning.

So you may be able to improve your professional development by examining the list of mechanisms, and picking one or two to use more.

Alternatively, it will be quicker and easier just to describe everything you already do as a ‘mechanism.’

If you enjoyed this, you may like

In this post, I described three questions we need to answer for professional development to work: what to learn, how to learn it, and how to make it work in reality.

In this post, I described our previous research questioning established wisdom around what works in professional development.

I introduced mechanisms, and the value of specificity, here.

This post describes findings from our systematic review of professional development (with Sam Sims, Alison O’Mara-Eves, Sarah Cottingham and other colleagues). This EEF guidance report is based on it.

* Confusingly, a programme called My Teaching Partner has proved unsuccessful in kindergarten (Whittaker et al., 2020). But it’s a curriculum – there’s no coaching at all. I’m mostly surprised it also got called My Teaching Partner.

Full list of mechanisms

  • Manage cognitive load – don’t overload teachers
  • Revisit prior learning
  • Goal setting
  • Credible source – provide evidence from a convincing person
  • Praise/reinforce success
  • Instruction – tell teachers what needs to be done
  • Practical social support – organise for colleagues to meet and offer practical advice
  • Modelling
  • Feedback
  • Rehearsal – deliberate practice outside the classroom
  • Prompts/cues – setting reminders
  • Action planning – planning action
  • Self-monitoring what you do
  • Context-specific repetition – repeating the desired behaviour

References

Clarke, B., Gillies, D., Illari, P., Russo, F. and Williamson, J. (2013). Mechanisms and the Evidence Hierarchy. Topoi, 33(2), pp.339-360.

Epton, T., Currie, S. and Armitage, C. J. (2017). Unique effects of setting goals on behavior change: Systematic review and meta-analysis. Journal of Consulting and Clinical Psychology, 85(12), p. 1182.

Gregory, A., Ruzek, E., Hafen, C. A., Mikami, A. Y., Allen, J. P. and Pianta, R. C. (2017). My Teaching Partner-Secondary: A video-based coaching model. Theory Into Practice, 56(1), pp. 38-45.

Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., Eccles, M. P., Cane, J. and Wood, C. E. (2013). The Behavior Change Technique Taxonomy (v1) of 93 Hierarchically Clustered Techniques: Building an International Consensus for the Reporting of Behavior Change Interventions. Annals of Behavioral Medicine, 46(1), pp. 81-95.

Whittaker, J.V., Kinzie, M.B., Vitiello, V., DeCoster, J., Mulcahy, C. and Barton, E.A., 2020. Impacts of an early childhood mathematics and science intervention on teaching practices and child outcomes. Journal of Research on Educational Effectiveness, 13(2), pp.177-212.