Say ‘No’ to professional development.  That’s the obvious conclusion to be drawn from a series of recent, rigorous, randomised-controlled studies of professional development programmes.  Unlike so much professional development, these programmes incorporated all the ingredients suggested by reviews of the evidence (such as Desimone, 2009): expert input, leadership support, a subject knowledge focus and a duration of a year or more.  Yet despite thoughtful design, careful implementation and significant investment, they led to no discernible improvements in student learning.

A brief synopsis.  Each study was a randomised-controlled trial, providing a comparison group of similar teachers who did not experience the professional development – allowing clear causal inferences to be made about the impact of the training.  Garet et al. (2016) found that summer training, video coaching and professional learning communities “improved teachers’ knowledge and some aspects of classroom practice but did not improve student achievement”.  Garet et al. (2011) found that summer training, coaching and follow-up seminars over two years changed neither teacher knowledge nor student achievement.  And Jacob et al. (2017) found that three years of problem-solving and examination of student work and misconceptions led teachers to evaluate the programme positively and increased their knowledge slightly, but left their teaching unchanged.  Before we reject professional development entirely, however, I’d like to suggest six ways these studies could help us to refine the way we design professional development.

1) Focus on the right knowledge

Focusing on the subject to be taught is important, but some aspects of this knowledge are more powerful than others.  The programmes tended to focus on common content knowledge – general knowledge about the subject – and specialised content knowledge – such as different ways to solve maths problems (see Ball et al., 2008).  They focused less on the specific content and problems teachers were due to teach: one programme, for example, taught maths useful across elementary school, even though the study participants were fourth-grade teachers (Garet et al., 2016).  Developing teachers’ general subject knowledge is a massive, vague task: such knowledge is valuable, but teachers may well not transfer it to their teaching.  Professional development focusing on teacher knowledge should therefore be designed around the curriculum: it may fill gaps in teachers’ general knowledge, but the real gains will derive from understanding the different ways students approach problems, their misconceptions and useful models (Ball et al. (2008)’s knowledge of content and teaching, and knowledge of content and students).

2) Get inside the black box

Yet teachers’ practice is far more important than their knowledge.  The association between teachers’ knowledge and student learning is so weak that developing it may be a distraction:

“To obtain an impact on [student] achievement of 0.1 standard deviations (the equivalent of moving a student’s test score from the 50th percentile to the 54th percentile), for example, would require an improvement in [teachers’ knowledge and] practice by about 1 standard deviation (the equivalent of moving a teacher’s practice from the 50th percentile to the 84th percentile)” (NCEE, 2016).
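The percentile figures in that quote follow from the standard normal distribution, which is the usual assumption behind converting effect sizes to percentile moves.  A quick sketch of the arithmetic (my own illustration, not from the NCEE brief):

```python
# Converting effect sizes (in standard deviations) into percentile moves,
# assuming scores are normally distributed -- the conventional assumption
# behind the percentile figures quoted above.
from statistics import NormalDist

norm = NormalDist()  # standard normal: mean 0, standard deviation 1

# A student at the 50th percentile who gains 0.1 standard deviations:
student_after = norm.cdf(0.1)   # ~0.54, i.e. the 54th percentile

# A teacher at the 50th percentile whose practice improves by 1 sd:
teacher_after = norm.cdf(1.0)   # ~0.84, i.e. the 84th percentile

print(f"Student moves to roughly the {student_after:.0%} percentile")
print(f"Teacher moves to roughly the {teacher_after:.0%} percentile")
```

The asymmetry is the point: a very large, rare shift in teacher practice buys only a small shift in student outcomes.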

Professional development may influence isolated aspects of teachers’ knowledge and behaviour, but doing so is unlikely to change the existing, complicated patterns of teacher classroom behaviour.  It is possible to influence teacher behaviour: a striking counter to these unsuccessful programmes is Allen et al.’s (2011) study, which achieved a significant impact on student learning through video coaching about classroom climate.

3) All teacher training should be practice-based

In the same vein, talking about practice is no substitute for practice.  Here’s a typical activity from one of the programmes:

“Teachers sitting in small groups were given a container of rice, tape, and three 5- by 8-inch index cards. The teachers were asked to make three shapes with the index cards—a circular cylinder, a triangular prism, and a square prism—by taping the two five-inch ends together with no gap or overlap…”

They were then asked to predict which shape had the largest volume and test their hypotheses by filling the shapes with rice: “Upon doing so, many were surprised that the circular cylinder actually held more than the other two shapes.”  They tried to work out the shapes’ volumes:

“Neither group made progress toward a reason for their outcome. The whole-group conversation that followed focused briefly on the mathematical reason for the cylinder’s larger volume (the height was the same in all three shapes, but the areas of the bases differed) and then covered topics such as when to use the task during the school year, how long it should take to enact, and other desirable features of the task.”

Like so much teacher education, these programmes embraced discussion.  Teachers can discuss student learning until the cows come home, with insight and erudition, but it’s a waste of time unless they practise behaving differently in the classroom: insight without action is indulgence.  (This point was made by just one of these studies: Jacob et al., 2017.)  All teacher training should be practice-based.

4) No matter how strong your external training is, culture is stronger

Each programme included a summer institute of a week or so, followed by four to six days of in-school follow-up during the year.  While external expertise can be a powerful challenge to existing ideas (Teacher Development Trust, 2015), it’s telling that some of these studies randomised at teacher level (within schools) and ruled out spillover: that is, they concluded that the training some teachers received would not affect other teachers teaching the same grades in the same school (Jacob et al., 2017).  This makes the intervention look particularly shallow: it is in the school or the department that sustained and sustainable improvement needs to occur.  Similarly, placing most of the training before the school year is administratively convenient, but limits participants’ ability to learn, apply their learning and then return to learn more.

5) Design your CPD around reality in schools

The value of focusing on the school is reaffirmed by the massive turnover of teachers these programmes suffered.  One reason they struggled to demonstrate an impact was their ‘intent to treat’ approach: if a school or teacher was included in the trial at the start, their results counted at the end – even if a teacher missed every training session.  Turnover was dramatic: only 57 of 105 teachers remained by Year 3 in one study (Jacob et al., 2017); only 23 of 45 maths teachers remained by Year 2 in another (Garet et al., 2011).  Yet such turnover is not unusual, and while we may wish politicians took it more seriously, we need to design professional development to take account of it: probably by focusing on the department and revisiting key ideas each year.  (Jacob et al. (2017) also suggest that planning to sustain leadership support is similarly important.)

6) Evaluate properly

We know very little about what’s working: a recent review found that only 9 of 1,343 studies of professional development used randomised-controlled trials or quasi-experimental designs (Garet et al., 2011).  The startling thing about these studies is that a good deal of plausible professional development isn’t working: we need to investigate this more closely; randomisation should be the rule, not the exception.


Teachers: don’t say ‘No’ to professional development, but do examine what’s being offered very carefully before investing in it.

Teacher educators: incorporate (or improve upon) the points above.


Allen, J., Pianta, R., Gregory, A., Mikami, A., Lun, J. (2011) An Interaction-Based Approach to Enhancing Secondary School Instruction and Student Achievement. Science 333(6045) 1034-1037

Ball, D., Thames, M., Phelps, G. (2008) Content Knowledge for Teaching: What Makes It Special? Journal of Teacher Education 59(5) 389-407

Desimone, L. (2009) Improving Impact Studies of Teachers’ Professional Development: Toward Better Conceptualizations and Measures. Educational Researcher 38(3) 181-199

Garet, M., Wayne, A., Stancavage, F., Taylor, J., Eaton, M., Walters, K., Song, M., Brown, S., Hurlburt, S., Zhu, P., Sepanik, S., Doolittle, F., Warner, E., (2011) Middle School Mathematics Professional Development Impact Study: Findings After the Second Year of Implementation. Institute of Education Sciences.

Garet, M. S., Heppen, J. B., Walters, K., Parkinson, J., Smith, T. M., Song, M., Garrett, R., Yang, R., & Borman, G. D. (2016). Focusing on mathematical knowledge: The impact of content-intensive teacher professional development (NCEE 2016-4010). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Jacob, R., Hill, H., Corey, D. (2017) The Impact of a Professional Development Program on Teachers’ Mathematical Knowledge for Teaching, Instruction, and Student Achievement. Journal of Research on Educational Effectiveness, DOI: 10.1080/19345747.2016.1273411

NCEE (2016) Does content-focused teacher professional development work? Findings from three Institute of Education Sciences Studies. NCEE Evaluation Brief