How can we know whether practice-based teacher education is working? If our training includes practice, teachers do what they are learning during the training session itself, rather than only seeing and discussing what success looks like and then trying it for the first time in front of students. I’ve argued that all teacher training should include practice: it’s crucial if teachers are to act on their good intentions under pressure in the classroom. I’ve shown how I plan practice. But once we’ve started using practice in training, how can we know whether it is helping teachers teach better?
At the TeacherSquared Teacher Educator Institute on practice this summer, Brent Maddin suggested three steps to effective practice-based training:
This model closely reflects my experience introducing practice-based teacher training:
- Practice happens – In 2013, I began including practice in training, described here. With experience, and visits to Teach Like a Champion workshops, I refined my approach, adopting a coherent planning structure which meant practice was aligned to clear goals. Sessions were not running as smoothly as I wanted, however, so I began refining my facilitation.
- Practice shines – Having refined my planning, I tried to create a culture of practice in which teachers felt comfortable and enthusiastic about practising: I tried to make practice irresistible. I also codified the facilitator’s role in planning, delivering and facilitating practice, and attempted to ensure peer feedback supported improvement.
- Practice matters – I could not claim, however, that practice was having the intended impact on teachers’ classrooms. The evidence I saw, and what I heard from trainees’ coaches, forced me to consider why teacher training doesn’t stick. In ensuring practice happens and shines, I had been refining my teaching, not necessarily improving teachers’ learning.
Brent’s framework suggests examining the impact of practice on teaching by seeking evidence during practice sessions and from teachers’ classrooms. This post considers how we might do this. It may helpfully be considered in terms of Guskey’s levels of impact:
We collect teachers’ reactions (Level 1) and sometimes their learning (Level 2). Can we do more to examine how teachers’ practice has changed (Level 4)?
What evidence can we collect during practice?
Previously, I would circulate around the room during practice, observing. My priority has been ensuring practice happens: encouraging teachers to practise and supporting them if they are uncomfortable, confused or have become engrossed in discussion. If everyone is practising, I’ve looked for general points I can offer as feedback: “I notice lots of people jumping in after student responses – try counting to five, to increase your wait time.”
The TeacherSquared Institute convinced me that we can be more focused and intentional in what we look for while teachers practise. This is not unlike the distinction Doug Lemov describes between hunting and fishing: rather than circulating to see what we notice, we can look for specific aspects of successful practice, or likely misconceptions and difficulties. We can collect three kinds of evidence:
Quantitative data: We visit each group and simply record whether teachers’ practice includes the success criteria:
- Does the teacher give a rationale for the activity?
- Are instructions sequential?
- Does the teacher pause between instructions?
Qualitative data: Whether ‘instructions are sequential’ is a simple yes/no question; whether those instructions are sufficiently clear is a qualitative judgement. Whether teachers have done things well is much harder to tick off, but we can usefully note strengths and weaknesses:
- Motivating and inspiring aspects of the rationale for an activity
- Body language which clarifies instructions
- Facial expressions and body language which show that a pause invites questions
Culture of practice: We can also look for evidence that teachers are developing a culture of practice:
- Do teachers focus on practice and stay in role?
- Is practice prioritised during the time available?
- Do teachers take responsibility for managing the practice?
The sheet below shows our first, improvised attempt to gather evidence about how practice was going during the institute. We ticked or crossed the quantitative criteria – recording where we saw teachers include key points – and we added notes about qualitative features and the culture of practice we saw.
Collecting evidence of what teachers are doing during practice means:
- We can give teachers clearer feedback about trends in their practice. This can be quantitative: “All the groups we visited were offering a rationale for the activity, but we didn’t always see models – these can be critical in clarifying what the goal is, so when you review your plan, please ensure you have a model to explain.” Or it can be qualitative: “One strength we saw as you closed activities was the way you linked them to future learning.”
- We have a more objective sense of how effective our training has been: “The modelling section was the weakest: we need to revisit that in our next session.”
We should then be able to tie this to the impact training has had in schools…
What evidence can we collect in classrooms?
Practice should help teachers teach more effectively in the classroom. To test whether this is happening, we need to connect what we see in practice with what we see in the classroom: if we’ve watched our teachers practise using clear, economical language in their explanations, do we see them use clear, economical language in the classroom?
Connecting teachers’ successes in practice with their successes in the classroom would show that practice-based training was working. Identifying gaps – where teachers’ success in practice does not translate to success in the classroom – would help us review our training and support: we might need to practise a technique again; to practise applying a technique in ways which better simulate the classroom; or we might need to examine other barriers preventing teachers from using what they have practised, such as confidence, or judgement about when to use it.
This is relatively easy if you are a head of professional development and you can observe both practice and teaching; it is more complicated if you are trying to coordinate teacher educators inside and outside school. But the evidence exists already: as a tutor for Ark Teacher Training, I could see the action steps coaches gave my trainees to focus on each week; likewise, coaches could see what training trainees had attended. We are not short of evidence on how teachers or trainees are doing in the classroom: we are short of evidence on their practice, and we do not yet do enough to connect what we see in practice with what we see in the classroom. Doing so could tell us whether practice is helping teachers teach better, not just practise better.
We need to assess the impact of practice-based teacher training, no matter how plausible it seems, just as we need to assess what our students have learned, no matter how well we are teaching. Intentionally collecting evidence during practice-based training, and connecting it to what we see in the classroom, could be a powerful way to support teachers better and refine our teacher education. Moreover, since teacher learning is just learning, we could do the same as teachers do: hunting for signs of success, or for specific misconceptions, rather than fishing in the hope that we learn something useful.
What should I read next?
Planning practice-based training – shows the process I go through in planning a practice-based training session.
A framework for designing and facilitating practice-based training – summarises what I try to do to make practice-based training work.
Making practice-based training comfortable and irresistible – some recent thoughts on creating a culture of practice.
I’m grateful to Brent, Kaycee, Liam, Christian, Sona, Daya and Randall at the TeacherSquared Institute, from whom I learned an enormous amount; I’m also grateful to Teach for All, which sponsored my attendance at the institute.