Six months into a project, we have to choose: can our research in South Wales go ahead?  We have just one school visit booked: interesting, but will it allow us to say anything about the area as a whole?  We have no further leads, but we have committed to study five communities and one of them really should be in Wales.  What do we do next?

This dilemma encapsulates two constant concerns throughout a project over the last year: sampling with integrity and convincing schools to welcome us.  Most research discusses access to schools unproblematically; negotiations and problems are almost never mentioned, and we just read that ‘Schools in the district were invited to participate’.  If you’re lucky, the appendices may include a copy of the invitation.  If only it were that simple.  A favourable response is just the beginning: you still need to convince a head that the project is viable and arrange the visit itself.


I’ve wondered what we’re doing wrong many times in the last year, but I’m told that this is an endemic problem.  This post is a personal reflection on how we’ve responded; I hope what we’ve learned may be useful to other researchers, and that it may elicit further advice.  I can’t promise our approach is right: it’s a collection of expedient compromises combining behavioural insights, best guesses, and trial and error.  Underlying these compromises has been the desire to achieve a wide, representative sample and the need to visit schools to a deadline.

Sampling: ideal and reality

Good researchers sample carefully.  Our goal is to create ethnographies which explore young people’s needs: we want to know that any conclusions hold true in primary and secondary, large and small, academy and maintained schools.  This is why randomised controlled trials form the gold standard: schools are randomly selected for interventions.  We didn’t set out to achieve this level of rigour: schools are too busy, heads are asked to do too many things, and participating in research often offers individual schools little in the short term, valuable as it can be in the long term.

Instead, we sought to identify schools which would be interested and have time to welcome us.  We wanted a broadly representative range of schools which are ‘Teach First eligible’ (a measure based on the demographics of the student population).  We began by asking colleagues with local knowledge to suggest schools and individuals who might be willing to speak with us.  Where they couldn’t help, we asked personal contacts and friends of friends.  (Existing contacts and prospects were also a factor in selecting the areas we chose to study from a shortlist.)  From the strictest point of view, therefore, we invalidated our research from the outset.  (Among other things, I suspect this led us towards heads with particularly strong vision and the capacity to look beyond the immediate needs of the school – and so may mean we have under-sampled the most challenging schools.)

Convincing headteachers

Our initial message sought to convey three things:

  1. The value of the study, to the school and more broadly
  2. The limits to our request
  3. The protections we offered (anonymity for the schools, sensitive questioning, DBS-checked researchers)

I could write thousands of words about each aspect, but anything which can’t be said in a few lines to busy heads almost deserves to be ignored, especially in an introductory message.  Here is a late and lengthy iteration:

Teach First are conducting research, hoping to understand the experiences and successes of young people in Hackney.  Specifically, we are keen to understand what leads young people from poorer backgrounds to succeed in education and life, and the barriers that remain to their success.

We hope to answer three questions:

  • What characterises growing up in the area?
  • What barriers do young people face growing up in the area?
  • What advantages and opportunities do young people in the area have?

The aim of our project is to ensure Teach First’s work meets local people’s needs, to share what we learn with policy-makers and schools, and to give local people a greater voice in policy-making in their area.  We will share our findings with organisations that took part in the research, within Teach First and publicly (but all contributors can remain anonymous).

We hope to discuss these questions with students and teachers at your school.  If possible, this would include:

  • Speaking with two focus groups of 5-6 students in Year 7 and Year 10.
  • Speaking with the head, inclusion manager, and if possible a small group of teachers.

Each interview/focus group would last no longer than 30 minutes, and can be fitted around the school day as is convenient for you.

Follow-up conversations are partly a chance to listen: heads’ interests, needs and anxieties vary, and they may begin to tell us about the area itself.  If they’re interested, we can expand on the subjects in the initial email: common queries have concerned the questions we’ll ask and exactly how the research will be used.

I suspect heads have been motivated to welcome us for four reasons.  Many have seen value in research in their area, whether for their own use or to increase awareness of the area’s needs.  A few mentioned the opportunity this provided for students to discuss these issues.  Some heads participated due to their strong existing relationship with Teach First.  Finally, I think some took part due to personal appeals, from us or from other heads.

Confronting failure

At different points, three area studies seemed poised on the brink of failure: the situation in South Wales was merely the most acute.  Sometimes it proved impossible to convert initial interest into successful visits: schools are busy, logistics can be tricky, people stop answering emails.  We faced other challenges too: few contacts in one area, a lack of local support in another, an active decision not to support the project in a third.  Facing this situation, what next?


Looking elsewhere

I just about caught the ‘sunk cost fallacy’: continuing with Plan A simply because we had begun it.  In South Wales, with only one school booked, we returned to our shortlist and tried a different area.  When our work in Hackney faltered, I spotted another opportunity which meant we might be able to compare two areas in London – different, but perhaps more interesting.  In both cases, Plan B strengthened the study.

Snowball sampling

Snowball sampling – asking one interviewee to recommend another – can be a valid research method, but we didn’t set out to use it.  In South Wales, however, at the end of our single school visit, the wonderful head asked if there was anything more she could do to help our work.  I asked if she would be willing to email the secondary heads’ network asking for more participants, and by the end of the day we had two more schools signed up.  More simply, describing our tribulations to colleagues and friends has often elicited further suggestions.

Accepting and pressing ahead

I found it helpful to accept that we weren’t going to achieve a perfect sample anywhere.  This made it easier to press ahead, conducting visits where we could – even if they looked insufficient – and continuing to seek more opportunities.  Ultimately, despite the challenges we’ve faced, we’ve visited over twenty-five schools and youth groups and have talked to young people, professionals and parents.  The final reports, even based on imperfect and incomplete samples, are a lot better than nothing.

The process is part of the story

As my introduction hinted, I find the obscurity in which the messy reality of research is hidden a little frustrating.  Rather than presenting a whitewashed picture and the pretence that we achieved a perfect sample, we see the differences in welcome we experienced as part of the findings.  For example, I might speculate that the warm welcome we received in Blackpool, and the limited interest in Hackney, reflect the differences in attention the two areas have received.

A framework for conducting research in schools

Once we had chosen whom we wished to contact, my colleague and I found the EAST framework used by the Behavioural Insights Team helpful in planning our approach.

Image: Veronica Villa Agudelo

We have sought to make the process:


Easy

  • Brief, clear communications
  • Clear, limited requests (30-minute focus groups fitted around the school day)
  • Final confirmations which merely checked we were expected, rather than reminding schools of everything we’d initially asked for (no visit ever goes quite to plan, on which more in a future post)


Attractive

  • Showing the project is worthwhile
  • Emphasising anonymity for schools and interviewees


Social

  • Reaching one person via another: ‘X passed on your name suggesting…’
  • Asking heads to talk to one another (this has tended to elicit the quickest and most successful results)


Timely

I’ve not tracked our communications closely enough to reach any clear conclusions about when it is best to get in touch – different heads seem to have very different preferred times for responding to emails.  I do think there’s merit in Katharine Burn’s counter-intuitive point, made at ResearchED15, that September is a great time to ask, before routines are fully established.

In a sentence though, start early, and accept that small-scale research among volunteers has natural limits.

The first ethnography – of Blackpool – can be read here.

I’m immensely grateful to all the heads, colleagues and friends who made the project possible.