Over the last two years, we’ve experimented with various models to help teachers evaluate interventions in their schools using micro-randomised controlled trials (micro-RCTs). One recurring piece of feedback stood out: time. Teachers consistently cited the lack of time to set up trials as the biggest barrier.
In response, we reimagined our approach. We developed a platform to streamline the entire research process. This included brief, clear instructional videos explaining the intervention, downloadable research protocols detailing how to conduct the project, and automated analysis and reporting tools to assess the intervention's success. By the time we piloted our third iteration last summer, the results were encouraging. Teachers not only completed the trials successfully, but they also found the process manageable. We had finally made the trials accessible to teachers - an essential step in scaling evidence-based practices.
After the event, I caught up with Jane over coffee. We discussed identifying schools in the region that are ‘bucking the trend’ in literacy and numeracy progress. These schools could help uncover the teaching "micro-practices" contributing to their success. The plan was to turn these practices into trials on the platform, available in a private area where other schools in the county could evaluate them in the context of their own school.

The beauty of micro-RCTs lies in their power to aggregate data. Each teacher-run trial typically lasts six weeks, and the results feed into a cumulative meta-analysis: a method that continuously updates our estimate of an intervention’s effectiveness as new data becomes available, refining our insights as more trials contribute to the dataset. This enables us to track how the impact of an intervention evolves over time and across different school settings, identifying patterns that might not be evident in smaller, isolated studies.
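For readers curious about the mechanics, a cumulative meta-analysis can be sketched in a few lines of code. The sketch below uses standard inverse-variance (fixed-effect) pooling, updated trial by trial; the function name and example figures are illustrative, not taken from our platform.

```python
# Minimal sketch of a cumulative fixed-effect meta-analysis.
# Each trial contributes an effect estimate and its standard error;
# the pooled estimate is re-computed as each new trial arrives.

def cumulative_meta_analysis(trials):
    """Inverse-variance weighted pooling, updated trial by trial.

    trials: list of (effect, standard_error) tuples, in arrival order.
    Returns a list of (pooled_effect, pooled_se) after each trial.
    """
    sum_w = 0.0   # running sum of inverse-variance weights
    sum_we = 0.0  # running sum of weight * effect
    history = []
    for effect, se in trials:
        w = 1.0 / se ** 2
        sum_w += w
        sum_we += w * effect
        pooled = sum_we / sum_w            # current pooled effect
        pooled_se = (1.0 / sum_w) ** 0.5   # shrinks as trials accumulate
        history.append((pooled, pooled_se))
    return history

# Three hypothetical six-week trials (effect estimate, standard error):
history = cumulative_meta_analysis([(0.3, 0.15), (0.2, 0.12), (0.4, 0.20)])
```

The key property is visible in `history`: the pooled standard error shrinks with every added trial, which is exactly why many small teacher-run trials can, in aggregate, yield a precise estimate.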
By leveraging this approach, we gain a clearer picture of how different interventions perform across a variety of school contexts. More importantly, it fosters a dynamic, evidence-driven environment where teachers play an active role in generating and interpreting research. I’ve always believed that empowering teachers as co-creators in the research process is the key to engaging the profession in evidence-informed practice. Now, with our platform and Durham County Council’s collaboration, we had the foundation to scale this innovative approach, making rigorous, real-time research accessible to schools in a way that was both practical and impactful.
We took this concept - building a regional evidence base - to the North East Combined Authority. With support from Education Durham, we were fortunate to secure a grant to pilot the initiative in the spring and summer terms. This model is a global first: using teacher-led micro-trials to develop a cumulative meta-analysis. More importantly, it offers a chance to scale interventions based on robust evidence, enabling schools to identify what works best for them and improve learning outcomes for children in the region.
In December and January, we worked with a small group of schools to identify approaches and interventions that could be replicated more widely. The solutions they suggested were surprisingly straightforward - not driven by EdTech or overly complex. For example, one intervention focused on Year 1 number fluency through sequenced flashcards and retrieval practice. Another involved a simple visual framework to scaffold 2-3 mark inference questions in reading. These interventions are low-cost (often free) and flexible enough to be delivered at the class level or as targeted small-group support by teaching assistants.
We’ve also listened closely to feedback from teachers. One consistent piece of advice from last summer’s pilots was that reporting effect sizes as the main outcome measure wasn’t accessible. In response, we’ve moved effect sizes to a technical appendix and now emphasise percentage change as the primary outcome in the automated reports. We hope this small change will make a big difference in how teachers interpret and use the findings.
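To make the reporting change concrete, here is a minimal sketch of the two outcome measures. The function names and the worked figures are hypothetical, not drawn from the platform's actual reports; the effect size shown is the standardised mean difference (Cohen's d).

```python
# Two ways of reporting the same result from a pre/post trial.

def cohens_d(mean_pre, mean_post, pooled_sd):
    """Standardised mean difference - the measure now in the appendix."""
    return (mean_post - mean_pre) / pooled_sd

def percentage_change(mean_pre, mean_post):
    """Relative change in mean score - the headline figure for teachers."""
    return 100.0 * (mean_post - mean_pre) / mean_pre

# e.g. a class mean rising from 20 to 23 correct answers, pooled SD of 5:
# cohens_d(20, 23, 5)        -> 0.6   ("a moderate effect size")
# percentage_change(20, 23)  -> 15.0  ("scores rose by 15%")
```

The same trial result reads very differently in the two forms: "d = 0.6" requires statistical training to interpret, while "scores rose by 15%" does not.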
This past week, we launched the spring term 2 pilots with primary schools in County Durham, and the feedback has been overwhelmingly positive.
One email from a headteacher stood out:
“Really enjoyed this afternoon, and it was refreshing to find something that looks doable on so many levels.”
Getting to "doable" has been a long journey, but it’s a milestone worth celebrating. By making it easier for teachers to engage in evidence-building, we’re taking an important step toward scaling innovative, research-driven practices across the North East - and, eventually, beyond.