Building robust evidence is challenging
Generating robust evidence in education is costly, time-consuming, and often yields uninformative results. This creates challenges for large educational foundations and Edtech companies looking to build evidence for what works.
“Understanding why educational RCTs often yield small and uninformative effects should be seen as a priority for our field.”
Our Solution
WhatWorked Education provides a comprehensive service designed to facilitate and run small-scale robust evaluations using micro-randomised controlled trials (mRCTs).
Our evidence platform empowers teachers to independently set up and conduct these evaluations within their schools.
The problems we solve:
Cost
A key advantage of the platform is the potential cost saving in pilot evaluations, or in testing and optimising components of an intervention before scaling to large-scale RCTs.
Time
The platform reduces the risk of testing large-scale interventions that are likely to produce uninformative results, saving the cost, time and resources involved in running them.
Key features of our service
Automated Analysis and Impact Reporting
The platform streamlines the research process by supporting teachers in randomising their pupils, following a protocol, and entering pre- and post-test data. The platform then runs an automated ANCOVA to calculate an effect size.
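To illustrate the kind of calculation involved, the sketch below shows an ANCOVA-based effect size in Python: the post-test is regressed on the pre-test and a treatment indicator, and the adjusted group difference is standardised by the pooled post-test standard deviation. The column names (pre, post, treated coded 0/1) and the choice of standardiser are illustrative assumptions, not the platform's exact implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def ancova_effect_size(df: pd.DataFrame) -> float:
    """Covariate-adjusted standardised mean difference (sketch).

    Assumes columns `pre`, `post` and a numeric 0/1 `treated` indicator.
    """
    # ANCOVA: post-test on pre-test covariate plus treatment indicator
    model = smf.ols("post ~ pre + treated", data=df).fit()
    adjusted_diff = model.params["treated"]  # adjusted treatment-control difference

    # Pooled unadjusted standard deviation of the post-test
    g0 = df.loc[df["treated"] == 0, "post"]
    g1 = df.loc[df["treated"] == 1, "post"]
    pooled_sd = np.sqrt(
        ((len(g0) - 1) * g0.var(ddof=1) + (len(g1) - 1) * g1.var(ddof=1))
        / (len(g0) + len(g1) - 2)
    )
    return adjusted_diff / pooled_sd
```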
Cumulative Evidence Base
The data feeds into a cumulative meta-analysis to track the overall impact of the different interventions.
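As a rough illustration of how a running estimate could be maintained, the sketch below pools trial-level effect sizes with a fixed-effect, inverse-variance model, recomputing the summary as each new trial is added. The pooling model and the example numbers are assumptions for illustration, not the platform's exact method or real results.

```python
import numpy as np

def cumulative_meta_analysis(effects, std_errors):
    """Return (pooled effect, pooled SE) after each successive trial."""
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2  # inverse-variance weights

    pooled = []
    for k in range(1, len(effects) + 1):
        w, e = weights[:k], effects[:k]
        pooled_effect = np.sum(w * e) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        pooled.append((pooled_effect, pooled_se))
    return pooled

# Illustrative example: three small trials updating the running estimate
print(cumulative_meta_analysis([0.12, 0.25, 0.08], [0.10, 0.12, 0.09]))
```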
Scalability
Ideal for testing the promise of interventions, our platform helps determine whether they are ready to progress to large-scale RCTs.
Edtech Client Spotlight: Eedi
We are supporting Eedi to develop an evidence base for impact by testing the effectiveness of their product in schools. Take a look at the impact report published in June 2023 to see how we are using small-scale RCTs to develop a cumulative evidence base.