Dr Wayne Harrison

Peer Tutoring - Summer Term 2 Pilots

In March 2022, the Department for Education (DfE) updated the pupil premium guidance for the start of the academic year 2022–23, requiring schools to select from a ‘menu of approaches’ for funding targeted academic support (DfE, 2022a). Within targeted academic support, the following interventions can be used:

• Interventions to support language development, literacy, and numeracy
• Activity and resources to meet the specific needs of disadvantaged pupils with SEND
• Deployment of teaching assistants and interventions
• One to one and small group tuition
• Peer tutoring

The flagship government initiative for supporting learners impacted by the COVID pandemic was the creation of the National Tutoring Programme (NTP) as the main catch-up strategy, with the government white paper (DfE, 2022b) stating “we will deliver up to 6 million tutoring courses, each providing 15 hours of tutoring, by 2024”. This targeted strategy is not a cost-effective and sustainable model due to a range of challenges, which we will not go into here as they are well documented. The fundamental problem is funding, with the subsidy reducing to 60% in the academic year 2022–23 and then to 25% in the academic year 2023–24 (DfE, 2022c). Even before the recent unprecedented financial pressures on schools due to inflation, gas and electricity prices and increased pay awards, the future funding of the NTP would still be an issue.

Why peer tutoring?

At WhatWorked we decided to focus on developing the evidence base for a strategy not widely used in schools. A rapid review of published pupil premium strategy statements for primary and secondary schools in three local authorities confirmed only 1% used or planned to use peer tutoring. The low uptake is surprising, as the Education Endowment Foundation (EEF) Teaching and Learning Toolkit has highlighted this approach as a high-impact, low-cost intervention based on extensive evidence for the last decade (EEF, 2022). Uptake has not changed since a 2015 Sutton Trust survey, which also found only 1% of schools using peer tutoring strategies (Sutton Trust, 2015).

Therefore, we decided to start developing our concept for a WhatWorked evidence base focusing on KS2 Mathematics cross-age peer tutoring programmes for schools. Implementation is key to the success of any intervention, as highlighted by the EEF guidance report “Putting Evidence to Work – A School’s Guide to Implementation” (EEF, 2021). At WhatWorked, we developed a bespoke delivery programme providing an overview of the research evidence, step-by-step guidance for implementing the programme, the resources and training materials, and the option to evaluate the intervention using one of three strategies. Teachers have the flexibility to use teacher observations (1 Star), a pre-/post-test single-group design (2 Star) or a mini randomised controlled trial (3 Star).

For the 2 Star and 3 Star evaluations, teachers are provided with a protocol outlining the rationale, research design and planned data analysis. We also require teachers to register in advance of starting the intervention; otherwise we are unable to complete their analysis. For the initial pilot of the Year 3 / Year 5 cross-age peer tutoring programme for multiplication and division, 30 teachers enrolled in the programme, with nine pre-registering their evaluation. Please note, we do not require schools that decide to use teacher observation to register, as this approach does not involve any data analysis. Three schools selected the mini-RCT and six schools planned to use the pre-/post-test single-group design. For schools selecting the mini-RCT, the design of the evaluation allowed the control students to receive the intervention after the post-test, ensuring no learners were disadvantaged by not having access to the support.

COVID and a heat wave

At the start of summer term 2, we had optimistically hoped schools would have less disruption from COVID, but as we have all found out over the last two years of the pandemic, the virus can return with even more transmissible variants and create major disruption in schools. Within two weeks, six of the initial nine schools had postponed delivery of the programme to the Autumn term due to staff and pupil absences. Furthermore, a heat wave in the last week of term prevented another school from delivering the final lesson and post-test. Of the original nine schools, one completed the mini-RCT evaluation and another the pre-/post-test single-group evaluation.

Even though we encountered challenges, the feedback from schools has been positive, and we have shown that the online delivery model for supporting schools to plan, implement and evaluate an intervention works. Furthermore, the average time for a teacher to complete the online course to set up the intervention in their school was just under one hour (48 minutes).

Developing an evidence base

The main advantage we have at WhatWorked is the use of an innovative approach to evidence generation, as we aggregate the data from the mini-RCTs to create a cumulative meta-analysis for each intervention. Therefore, even though our initial pilots were impacted by the disruption of COVID and an extreme heat wave, we are able to start the evidence base with data from a small number of schools. As more schools replicate, implement and evaluate the intervention in the next academic year, we will be able to update the overall effect size as the sample size increases.
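
As a rough illustration of how this aggregation can work, the sketch below pools per-school effect sizes using a fixed-effect inverse-variance model, a standard meta-analytic approach. The numbers and the function name are hypothetical, and our production pipeline may differ in detail.

```python
import numpy as np

def pool_effect_sizes(effects, variances):
    """Fixed-effect inverse-variance pooling of per-school effect sizes.

    Each school's effect size is weighted by the inverse of its
    variance, so more precise estimates count for more. Re-running
    the pooling as each new school reports updates the cumulative
    meta-analysis.
    """
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(weights * effects) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    return pooled, pooled_se

# Hypothetical effect sizes and variances from two pilot schools.
pooled, se = pool_effect_sizes([1.79, 1.10], [0.20, 0.15])
print(f"Pooled effect size: {pooled:.2f} (SE {se:.2f})")
```

So, how do we analyse the data at WhatWorked?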

Data analysis

The analysis we use shows the amount students improved compared to their starting point (Gain). If a control group is used it will, more informatively, show how much students improved compared to similar students not benefiting from the intervention, similar to asking ‘what would have happened if the students had not received the peer tutoring?’

Our analysis reports the size of the impact (Effect Size) using a standard technique in education research. This shows the size of the effect relative to the ‘noise’ in the data. Effect size provides a number on the same scale as other educational interventions, allowing easy comparison of the effectiveness of different interventions.
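
For readers unfamiliar with the calculation, one common standardised effect size (Cohen’s d) divides the difference in mean gains by the pooled standard deviation of the two groups. The sketch below uses hypothetical data and is not necessarily the exact formula we apply in every report.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Standardised mean difference: (mean_a - mean_b) / pooled SD."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Hypothetical gain scores (post-test minus pre-test) for each pupil.
print(f"Effect size: {cohens_d([5, 6, 4, 7, 5, 6], [2, 1, 3, 2, 2, 1]):.2f}")
```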

Data is statistically analysed using techniques at the standard required for publication in peer-reviewed research journals. Analysis of Variance (ANOVA) is used to show whether natural ‘noise’ in the data is a more likely explanation for results than the intervention.
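
As an illustration, the ANOVA step can be run with a standard statistics library; the pupil data here are hypothetical.

```python
from scipy import stats

# Hypothetical gain scores for the intervention and control pupils.
intervention_gains = [5, 6, 4, 7, 5, 6]
control_gains = [2, 1, 3, 2, 2, 1]

# One-way ANOVA: is the between-group difference larger than the
# within-group 'noise' would plausibly produce by chance?
f_stat, p_value = stats.f_oneway(intervention_gains, control_gains)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```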

To get a clear picture of the amount the intervention improves attainment, it is regarded as best practice to ‘filter out’ differences between intervention and control students; for example, one group is often slightly higher-performing at baseline. Where a control group is used, Analysis of Covariance (ANCOVA) will adjust the Effect Size for differences between groups at baseline. In line with the high-quality reporting requirements of scientific publications, results are adjusted for the distortion that occurs with small numbers of students. Furthermore, intention-to-treat analysis will be used in all WhatWorked impact evaluations.
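
A baseline adjustment of this kind can be sketched as an ANCOVA-style regression of post-test scores on group membership with the pre-test score as a covariate; the data below are hypothetical and this is only one way to implement the adjustment.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pupil-level data: pre- and post-test scores by group.
df = pd.DataFrame({
    "group": ["intervention"] * 6 + ["control"] * 6,
    "pre":  [8, 9, 7, 8, 9, 8, 8, 9, 8, 7, 9, 8],
    "post": [13, 14, 12, 13, 15, 13, 10, 11, 10, 9, 11, 10],
})

# ANCOVA as a linear model: post ~ group + pre. The 'group' coefficient
# estimates the intervention effect adjusted for baseline differences.
model = smf.ols("post ~ C(group, Treatment('control')) + pre", data=df).fit()
print(model.summary())
```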

How effective were the peer tutoring pilots?

The initial pilot focused on a Year 3 / Year 5 cross-age peer tutoring programme for the topic of multiplication and division.

Pre-/Post-Test Single-Group Design

The research design for the pre-/post-test evaluation does not include a comparator group; therefore, the effect size of the intervention should not be compared directly with those from the mini randomised controlled trials. However, the pre-/post-test design does allow a gain score to be calculated to show the impact of the intervention.
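
One common way to express a single-group gain as an effect size is to divide the mean gain by the standard deviation of the gains; the sketch below uses hypothetical scores, and the exact formula applied in our reports may differ.

```python
import numpy as np

# Hypothetical paired scores for one group (no comparator).
pre = np.array([7, 8, 6, 9, 8, 7, 8, 9], dtype=float)
post = np.array([9, 13, 7, 12, 14, 9, 12, 12], dtype=float)

gains = post - pre
# Standardised gain: mean gain divided by the SD of the gains.
effect_size = gains.mean() / gains.std(ddof=1)
print(f"Mean gain: {gains.mean():.2f}, effect size: {effect_size:.2f}")
```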

The initial intervention sample size was n = 10 for both tutees and tutors. For the post-assessment, 2 tutees and 3 tutors were absent, giving an analysis sample of n = 8 tutees and n = 7 tutors.

The evaluation found that, on average, the mathematics skills of pupils who received the intervention improved for the multiplication and division topic. The tutees’ effect size of 1.64 is large and demonstrates that the intervention had a positive impact on pupil attainment. The effect size of 0.15 for the impact on tutors shows a smaller gain; however, peer tutoring is also expected to bring wider benefits to the Year 5 pupils.

Mini-RCT Evaluation

The initial intervention sample size was n = 25: n = 13 in the control group and n = 12 in the intervention group. All students completed the pre- and post-assessments, so there is no attrition or missing data for the school. The pre-test mean was 8.27 for the control group (n = 13) and 8.08 for the intervention group (n = 12). The post-test mean was 10.08 for the control group and 13.33 for the intervention group.
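
From the reported means, the raw average gains can be computed directly; the standard deviations needed to reproduce the effect size reported below are not given here, so this sketch stops at the gains.

```python
# Average gain per group, from the reported pre-/post-test means.
control_gain = 10.08 - 8.27         # 1.81 points
intervention_gain = 13.33 - 8.08    # 5.25 points
print(f"Control gain: {control_gain:.2f}")
print(f"Intervention gain: {intervention_gain:.2f}")
```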

The school completed the pre-assessments with the Year 5 peer tutors to help assess gaps in knowledge prior to the start of the intervention. No post-assessments were completed with the peer tutors, so we are unable to assess the impact of the intervention on the tutors.

The evaluation found that, on average, the mathematics skills of pupils who received the intervention improved at a faster rate than those of pupils in the control group for the multiplication and division topic. The tutees’ effect size of 1.79 is large and demonstrates that the intervention had a positive impact on pupil attainment.

Limitations

It is important to note that the effect sizes from the mini-RCTs should not be compared with those from large-scale RCTs due to a number of design features, such as the small sample size, non-standardised assessment and short delivery timescale. However, as more schools complete the evaluation, we will be able to build a cumulative meta-analysis providing a more robust overall effect size for the intervention. It is very encouraging to see large effect sizes for the pilots; overall, peer tutoring has demonstrated that it is a good bet for improving mathematics attainment in the context of these schools. But we must treat these results with caution until we start to receive data from schools running the intervention in the next academic year.

What did we learn from the summer term 2 pilots?

Firstly, we would like to thank all the schools who registered to evaluate the peer tutoring programme, as we understand that summer term 2 in primary schools can be challenging due to other commitments in the calendar. The extra pressures of a new COVID variant and an extreme heat wave also created unexpected disruption.

The second lesson learned is that a flexible online delivery model for supporting teachers to implement and evaluate an intervention is feasible. Thirdly, our initial data show that, if implemented correctly, structured peer tutoring can have a positive impact on learning and be delivered at a very low cost.

At WhatWorked, we are now focusing on the new academic year with the aim of developing a robust evidence base for the impact of peer tutoring in primary schools. As peer tutoring is included in the targeted academic support in the ‘menu of approaches’ for pupil premium funding, we hope to support schools to build a multi-year strategy for embedding peer tutoring.

We are offering all primary schools the opportunity to pilot a free peer tutoring programme in the Autumn term, so please use the link below to sign up for the free course:

https://interventions.whatworked.education/course/100-challenge-year3-free-multiplication-division-peer-tutoring

After the pilot, there is no commitment for the school to purchase our annual licence (£275 per school).

References

Department for Education (2022a). Using the Pupil Premium: guidance for school leaders. March 2022.

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1066915/Using_pupil_premium_guidance_for_school_leaders.pdf

Department for Education (2022b). Opportunity for All: Strong schools with great teachers for your child.

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1063602/Opportunity_for_all_strong_schools_with_great_teachers_for_your_child__print_version_.pdf

Department for Education (2022c). National Tutoring Programme: guidance for schools. July 2022.
 
https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1089182/NTP_Guidance_for_Schools.pdf

Education Endowment Foundation (2022). Teaching and Learning Toolkit.
https://educationendowmentfoundation.org.uk/education-evidence/teaching-learning-toolkit

Education Endowment Foundation (2021). Putting evidence to work – A school’s guide to implementation.
https://educationendowmentfoundation.org.uk/education-evidence/guidance-reports/implementation

Sutton Trust (2015). Pupil Premium: Next Steps Report.
https://www.suttontrust.com/wp-content/uploads/2019/12/Pupil-Premium-Summit-Report-FINAL-EDIT-1.pdf



