Since being founded in 2018, ImpactEd has grown to help over 150 organisations and 1,000 schools better measure their impact. Having recently relaunched as ImpactEd Evaluation within ImpactEd Group, we’re committed to sharing more on what that data can tell us about how best to support young people.
Using evidence to inform decision making is not an academic exercise. We recently published the story of Jamal, a student at Miltoncross Academy whose educational journey has been transformed through support from a 1:1 mentoring initiative. His story is not unique: the best educational interventions really do make a difference. But for time-poor professionals, effective evaluation is easier said than done. We want to change that.
Over the course of this year, under the continuing theme of Making It Count, we will share regular insights from the evaluations we support, signposting effective practices for practitioners and policymakers to consider. Sign up to our newsletter for notifications of future releases.
Priorities in the education sector: what does data from our platform tell us?
At ImpactEd Evaluation we’ve recently analysed trends from the data collected through our School Impact Platform, our digital tool helping schools and those who work with them to better understand their impact (see methodology below for further details). This data represents over 1,150 schools and organisations, and so provides valuable insights into changing priorities in the education system over the last five years. It includes insights from individual schools and Trusts, as well as data derived from strategic partners such as Challenge Partners.
1) Consistent priorities: what hasn’t changed
With data going back to 2018, we can paint a picture of how the focus areas of schools and organisations have varied over time. In some ways, however, what is most notable is what hasn’t changed:
Of all areas our partners evaluated, pupil wellbeing and mental health was the most common primary focus. This represented 767 unique evaluations, nearly 13% of all those included in our dataset.
Other stated priorities included reading and literacy (7% of evaluations conducted), behaviour (6%) and STEM (4%).
The dataset also reflects our strong involvement in evaluations of tutoring (9%) and of enrichment and outdoor education (9%), among other areas.
This data reflects the changing educational landscape over time. For example, wellbeing-focused evaluations constituted 26% of all evaluations on the platform in 2019/20, as schools monitored the impacts of the March 2020 lockdown. Although wellbeing remains a concern for many schools, in 2022/23 the proportion of evaluations focused on it returned to 11%.
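As a rough check, the headline share above can be reproduced from the two figures published in this post (the 767 wellbeing evaluations and the 6,075-evaluation total from the methodology note). This is illustrative arithmetic only, not the original analysis:

```python
# Reproducing the headline share from the published counts.
TOTAL_EVALUATIONS = 6_075   # total evaluations, from the methodology note
wellbeing = 767             # wellbeing and mental health evaluations

share = wellbeing / TOTAL_EVALUATIONS * 100
print(f"Wellbeing and mental health: {share:.1f}% of evaluations")  # 12.6%, i.e. "nearly 13%"
```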
2) Measurement matters: looking at other indicators of success
As well as considering what interventions organisations evaluated, we also looked at what measures of impact they used beyond traditional academic outcomes. Wellbeing was the most consistent measure, but beyond this pupils’ motivation, metacognition and self-efficacy were the most popular areas.
With over 70% of evaluations on the School Impact Platform having used at least one measure of social and emotional skills, it is clear that schools and organisations value the ability to look at educational impact holistically.
Specific skills of interest do vary year on year. In the 2022/23 academic year, pupil motivation and school engagement saw the largest increases in usage relative to the previous year. This aligns with an increasing focus on attendance and engagement, on which more below.
3) Attendance and behaviour: the challenge of engagement with education
Excluding areas where we have a number of national partnerships (e.g. outdoor learning), our most notable increases in evaluation focus were in attendance and behaviour. While this will be influenced by developments on our School Impact Platform, nearly five times as many attendance-focused evaluations took place on the platform in 2022/23 as in 2021/22; for behaviour, the figure was around 2.5 times as many.
Our Understanding Attendance project has been developed to address precisely this need: equipping schools to better understand the drivers of pupil absence in their settings and develop effective, targeted strategies. But there are no easy solutions: the rise in evaluations of attendance and behaviour, and the increase in popularity of our school engagement measure (used in 21% of all evaluations on the platform last year), illustrate a fundamental challenge in pupils’ engagement with education. We will be sharing more soon on how pupil belonging affects engagement, and the implications for educators.
4) How you do it: evidence and implementation and the role of technology
Across all evaluations, one of the lessons of our data is variability: how you approach an intervention is at least as important as what it is.
This is perhaps most pronounced in our data on the use of education technology. Our Evaluating Edtech paper shows a number of edtech programmes where evaluation has demonstrated statistically meaningful impacts. However, this generally only occurs where implementation and intended impacts are clearly planned for, by ensuring:
Clearly defined outcomes for the use of any edtech tool
Appropriate measures to assess impact, including validated scales and feedback tools where end-of-year exam results may be too lagging an indicator
Deliberate attention to accessibility issues, with planning to level the playing field
This is backed up by external research: analyses such as the EEF’s review of edtech show positive but small effect sizes, with the most notable finding being the range of impact outcomes recorded. Edtech companies are increasingly aware of this: our Talking Impact research series recently highlighted how organisations can better evaluate their products.
Watch this space - what next?
This data provides insights into the priorities for schools and education leaders. Most important, however, is working together to address the challenges our data suggests.
Under the continuing theme of Making It Count, we'll be doing this in two main ways.
First, we'll work closely with our strategic partners to provide practical case studies and solutions. As a first step, we'd recommend consulting Challenge Partners' thematic review of the approaches their 550 partner schools have taken to collaboration in challenging circumstances. Showing similar themes to our data, the review also provides practical school-level examples of solutions to these enduring concerns.
Second, we'll continue to share further insights from our evolving dataset through events and regular publications. Our next release will focus on lessons we are learning on addressing pupil absence - sign up to our newsletter to be notified when this goes live.
Methodology
All data is taken from the School Impact Platform, our digital tool supporting better quality monitoring and evaluation for schools and those who work with them.
Data was analysed for the last five years, using keywords to draw out common themes in the areas evaluated through the platform.
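This kind of keyword-based theming can be sketched minimally as below. The themes and keyword lists here are hypothetical illustrations only; the actual keyword sets and platform data used in the analysis are not published:

```python
# Illustrative only: tag evaluation descriptions with themes via keyword matching.
# These themes and keywords are invented examples, not ImpactEd's actual lists.

THEME_KEYWORDS = {
    "wellbeing": ["wellbeing", "mental health", "anxiety"],
    "reading and literacy": ["reading", "literacy", "phonics"],
    "attendance": ["attendance", "absence", "persistent absentee"],
}

def tag_themes(description: str) -> list[str]:
    """Return every theme whose keywords appear in the description."""
    text = description.lower()
    return [theme for theme, words in THEME_KEYWORDS.items()
            if any(word in text for word in words)]

print(tag_themes("1:1 reading intervention for persistent absentees"))
# → ['reading and literacy', 'attendance']
```

Each evaluation can match several themes, which is why theme percentages across a dataset like this need not sum to 100%.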
The data included is from 6,075 evaluations in 1,177 schools. These schools are those working with ImpactEd either directly or through our partner education organisations, so may not be nationally representative.
Evaluations involved pupils of all ages, although there were more evaluations at secondary than primary level: 2,489 evaluations involved primary-age pupils and 3,806 involved secondary-age pupils.
30% of pupils involved in evaluations were eligible for pupil premium and 18% were pupils with SEND status. This compares to national averages of 27% of pupils eligible for pupil premium and 13% of pupils with SEND status.
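Expressed in percentage points, the sample over-represents both groups; a quick sketch using the figures above:

```python
# Comparing the evaluation sample with national averages (figures from the post).
sample = {"pupil premium": 30, "SEND": 18}    # % of pupils in evaluations
national = {"pupil premium": 27, "SEND": 13}  # % national average

for group in sample:
    diff = sample[group] - national[group]
    print(f"{group}: {diff:+d} percentage points vs national average")
# pupil premium: +3 percentage points vs national average
# SEND: +5 percentage points vs national average
```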