Claire Green

Making Data Drops Work for Students and Staff


Whilst admittedly not the most scintillating of topics, ‘data drops’ are an ever-present feature of school calendars, so ensuring they are used to support optimal student progress is key. Whilst sixth form leaders work with a whole range of data, including attendance and destinations, this blog post sets out a methodology for the timely and effective use of internal curriculum data drops, so that meaningful interventions can take place between calendared data points.

 

What is the purpose of data collection in post-16?

·       To enable staff to support optimal outcomes for students

This is surely the fundamental reason to gather regular data throughout students’ time in the sixth form.  The gathering of accurate and regular attainment and progress data should allow sixth form staff teams to support students to maximise their chances of success in each of their chosen courses.

·       To measure impact

Data can also help to assess the impact of actions taken by staff and students.  Assuming data is compared over time, it is possible to ascertain the positive or negative effects of interventions and initiatives.

·       To update stakeholders

As sixth form leaders, we are often asked to update stakeholders – whether that is providing updates for staff, the Senior Leadership Team and governors, or preparing for internal reviews or external inspection.

 

What data should we collect?

 

If you are new to the role, and have the opportunity to have a say in the creation of data systems in your setting, I think this is a really important question.  In my view, the following considerations should be made:

 

·       Mirroring published headline measures nationally

Ultimately, the Department for Education publishes headline data for 16-18 providers.  These ‘performance measures’ have varied in recent years due to the impact of Covid, so it is advisable to ensure your staff are aware of what will be published nationally.

·       Data that will allow staff to intervene most effectively

If the purpose of data collection is to optimise student performance, then we must ensure the data we collect is useful in this process and allows staff to identify students who are underperforming most significantly, in order to target interventions strategically.  Staff time is precious, so knowing which students require additional support means their time can be directed to ensure optimum impact.

·       Comparisons of data over time

If one of the key purposes of data collection is to measure impact, it is essential that comparisons are made over time.  This could be from one data point to the next, or between cohorts – for example, comparing your 2024 mock exam results with last year’s final results or last year’s mock results.  Comparing over time gives updates to stakeholders more impact – particularly if you present the data visually, with colour-coding to show improvement or decline (a simple illustration of this is sketched after this list).

·       Use of data analysis tools, e.g. Sisra, 4Matrix, FFT Aspire etc

There are now a whole range of data analysis tools that schools can utilise to interpret and analyse data.  Whichever ones your school has access to, it is important you utilise them to ensure you have a granular understanding of all aspects of the data you collect.  They take the hard work out of the analysis so, whilst getting to grips with different platforms can be time-consuming at first, it is certainly worth it in the long-run.

·       Attainment, progress and learning indicators (effort etc)

Given that performance measures at 16-18 include both attainment (average grades) and progress (Level 3 Value Added is being published again this year, having been absent since 2019 due to Covid), it is important that our analysis covers these aspects too, so we have a real-time understanding of our potential outcomes throughout students’ time in the sixth form. Gathering effort data in some form is also key – cross-referencing effort with attainment and progress often allows staff to identify impactful actions for improvement.

·       How often do you collect data for each year group? 

There is a balance to strike here: overly frequent data collection leaves insufficient time to implement actions in response to the data; infrequent data collection could mean students’ underperformance goes unchallenged for too long.  We have opted for three data drops per year in both Years 12 and 13 (all in-class assessments except data point 3 in June of Year 12 and data point 2 in January of Year 13, which are full suites of mock exams).
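For anyone who prefers to script the comparison-over-time step rather than build it in a spreadsheet, the sketch below shows the basic idea of colour-coding the change between two data points. It is only an illustration, not part of our actual workflow: the column names, points values and file name are hypothetical, and tools such as Sisra or Excel conditional formatting achieve the same result without any code.

```python
import pandas as pd

# Hypothetical data: one row per student per subject, with average points
# at two consecutive data drops (DP1 and DP2).
df = pd.DataFrame({
    "Student": ["Student A", "Student B", "Student C"],
    "Subject": ["Biology", "Biology", "History"],
    "DP1": [30, 40, 20],
    "DP2": [40, 30, 20],
})
df["Change"] = df["DP2"] - df["DP1"]

def shade(change):
    # Green for improvement, pink for decline, no colour for no change.
    if change > 0:
        return "background-color: lightgreen"
    if change < 0:
        return "background-color: pink"
    return ""

# Export a colour-coded grid to share with stakeholders.
# (Styler.map needs pandas 2.1+ and openpyxl; older pandas versions use applymap.)
df.style.map(shade, subset=["Change"]).to_excel("dp_comparison.xlsx", index=False)
```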

 

Who sees the data and how is it presented?

Simplicity here is key.  Consider what each group wants from the data and present it for them as clearly and concisely as possible.

 


For this reason, I have one format of presentation for staff, our Senior Leadership Team and governors, and a much simpler, grid-style format for students and their parents/carers.

 

·       Staff, SLT and governors:

These stakeholders should have an understanding of the global data set, in line with the published performance measures.  Therefore, at each data drop, I work through the same process: creating a summary PowerPoint, sharing it with staff, SLT and governors, and collecting responses from subject leaders via a Google Form.



 

To create the PowerPoint, I export the relevant data sets from Sisra, then edit them in Excel to streamline the data, focusing on headline measures (in line with published performance measures), key groups (e.g. SEND, Disadvantaged, prior attainment), specific students in the SEND and Disadvantaged categories, subjects, and individual students (highlighting, in particular, those underperforming most across subjects).  This consistency of communication across stakeholders means that all staff have a shared understanding of the key areas of focus. The ‘Next Steps’ slide gives subject leaders and their line managers the required follow-up actions and prompts for discussion in meetings, again to ensure a level of consistency.
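The streamlining itself is essentially grouping and averaging. The sketch below shows the equivalent of that Excel step in Python, purely as an illustration: the data, column names and key-group fields are hypothetical and do not reflect an actual Sisra export format.

```python
import pandas as pd

# Hypothetical data drop: one row per student per subject.
export = pd.DataFrame({
    "Student":       ["A", "A", "B", "B", "C", "C"],
    "Subject":       ["Maths", "History", "Maths", "History", "Maths", "History"],
    "SEND":          ["Yes", "Yes", "No", "No", "No", "No"],
    "Disadvantaged": ["No", "No", "Yes", "Yes", "No", "No"],
    "Points":        [40, 30, 20, 30, 50, 40],
    "TargetPoints":  [40, 40, 30, 30, 40, 40],
})

# Headline measure: average points per subject.
print(export.groupby("Subject")["Points"].mean().round(2))

# Key groups (e.g. SEND, Disadvantaged) compared with the whole cohort.
print("Cohort:", round(export["Points"].mean(), 2))
for group in ["SEND", "Disadvantaged"]:
    print(group + ":", round(export.loc[export[group] == "Yes", "Points"].mean(), 2))

# Individual students underperforming most across subjects:
# largest average gap below target.
export["Gap"] = export["Points"] - export["TargetPoints"]
print(export.groupby("Student")["Gap"].mean().nsmallest(3))
```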

 

Screenshots of the PowerPoint slides I create are shown in the PDF below, with all data removed.  I deliberately compare the current data drop with the previous one throughout the PowerPoint and show both average grade and VA (even though VA is not currently accurate, as Sisra is still calculating it on the 2019 coefficients, it still allows us to identify the students underperforming most within each subject and to track progress within a subject or group over time). Sisra helpfully offers a ‘compare’ feature which colour-codes the data green or pink, depending on whether it has improved.



Subject leaders then complete a Google Form in response (see the image below) – this means I can be sure that all subject leaders have reviewed their data and responded to it, specifically by identifying the actions they are taking as a subject to improve outcomes.  There is a question on the form asking whether subject staff require support from the sixth form team/SLT – this allows me to filter the responses for those who do and implement actions in response.  The key aspect of this is that it requires no additional meetings outside of regular line management meetings.  If I had to meet with all subject leaders, it would involve 25 meetings at each data drop, which is clearly not a smart use of anyone’s time!  The outlined approach means that meetings then take place only if a subject has shown a decline in outcomes over time or there are concerns or changes on the horizon. The Google Form responses can be downloaded as a spreadsheet, which allows the sixth form team and SLT to gain an overview of the whole Key Stage 5 curriculum and the interventions taking place.  Subject leaders are also automatically sent a copy of their responses, which they can upload to their Subject Improvement Plans to evidence their actions in response to the data, avoiding duplication of work for them too.
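Filtering the downloaded responses for support requests is a one-line step in most tools. Below is a minimal sketch of that step; the column headings and example rows are hypothetical, since in practice they will match the exact wording of your form's questions, and the same filter can be applied directly in Google Sheets or Excel.

```python
import pandas as pd

# Hypothetical rows standing in for the downloaded Google Form responses.
responses = pd.DataFrame({
    "Subject": ["Biology", "History", "Maths"],
    "Planned actions": ["Targeted revision sessions", "Re-teach key topics", "1:1 support"],
    "Support needed from sixth form team/SLT?": ["Yes", "No", "Yes"],
})

# Keep only the subjects asking for support, so follow-up can be targeted.
needs_support = responses[responses["Support needed from sixth form team/SLT?"] == "Yes"]
print(needs_support[["Subject", "Planned actions"]])
```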

 

·       Students and parents/carers:

These stakeholders have a very different need when it comes to the data shared – they want to know, in clear and simple terms, how the student is doing at each data point.  When we communicate home, we therefore report on:

o   Attendance to date

o   Minimum and Aspirational target grades: as a whole school, we use FFT20 and FFT5 for our target setting.

o   Most recent assessment grade

o   End of course prediction

o   Learning indicators: effort & focus, meeting deadlines, organisation (we use +, = and – for these as they are easily understood by parents/carers, many of whom do not have English as a first language in our setting).


This simplicity of communication means students and their families gain a useful snapshot of progress at each calendared report.  Follow-up conversations at parents’ evenings and with tutors provide the supporting detail.


 

What other methods are useful for responding to the data?


·       Mentoring weeks built into tutor time calendar 

This allows tutors to review students’ data with them in 1:1 conversations immediately following the issue of reports.

·       Additional targeted SLT parents’ evenings after each data drop in Y13 

This is something we introduced in 2024.  Each member of the Senior Leadership Team is allocated two or three students who are significantly underperforming or ‘below target’ across subjects, and they arrange a 20-30 minute, in-person conversation after each data drop to set targets with the student.  We hope that including parents and carers in these conversations will have more impact. These are in addition to the annual Year 13 parents' evening.

·       Formal mock results day 

In February, following Year 13 mock exams, we hold a ‘results day’ where students come along to the school library in groups to collect their results envelopes.  They then have a one-to-one conversation with a member of the sixth form team or SLT to review their results and set targets moving forwards.

·       No study leave

This summer we took the decision to remove study leave ahead of the final exams, to optimise students’ time with their specialist subject teachers.  Alongside lessons, we introduced a ‘Revision Wrapped Up’ programme to provide additional support throughout this time.  I wrote about this previously here.

·       Signposting students to support

I wanted to include this point because it is absolutely fundamental.  Without sufficient pastoral and wellbeing support, none of the above methods will lead to optimal student progress – I have written about some of our approaches to this before, in this post, and our provision has since been enhanced further with the addition of support from our local NHS Mental Health Support Team. Effective support and signposting is essential if students are to optimise their outcomes.

 

 

All schools will have different processes and systems for data collection, analysis and response.  Whilst I know we will continue to tweak our systems over time, what I hope the above demonstrates is that creating systems that are clear, consistent and free of duplicated work gives school staff more time to implement effective responses to the data. 

 

Ultimately, this is not about the data.  Whilst it is great to see improvements on a spreadsheet, what is much more impactful is seeing joyful and proud students and families on results days, excited for their life beyond school, having secured their chosen post-18 destination after achieving optimal outcomes – this, and this alone, should be what drives our approach to data.

