Collecting quality data to demonstrate progression.

From 2021, every Department of Education school in New South Wales has been required to develop and upload a Strategic Improvement Plan (SIP) to its school website. The SIP outlines each school's improvement targets alongside three strategic directions on which it will focus between 2021 and 2024. Within the SIP, a school states the metrics it will use to record its development and improvement in each strategic direction. Once completed, the SIP is reviewed and signed off by the Director overseeing the local area. Schools are expected to refer to their SIPs throughout each school year to support them in meeting their expected outcomes.

With 2021 the first year this process was completed state-wide, Eds. reviewed 20 SIPs from one local government area (sample A) to assess quality, consistency and feasibility. Specifically, we examined whether the metrics chosen would allow Executive staff to measure progression within each strategic direction. Sample A comprised secondary schools, primary schools and schools for specific purposes from a metropolitan area of Sydney, with enrolments ranging from 30 full-time students to over 1,000.

As part of our analysis, we looked to understand:

  • Number of metrics to be used as evidence overall per school
  • Number of metrics to be used as evidence for each strategic direction
  • Percentage of qualitative vs quantitative data collection

By selecting these three focus areas we were able to examine whether schools could feasibly evidence progression using quality data.
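
To make these three measures concrete, here is a minimal sketch of how they could be tallied for a single school. The record structure, field names and sample metrics below are hypothetical illustrations, not data drawn from sample A.

```python
from collections import Counter

# Hypothetical transcription of one school's SIP metrics.
# Each metric is tagged with its strategic direction (1-3) and
# whether the data it yields is qualitative or quantitative.
sip_metrics = [
    {"name": "Reading assessment results", "direction": 1, "kind": "quantitative"},
    {"name": "Student focus groups",       "direction": 2, "kind": "qualitative"},
    {"name": "Parent satisfaction survey", "direction": 2, "kind": "quantitative"},
    {"name": "Attendance rates",           "direction": 3, "kind": "quantitative"},
    {"name": "Teacher reflections",        "direction": 1, "kind": "qualitative"},
]

total = len(sip_metrics)                                      # metrics overall
per_direction = Counter(m["direction"] for m in sip_metrics)  # metrics per direction
kinds = Counter(m["kind"] for m in sip_metrics)               # qualitative vs quantitative
pct_qualitative = 100 * kinds["qualitative"] / total

print(f"Total metrics: {total}")
print(f"Per strategic direction: {dict(sorted(per_direction.items()))}")
print(f"Qualitative share: {pct_qualitative:.0f}%")
```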

Our findings

On average, each school cited 26 data sources to be used as evidence. This means that between 2021 and 2024 each school aims to collect 26 data sets, analyse the findings, possibly visualise the outcomes, and use this evidence to continue or amend teaching programs to promote progression. Is this feasible in reality? Possibly, though several factors could affect that conclusion. One may be the disruption COVID-19 has had on education to date: when will students, teachers and parents be in a stable position to reflect on their learning and provide viable survey responses, work samples and reflections? Another is how a school keeps track of 26 data sets, and whether it has the resources or staffing to do so. At present the teaching workforce is under enormous pressure to deliver the fundamentals of education. Even if all of the above variables were removed and collecting 26 data sources were possible, the next question would be whether this is sustainable within a single school year or whether the work would be split across three years. Ideally, we would want to see a pattern of improvement longitudinally; however, 8 or 9 data sources a year would be far more manageable to collect.

Within the SIP format, schools are asked to assign data sets to each strategic direction. We explored this breakdown, asking whether there were similarities or differences in the volume of data to be collected. Our findings showed that schools intended to collect slightly more data to support their strategic direction 1 outcomes (an average of 10 data sets), whilst strategic directions 2 and 3 had the same volume (an average of 8 data sets each). These figures varied between schools, however, with some collecting up to 14 data sets for strategic direction 1 whilst others sat lower at 7.

Ultimately, while the volume of data collected is important, the biggest factor in whether it can be effectively collected and used comes down to the time and resources it takes to gather. To better understand this, we analysed the percentage of qualitative vs quantitative data schools planned to collect. Research shows that whilst qualitative data often leads to more informed outcomes, it can take longer to collect and analyse. This returns us to the question: do schools have the time and resources to complete this task? Across the 20 schools sampled, 35% of the data to be collected will be qualitative and the other 65% quantitative. Interestingly, schools on average planned to collect more qualitative data when evidencing the outcomes of strategic direction 2, which across the 20 schools was most commonly linked to improving teaching quality or wellbeing.
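
The aggregate figures above come from averaging per-school tallies across the sample. A sketch of that step follows; the numbers are invented for illustration and are not the real sample A tallies.

```python
# Hypothetical per-school tallies (invented, not the sample A data).
# Each row: (data sets for SD1, SD2, SD3, qualitative count).
schools = [
    (14, 8, 7, 9),
    (7, 9, 8, 8),
    (10, 7, 9, 10),
]

n = len(schools)
avg_per_direction = [sum(s[d] for s in schools) / n for d in range(3)]
total_data_sets = sum(s[0] + s[1] + s[2] for s in schools)
pct_qualitative = 100 * sum(s[3] for s in schools) / total_data_sets

print(f"Average data sets per direction: {avg_per_direction}")
print(f"Qualitative share across the sample: {pct_qualitative:.0f}%")
```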


Conclusion

From these findings, it could be argued that data analysis across the 20 schools can only be conducted at a surface level, given the number of sources proposed for collection. Eds. would suggest focusing on 4 or 5 metrics per strategic direction to ensure in-depth analysis that allows trends to be followed and outcomes to be accurately measured. This suggestion aligns more closely with the corporate practices from which the methodology of key performance indicators and performance plans is taken.

By measuring 4 or 5 outputs per strategic direction schools could:

  • Accurately measure their outputs regularly and analyse real-time data
  • Make informed decisions using clear data sources
  • Educate and train staff in data practices
  • Design specific data collection solutions relevant to their settings and communities

The Eds. Team would like to leave you with this idea: if we expect schools to run as businesses and produce outputs similar to those of businesses, should we not provide better training and resources to ensure they can produce the outputs expected of them by government and taxpayers?
