Like many school leaders I have been exploring a better way of monitoring the impact of INSET. I am convinced of the importance of CPD as a crucial investment in staff even in times of financial stricture. Perhaps especially in those times. That belief, however, does not cut the mustard when it comes to proving that the time, money and other resources invested in training have paid dividends in terms of pupil outcomes.
Long-term investment in CPD
Over the last few years we have shifted the balance from ‘away day’ courses to long-term training. We support colleagues through academic qualifications such as Masters degrees and professional ones such as the MLDP. This demonstrates commitment to the long-term development of colleagues, provides tangible benefits to the school and sits well within our commitment to being a community of lifelong learners.
This type of professional development is easy to evaluate. The colleague gains a recognised qualification, and the action research element is always a key area of the SIP, contributing to clear outcomes for pupils.
There is still a place for the INSET day. There is training that we all need to renew, such as safeguarding, as well as updates on the national and local agendas which affect teachers and pupils. We also use the time for colleagues to share good practice and teaching tools they have developed. I have found it harder to evaluate the impact of this training. For many years I used staff evaluations, having colleagues rate sessions on an evaluation form. The trouble with this kind of customer satisfaction survey is that everyone may have a jolly good time, but will that have a positive impact on the experience of pupils? It also seems that the less satisfied colleagues are the very ones who don’t tend to fill in the ‘Even better if…’ part.
Consequently I, and those in charge of training at other schools within Oxford City Learning (a partnership of secondary schools), have become increasingly sceptical of the ‘Trip Advisor’ approach.
In my evaluation of our September INSET, I carried out a correlational analysis of staff members’ helpfulness ratings for the different sessions against their key learning points, and of helpfulness against intended actions. In neither case did I find much in the way of a correlation:
Helpfulness vs. Learning: r = 0.2576
Helpfulness vs. Intentions: r = -0.1832
Neither result was statistically significant.
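For anyone who wants to run the same kind of check on their own evaluation forms, the Pearson correlation behind those r values can be computed in a few lines of Python. This is a minimal sketch, and the ratings below are invented placeholder numbers, not our school’s actual data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation for two paired lists of ratings."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance term (numerator) and the two standard-deviation terms.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical data: one pair per staff member — a 1–5 helpfulness rating
# and the number of key learning points they recorded on the form.
helpfulness = [4, 5, 3, 4, 2, 5, 3, 4]
learning_points = [2, 1, 3, 2, 2, 3, 1, 2]

r = pearson_r(helpfulness, learning_points)
print(round(r, 4))
```

A value of r near zero, as we found, means the two measures barely move together; for a significance test alongside the coefficient, `scipy.stats.pearsonr` returns both r and a p-value.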
The commentary from staff on learning is more useful than their ratings of helpfulness because it allows me to identify whether the intended impact of the training was achieved. By and large, this seems to have happened. Most staff commented on strategies for the groups of pupils we were focussing on: disadvantaged students, those with particular disabilities and special needs, and those who need to make rapid progress with their literacy. Learning comments also indicate that some staff made links between the separate elements of the INSET: from our school value of ‘Justice’, through a consideration of developmental needs from ages 2–19 (we’re an all-through school with nursery, primary and secondary phases), to differentiating to meet the needs of particular groups of students. On the other hand, some staff did not make this connection, and a few questioned the relevance of some aspects of the training. This shows me that, while there is always a balance to be struck, perhaps particularly at the start of term, we need to do more to give a holistic overview in the introduction to the training.
Evaluation across the year
It’s the follow-through on the intended actions that will be the key to evaluating the impact on pupil outcomes. Further INSET will pick up on strategies to improve progress by the groups of pupils mentioned above. The session with the highest combined rating for helpfulness, key learning and intended actions (on extended writing across the curriculum) featured a combination of research evidence, contributions from teachers from different curriculum areas, demonstrations of improved pupil work and a resource pack for all teachers. This suggests that teacher-led examples of successful practice, backed by resources to support their colleagues, are a winning combination. Further INSET during the year will be based on this model, with differentiated choice so that training is personalised.
It is actual, rather than intended, action that makes the difference, of course. We will use our usual evidence-gathering systems (learning walks, marking drops, student voice, etc.) to gauge the impact of teaching actions stemming from INSET. CPD is also picked up in Performance Development (we don’t use that vile term ‘Appraisal’, urgh), with all teachers having an objective around the progress of disadvantaged pupils this year, and being able to shape a personalised CPD objective.
I’ll update this post later in the year when more of our intentions have been implemented as actions.
Helpful comments are always welcome. I’d also like to hear more about evaluation of CPD in other schools.