ResearchEd: Turning School Improvement and SEF into a research project to develop credible evidence

In the last eight years, I have had the privilege of working with the Headteachers and Senior Leaders of a number of Primary and Secondary schools, helping them wrestle with the SEF and development planning process. I think that of all the visits I make to schools, the discussions in this area are the most interesting and demanding I ever have.
I think it's helpful to work on the principle that a “completed packet of work”, which is a completed action or activity in a development plan, should only ever be handled once and curated in such a manner that it makes its presence felt at the point of need.

The other principle is that a packet of work has the following constituents:

  1. A specified goal.
  2. Specified work leading to that goal.
  3. A specification of the evidence that will be collected to validate the impact of the action.

Handling packets of work once is achieved by meta-tagging the improvement plan with tags such as the SEF question, the person responsible for delivery and the area of improvement in the school. This concept was developed by Geoff and Mal Broadbent, then of Skills Factory, now of 2Eskimos, in the product they developed called schoolcentre.
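
For the technically minded, here is a rough sketch of what a tagged work packet might look like; the field names and the little filtering function are my own illustration, not schoolcentre's actual data model:

    from dataclasses import dataclass

    @dataclass
    class WorkPacket:
        goal: str                # the specified goal
        work: str                # the specified work leading to that goal
        evidence_plan: str       # the evidence to be collected to validate impact
        sef_question: str        # meta-tags: SEF question...
        owner: str               # ...person responsible for delivery...
        improvement_area: str    # ...and area of improvement in the school
        completed: bool = False

    def packets_for_sef_question(packets, question):
        # Pull every completed packet tagged to a given SEF question, so the
        # evidence surfaces at the point of need without being handled again.
        return [p for p in packets if p.sef_question == question and p.completed]

    plan = [
        WorkPacket(
            goal="Improve feedback in KS2 maths",
            work="Assessment for learning CPD programme, autumn term",
            evidence_plan="Pre/post staff confidence words; book scrutiny in December",
            sef_question="Quality of teaching",
            owner="Maths lead",
            improvement_area="Assessment for learning",
            completed=True,
        )
    ]
    print(packets_for_sef_question(plan, "Quality of teaching"))

The point is that the packet is written once and then surfaced by whichever tag is needed: SEF question, owner or improvement area.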

The element of this process that fascinates me most at the moment is the concept of developing each of those work packets into what is effectively a small-scale research project. I am avoiding the concept of “actions”, which always seems to me to be a euphemism for “outbursts of busyness leading to no measurable benefit”.

A work packet is more than an outburst of busyness. It requires:

  1. Development of an hypothesis.
  2. Consideration of the most appropriate methodology to gather useful data to verify the improvement.
  3. Consideration of ethics with regard to publication or causing general misery to children.
  4. Reporting on the impact of the work undertaken.
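
To make that concrete, here is a small sketch of how those four elements could be treated as a checklist before a packet is signed off; the field names are purely illustrative and not drawn from any real planning tool:

    RESEARCH_ELEMENTS = ("hypothesis", "methodology", "ethics_note", "impact_report")

    def missing_elements(packet):
        # Return the research elements still missing before a packet can be closed.
        return [e for e in RESEARCH_ELEMENTS if not packet.get(e)]

    packet = {
        "hypothesis": "AfL CPD will increase teacher confidence in giving maths feedback",
        "methodology": "Pre/post confidence words from staff, then a follow-up book scrutiny",
        "ethics_note": "No pupil-level data published; all responses anonymised",
        "impact_report": "",  # not yet written, so the packet stays open
    }
    print(missing_elements(packet))  # ['impact_report']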

The key to this seems to me to be the redefinition of the concept of success criteria, which are often a splendid exercise either in restating the planned work, or in not defining the best outcome of the work at all and instead specifying some global school improvement target, such as the percentage of children to progress to a certain level.

If the specification of success criteria is replaced by the specification of a plan to gather evidence of success, then the work packet is often redefined into something more tangible and measurable, and the leader of the work packet is more focused on benchmarking the situation at the beginning and monitoring evidence of improvement during the process. This has had a dramatic impact on the thinking of the leadership teams I have discussed it with.

The next logical step is to explore how to gather data to verify the anticipated improvement. The key here is to explore more than just pupil progress data, given that small-scale innovations such as precision teaching (see diatribe at the bottom) are unlikely to impact on pupil progress data measured in sub-levels or points of progress. Small-scale innovations will probably aggregate to impact on those levels eventually, but a more subtle way of establishing the impact of the innovation needs to be developed.

For example:

If a school is implementing some CPD on assessment for learning strategies in maths, as part of a strategy to improve the quality of teaching, then the CPD itself is not going to have a distinctly measurable impact upon standards on its own, but it is going to have an impact upon the disposition of teachers to try the new processes and their confidence in their teaching. So the trick is to find a way of measuring the confidence of the teachers at the beginning and end of the process.

Which probably means beginning the CPD with some intelligence-gathering questions, and not necessarily in the form of a paper survey:

Give people a post-it note as they arrive and ask them to write a word associated with their confidence (scared, happy, dreading it) on the sticky side and stick it on the wall. Then do the same at the end on a different piece of wall. Once the teachers have left, you can analyse the change in language and therefore the change in disposition. The data can then inform the next phase of the CPD. If you are particularly devious you can “randomly” give out colours of post-it notes to particular groups of teachers to gain insight into their specific disposition, highlighting key stages etc. [The teachers will be so delighted with the sticker and the big pen they won’t notice the colour coding.]
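
If you want to be a little more systematic about the before/after comparison, a simple word count will do; a rough sketch, with the word lists and the crude negative/positive split invented purely for illustration:

    from collections import Counter

    before = ["scared", "dreading it", "curious", "scared", "fine"]
    after = ["relieved", "confident", "curious", "confident", "fine"]

    NEGATIVE = {"scared", "dreading it", "worried"}  # crude, invented classification

    def summarise(words):
        counts = Counter(words)
        negative_share = sum(n for w, n in counts.items() if w in NEGATIVE) / len(words)
        return counts, negative_share

    for label, words in (("before", before), ("after", after)):
        counts, negative_share = summarise(words)
        print(label, dict(counts), f"negative share: {negative_share:.0%}")

A drop in the share of anxious language between the two walls is exactly the kind of shift in disposition you are looking for, and colour-coded notes would let you run the same comparison per key stage.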

This approach can also enable you to derive quick data on the attitudes of children to their learning and an insight into the impact the CPD had upon their confidence and disposition, which should lead to an early indication of progress towards the holy grail of raised standards.

Diatribe:

I heard today of a snake oil salesman peddling precision teaching in a school – it turns out that precision teaching is in fact precision testing and seems to suggest that if you repeatedly test the little buggers until they crack, they will in fact learn something.
