Have you evaluated
the impact of your CPD?

Understanding whether your CPD is working shows you how to improve learning within your institution at every level.

Numerous studies have suggested that as few as 7% to 25% of schools effectively evaluate whether their CPD provision is having an impact on pupils’ learning.


This is a very real problem: we simply don’t know whether what we’re doing is working. Without that knowledge we have no chance of improving by doing more of what works well and stopping what is less effective. Nor do we know whether we are getting a good return on the time and money we invest in CPD.

More than a satisfaction survey

When CPD evaluation does take place, it is more often than not limited to a ‘happiness/satisfaction’ questionnaire. We can and should do better than this. We need to know more than whether the teachers enjoyed the lunch!


The ultimate purpose of CPD is to change classroom practice, improve teacher quality and improve students’ learning. The evidence, however, suggests that this is currently neglected and under-valued in our schools. Research in this area repeatedly finds that the impact on teacher quality and students’ learning is rarely evaluated and, ‘if done, is rarely executed effectively’. Evaluation really matters, but it is clearly the weak link in the CPD chain. So what can we do about it?


Build Evaluation in: Start with the end in mind

Firstly, evaluation needs to be built into CPD from the outset. Think of evaluation as the golden thread that should run all the way through effective CPD. The key is to build it into our planning by starting with the end in mind.


Starting with the end in mind provides clarity of purpose and creates a natural evaluative focus. Did you have the impact you were aiming to achieve?

The second important thing we need to do is understand what the research and effective practice in this area tell us. As Dr. Janet Goodall notes, ‘schools are generally not skilled in the processes of evaluation and lack experience and tools to consider the impact of CPD.’ This is the challenge: evaluating the impact of teacher learning on something as complex as pupils’ learning is genuinely difficult for schools.


Our Impact Analysis Tool


To support schools in achieving this vitally important objective, our partners LearnEd have developed an inbuilt Impact Analysis Tool that provides an easy and effective way to measure the impact of CPD.


This approach draws on the leading research in this area to create a systematic way for schools to capture this information. It is based on Professor Thomas Guskey’s 5 Levels of CPD Impact Evaluation:


Level 1: Teachers’ reactions - this is where 90% of evaluation starts and ends

Level 2: Teachers’ learning

Level 3: Organisational support

Level 4: Teachers’ application of learning

Level 5: Pupils’ learning outcomes


The Impact Analysis Tool captures both quantitative and qualitative data on CPD impact. The quantitative data at each of the 5 levels gives schools an immediate overview of the impact across the whole school.


Our system also captures richer, deeper qualitative data from teachers, allowing schools to develop a more holistic evaluation at the individual level, and lets teachers upload supporting evidence.


Asking the right questions

The key to making all this work is to pose the right questions for adult learners. The right ‘diagnostic’ questions facilitate deep reflection in teachers and can help unlock the information you need. We’ve worked with experts to develop these questions and to embed them into our Impact Analysis processes and tools. 


Three final points


Look for evidence, not proof.

Evaluating the impact of teacher learning is complex; evaluating its impact on pupils’ learning and outcomes is even more so. So many factors affect a child’s education that isolating ‘one variable’ is simply impossible. It is very unlikely that we can ever establish definitive ‘proof’ that x causes y. It is therefore important to understand that we are not trying to measure in the way a scientific (positivist) study, such as a vaccine trial, would. Rather, we need to look for ‘evidence’ that helps us understand the wider impact.


Timing matters. True impact takes time to see.

The impact of changes in practice on pupils will take time to see - typically at least two terms, maybe longer. The timing of the evaluation process needs to reflect this. How do you know when to complete it? The simple rule is: when it is possible to do it. If a teacher can’t yet answer the questions or provide any evidence, they should come back to it later. When they can, that’s the time to complete the process. They will know.


Finally, not everything needs to be evaluated.

Trying to evaluate your one-hour safeguarding update training against all 5 levels won’t work and won’t help. Equally, not everything needs to be fully evaluated, or is worth the time and effort of a full impact evaluation. Again, this is quite intuitive, and it should be easy to see whether it is worth doing. Look back at the CPD aims: if the aim was to achieve a particular impact on pupils, it can and should be evaluated.