Evaluating the effectiveness of professional development
Research | 6 May 2019 | 5 minute read

Louise Wignall describes an ACER evaluation of professional development programs for Vocational Education and Training (VET) teachers and assessors.
The Australian Council for Educational Research (ACER) was engaged by the VET Development Centre (VDC) to evaluate its 2018 professional development program for Providers of Accredited Training and Learn Local Organisations.
Funded by the Victorian Department of Education and Training, the VDC program focuses on delivering workforce development training to improve the knowledge, skills and practice of VET teachers, trainers and assessors, and to improve the learning environment, experience and outcomes for all VET learners.
VDC CEO Martin Powell says, ‘The initial findings of the ACER independent evaluation have been invaluable for validating our approach to professional learning as well as enabling us to refine our PD offering.’
The evaluation addressed six key questions. ACER researchers mapped these questions to the Kirkpatrick Model of program evaluation, drawing on the work of Thomas Guskey in applying the model to educational contexts.
| Research question | Kirkpatrick Model level |
|---|---|
| Who registered and who attended the programs? | |
| What were participants’ motivations and reasons for attending? | |
| What were participants’ initial reactions to the workshops? | 1. Reaction |
| What skills and knowledge did participants acquire? | 2. Learning |
| To what extent have skills and knowledge been applied back on the job? | 3. Behaviour |
| What are the perceived impacts and benefits of participation in the programs? | 4. Results |
Level 1 reactions from program participants were gathered through feedback forms completed at the end of the professional development sessions delivered from June to October 2018. Four to eight weeks after each session, ACER sent participants a 10-question online survey to gather reflections and observations at Levels 2 to 4. A sample of participants was also selected for pre- and post-attendance phone interviews: before attending, to capture what they expected from a particular PD session, and afterwards, to draw out whether that expectation had been met.
Based on the evidence gathered, the evaluation has found that the program has made an important contribution to the continuing professional development of the VET workforce. The program has been well-subscribed and well-supported by those who attended. The results of the evaluation show that participants responded favourably to the workshops and webinars.
Participants reported that they were able to learn and/or refresh skills and knowledge that they intended to use to drive change in their organisation. The majority were already in the process of applying what they learnt when contacted to participate in the evaluation.
ACER provided VDC with recommendations on how the program could be built upon to ensure it remains effective, efficient and relevant to its various stakeholders. A follow-up evaluation is planned to measure VDC’s 2019 program against this baseline and to better understand the medium- to longer-term impact of participation in the workshops.