Designing Impact: Integrating Accredited CPD Activity into the Workforce

Paper presented at the International Professional Development Association annual conference, 9th–10th December 2005

Hillscourt Conference Centre, Rednal, S. Birmingham, UK

Steven J. Coombs: Bath Spa University

Alison S. Denning: Bath Spa University

Key words: Impact, Postgraduate Professional Development, pedagogical bandwidth

Abstract

This paper outlines the approach that Bath Spa University (BSU) is adopting towards meeting the ‘impact’ funding requirements of the Training and Development Agency’s (TDA) Postgraduate Professional Development (PPD) programme in England. The department for Continuing Professional Development (CPD) at BSU has recently appointed a CPD Quality Assurance (QA) project manager. The remit for this new post involves implementing a wide-ranging set of reforms and new policies within the CPD department. These objectives include designing a set of new procedures and instruments for both capturing and building impact into the current range of in-service accredited CPD modules now funded under the aegis of the PPD. Working alongside BSU tutors to help them rethink the design of such modules within the PPD QA framework forms part of a new university professional development programme. In addition to designing QA evaluation instruments to report impact evidence annually to the TDA as part of a wider research evidence base, we have also begun to run internal module development workshops to identify and map the TDA’s desired gradations of Impact and Outcome evidence criteria against the Ofsted self-evaluation form (SEF) categories required of all state schools in England. Apart from building more professional relevance into the redesign of our PPD-funded CPD in-service modules, this approach meets another key TDA requirement: to obtain such evidence from the school system without imposing any additional administrative burdens upon schools.

Introduction and Context of CPD in England

A major earlier review of the award-bearing in-service scheme in England by Soulsby & Swain (TDA, 2003) www.teachernet.gov.uk/docbank/index.cfm?id=4129 highlighted differing levels of success with regard to the type and amount of evidence gathered by in-service providers relating to impact on individuals, schools and pupils’ learning. It appears that the impact of in-service (INSET) provision can be demonstrated more readily in terms of impact on individuals and schools, whereby Soulsby & Swain maintain that it is ‘less clear that award-bearing INSET has a direct and demonstrable effect on pupils’ standards’ (p. 12).

As a consequence of this report, and of the demand for more evidence of the impact of accredited CPD programmes upon learners within schools, the TDA restructured the old award-bearing INSET scheme and launched a new version entitled Postgraduate Professional Development (PPD), which adopts a new approach to its reporting mechanisms. All providers engaged in the new PPD programme, which replaced the old in-service programme from 1st September 2005, are now required to self-evaluate their programme and report to the TDA on an annual basis, identifying and quantifying the impact of the institution’s PPD provision on all learners, including pupils, teachers and the wider workforce operating in schools.

It is interesting to note that the English Office for Standards in Education (Ofsted) (2004) has recently reported that CPD/HEI providers tend to evaluate only the impact on participant learners (teachers on INSET courses) but generally not on their clients, i.e., pupils and students. In its recent review of ‘Research and Evaluation to Inform the Development of the New Postgraduate Professional Development Programme’, the TDA (2005) also identified Ofsted’s major finding:

‘Of the 15 providers inspected in the Ofsted (2004) review, almost all monitor the impact of training on participants but fewer than a quarter make any attempt to evaluate the impact on pupils’ (p. 14).

In the initial TDA proposals, which invited HEI providers to submit their funding programmes for the new PPD during October 2004, there was no clear definition of the nature and scope of impact as required within the stated criteria. This raised the question of what impact really means. To address this problem of the ontological assumptions underpinning the definition of impact, the BSU PPD submission to the TDA contained an explicit declaration of its understanding of impact and of how it might relate to BSU’s own policy for designing future CPD programmes for the PPD, and therefore to its evaluation instruments.

The BSU PPD submission document given to the TDA in October 2004 gave a holistic definition of impact and a rationale for the CPD department’s ontological assumptions. Our concern was that the original TDA framing of impact might simply assume a behaviouristic agenda, i.e., a simplistic measurement of pupil learning outcomes only, without proper reference to the wider social context of teachers operating in schools as learning organisations. This concern was also highlighted by providers in Ofsted’s (2004) report on impact, which noted in annex 24 that:

‘While recognising the importance of monitoring the impact of their courses, several providers raised serious doubts about the feasibility of linking improvements in pupils’ attainment to a teacher’s participation in a course and of separating such effects from other school improvement initiatives’ (p. 12).

Indeed, in CPD Update (2005) Cliff Jones interviewed the new Chair of the University Council for the Education of Teachers (UCET) CPD committee, Kit Field, and asked him to clarify the meaning of ‘impact’ beyond the obtaining of better exam and test scores for pupils. The response given was:

‘Impact is multi-layered. Impact can relate to pupils. This may be an improvement in their motivation, behaviour and attitudes as well as attainment. For teachers, participation [in accredited CPD] may lead to greater job satisfaction, better, broader and deeper understanding. For schools it may lead to new forms of provision, improved networks, and improved attendance and attainment figures…. However, the impact of good teaching may take years to recognise. Impact is complex, and we can seek out indicators. We cannot prove, however, that exam results have improved because a teacher attended a course’ (p. 7).

Within the CPD department at BSU we have decided to define the concept of impact in the following way:

  • That impact means an improvement in learning for the learner.
  • That the learners include all the social players and stakeholders engaged in the learning process.
  • That we have identified the learners as being pupils/students, teachers and their peers, school leaders, and the wider workforce, including parents and governors.
  • That impact embodies the concept of school improvement within a learning organisation.

It is from these assumptions about the nature of impact that we have started to build a new CPD framework that informs the instructional design of our individual course modules and of the wider programme, referred to as the Professional Master’s Programme (PMP). These assumptions also underpin the design rationale of any PMP evaluation instruments, which are intended both to inform the improvement of our own CPD and to report useful findings to the TDA.

We also note Robinson and Sebba’s findings (TDA, 2005) for the TDA’s PPD, in which they report that most schools and organisations ‘do not have well established processes for evaluating impact but CPD itself can contribute to the development and implementation of evaluation strategies’ (p. 1). This means that building impact into the very processes and activities of CPD professional learning modules within the PMP at BSU is one of our core objectives for delivering useful PPD for participants and their schools.

As the literature clearly states (Reeves et al., 2003), there are many difficulties surrounding the specific measurement of the impact of teachers’ CPD on pupils as learners. The collection of data relating to impact on schools may also be difficult, as the level of impact of a CPD activity may not be apparent at the end of the programme. Its relevance may only become apparent once the participant initiates and engages in change within their school environment and evaluates the impact of the changes that they have made. This will be suitable for some CPD activities, for example work-based action enquiries, but will not be appropriate for all activities upon their initial completion. Therefore not all CPD activities will have the same types of impact upon all the constituent learners. For example, the more theoretical modules on research methods will tend to provide essential underpinning prior knowledge for other CPD research-led activities that are engaged in the school improvement process, for instance the BSU work-based enquiry. Clearly, an action enquiry conducted without proper knowledge of a research framework and methodology would lead to less valid research evidence being obtained from the project. This raises the need to discriminate between direct and indirect types of impact evidence and to report the integrated effects of impact evidence across a programme’s entire set of modules. This is acceptable, as the TDA has strategically decided to provide PPD funding for entire programmes that meet its criteria rather than for individual modules, which was the case for former funding streams awarded for accredited CPD, where every individual course had to be declared and assessed against the in-service criteria of the time.

In addition to the issues stated above, the TDA now also requires that the evidence collected to assess impact should not result in a further burden on the participants’ activities in school, but should instead be produced as part of the natural process of the CPD activity. This CPD design issue raises a number of questions about how we can integrate the TDA’s requirements for impact evidence whilst also ensuring that the evidence collected does not impose on the day-to-day workload of the participant engaged in their school.

A further consideration for PPD providers, as indicated in Soulsby & Swain (2003), is the fact that many of the teachers who engage in CPD activities wish to do so purely for the vocational benefits that the activity may have on their own professional development and on the development of the children that they teach, not necessarily because they wish to receive an academic award. This professional concern was also raised in CPD Update (2005), where the following question was put to the new Chair of the UCET CPD committee: ‘is accredited CPD useful to teachers and others or simply one more burden?’ (p. 6). The response argued that accreditation provides a professional means of ‘sense making’ that can ultimately contribute to the evidence base of the profession through newsletters and journals. It is clear that the requirement to produce CPD accreditation at Master’s level is off-putting, as many teachers still have negative perceptions of academic-related CPD; hence the notion that this type of CPD activity is a barrier in its own right. What is actually putting teachers off? This issue of ‘barriers to participation in CPD’ was also identified by the TDA (2005) through Robinson and Sebba’s report, in which teachers reported an overall lack of time both to participate and to implement changes stimulated by useful CPD activities. However, this report also identified that many teachers still perceive accredited CPD as being restricted to attendance at academic courses and conferences rather than engagement in useful work-based projects. This important issue explains why the TDA PPD funding requirements also place emphasis on providers rethinking the design of their CPD programmes so as to reduce these perceived barriers, and on considering new methods of delivery and participation so as to improve overall accessibility.

This vocational/academic paradox has in part been addressed by the TDA through allowing the individual awarding bodies of higher education providers to rethink and redefine the academic policies that constitute ‘completion’ of any CPD activity, such as a course module. Indeed, upon further clarification of this issue, the TDA recently announced the following statement via a circular sent to all PPD providers:

PPD providers must follow their own awarding body’s definition of what constitutes course completion. The TTA does stress, however, that a completion does not necessarily mean that participants have successfully gained an academic award at the end of the programme.

(TTA ITT Funding Team, ref. 05/QAF/0810, Annex B, p. 3).

Following this clarification of completion, coupled with the other important QA funding requirement of improving and increasing teachers’ access to CPD opportunities within the PPD, we have produced a new CPD ‘levels of engagement’ policy document that allows us to encourage all qualified teachers to participate in and complete CPD programmes with the aim of achieving useful vocational and/or academic targets. In order to design a credible CPD programme under the auspices of the PPD funding framework it is necessary to take into account and integrate all of the following interpreted quality assurance (QA) objectives:

  • To proactively lever and capture impact evidence of learning from all beneficiary learners.
  • To design new and innovative CPD opportunities that provide greater flexibility and access to accredited modules within our funded PPD programme, thereby reducing ‘barriers’ to teacher participation.
  • To redefine and report successful ‘completion’ by participants on our CPD programme in a way that does not necessarily lead to an academic award but recognises useful professional contributions achieved through participation within vocationally applied academic frameworks, e.g. through the instigation of CPD action enquiry projects in schools.

It is the aim of this paper to identify the mechanisms by which Bath Spa University proposes to both integrate and ‘converge’ the above interpreted QA funding requirements of the TDA’s PPD programme.

Designing CPD Impact Research and Evaluation Instruments

In summary, we have identified the following key PPD CPD design issues, based upon programme requirements that foster a culture of professional learning:

  • The need to integrate CPD impact evaluation evidence across the normal range of professional tasks undertaken by programme participants.
  • To improve the accessibility and flexibility of the CPD programme for all potential participants.
  • To ensure that the programme design offers a flexible range of delivery and assessment mechanisms that enable meaningful professional learning tasks which personally impact on the quality of teaching.
  • To encourage greater completion of CPD programmes by increasing the pedagogical bandwidth to cover a wider range of acceptable vocational and academic targets.
  • To also encourage greater progression of teachers towards achieving professional learning at Master’s (M) and Doctoral (D) level awards for the greater benefit of the profession, i.e., via disseminating action research PhD projects that provide useful new knowledge towards the evidence base of the teaching profession.

An important issue for the design of impact research and evaluation instruments has been the TDA requirement that the gathering of such evidence should not cause any additional burden on the participants’ workload. As a result of this requirement we have invested university-based CPD development time in investigating new ways of adapting the instruments that we propose to use to gather our impact evidence. Our concept was to seek a method whereby the data-gathering process could provide useful parallel information towards the participant’s need to obtain self-evaluation evidence for Ofsted on behalf of their school. Our thinking comes from the previously stated assumption that CPD impact evidence can also be related to the school improvement plan, which requires similar evidence for a school’s self-evaluation form (SEF) for Ofsted (2005) (http://www.ofsted.gov.uk/schools/sef.cfm). This logical breakthrough gave us the idea of holding a BSU CPD staff development workshop to explore and connect both sets of criteria, i.e. Ofsted’s SEF categories and our conceptual framework for the various gradations of impact evidence we seek as outcomes of our CPD activities (see Table 1).

A significant amount of work has been invested in researching the various potential instruments for integrating the two sets of criteria, i.e., the TDA PPD funding requirements and the new school self-evaluation agenda for Ofsted. All School of Education (SoE) staff at BSU were invited to participate in the development of the Impact and Outcomes framework exhibited in Table 1. The initial BSU staff development session was a practical workshop in which about a dozen participants were asked to map the Ofsted self-evaluation criteria against the TDA PPD funding criteria. This was in order to compare and contrast those common areas that could both satisfy the TDA impact requirements and also yield parallel evidence that individual participants could feed into their school’s Ofsted self-evaluation form. The argument is that, since such impact evidence is already required for Ofsted, the collation and analysis of the same evidence for the PPD cannot be regarded as an additional workload burden.