Evaluation Capacity Building

A Strategy

for

Catholic Schools in South Australia

Presented to Brad Shrimpton

For 460-625 Project

Presented by Paul Sharkey

Date: November 2009

Table of Contents

1. Introduction 4

1.1. Purpose of the Discussion Paper 4

1.2. Definition of Evaluation Capacity Building 4

1.3. Purpose of the CESA ECB strategy 5

1.4. Structure of the Discussion Paper 6

2. Key environmental considerations 6

2.1. Organisational Structure, Leadership and Culture 7

2.1.1. Organisation’s Structure and Resources 7

2.1.2. Culture and Leadership 8

2.1.3. CESA Strategic Plan 8

2.1.4. Benchmarking and Evaluation 9

2.2. The Catholic Church and its Schools 10

2.2.1. Evaluation politics 10

2.2.2. Changing times in the Church 11

2.2.3. Relationship between School and Parish 13

2.2.4. Outreach Competencies 16

3. Evaluation Capacity Building 16

3.1. Expectations, Motivations and Assumptions 16

3.1.1. The benefits of Evaluation Capacity Building 17

3.1.2. Criticisms of the ECB strategy 18

3.1.3. An ECB strategy that avoids the pitfalls 19

3.2. Capacity Building for a Range of Evaluation Types 20

3.2.1. Proactive Evaluation 21

3.2.2. Participatory Evaluation – Including the Participation of Students 21

3.2.3. Empowerment Evaluation 22

3.2.4. Developmental Evaluation 23

3.2.5. Impact Evaluation 23

3.2.6. Implementation Evaluation 23

3.2.7. Monitoring Evaluation 23

3.3. Design considerations 24

3.3.1. General Design Issues 24

3.3.2. ECB Outcomes 25

3.3.3. ECB Teaching and Learning Strategies 26

3.4. Implementing the ECB Initiative 28

3.5. Sustainable Evaluation Practice 29

4. Conclusion 31

5. Appendix A: Family Centred Church Handout 32

6. Appendix B: A resource for site leaders 35

7. References 38

Paul Sharkey [Student ID 198620233] – Negotiated Project 460-625: August 2009

This report is confidential, having been written for the purpose of assessment in a subject delivered by the University of Melbourne: 460-625 Negotiated Project. The only persons entitled to have this report are those associated with the assessment of 460-625. Any wider publication of it, either in part or in whole, may place the publisher in breach of privacy, privilege or copyright.

1.  Introduction

Leaders of Catholic Education in South Australia (CESA) are increasingly committing themselves to evidence-based program and policy development. There is a growing sense that we need to understand our work as the challenge of continuous self-improvement grounded in sound evaluation practice. In this context, we need to develop a deeper understanding of evaluation concepts and practices, and we need to take steps to ensure that a culture of evaluation permeates ‘the way we do things around here’. We need a strategy that builds a capacity for evaluation at every level of our organisation: the individual, the school and the system of schools. This strategy is referred to in this paper as the CESA ECB Strategy – the Catholic Education in South Australia evaluation capacity building strategy.

Catholic Education in South Australia is a diverse community of Catholic schools spread across the State. Our system comprises 102 schools with 5,000 staff and 42,000 students, and we are the largest employer in South Australia outside of government agencies. Eighteen governing authorities have jurisdiction over the system of schools; the largest are the Archdiocese of Adelaide, with 73 schools, and the Diocese of Port Pirie, with 13. The prospect of initiating an evaluation capacity building (ECB) strategy in this complex and diverse system of schools is the subject of this Discussion Paper. Issues associated with undertaking the strategy are discussed with a view towards developing it in 2010.

1.1.  Purpose of the Discussion Paper

The purpose of the paper is to discuss some of the issues that need to be addressed if the evaluation capacity building strategy is to be undertaken in our system of schools.

1.2.  Definition of Evaluation Capacity Building

Compton and Baizerman (2007) noted after a recent think tank on Evaluation Capacity Building (ECB) that no two definitions of ECB presented during the think tank were the same. ECB is at a relatively early stage of development, with a variety of approaches and perspectives evident in the practice of its exponents. Whilst some ECB practitioners are untroubled by the fact that approaches to ECB are still evolving, others seek ‘explicit rules of intension and extension’ so that what is included in and excluded from ECB might be made more explicit (Compton & Baizerman, 2007).

The ECB tent is still big, it is still open, and there is still room to conceptualize, implement, assess, analyze, and write. When is that time over? We don’t know; that’s an empirical question (Compton & Baizerman, 2007).

Whilst some writers seek a clearer operational definition for ECB, others argue that practitioners are better served by avoiding definitional rigidity and embracing a ‘situational responsiveness’ in which ‘different geographical, cultural, and organizational realities’ can be accommodated and notions of ECB can flex accordingly (Taut, 2007b). This Discussion Paper accepts the tension between operational clarity and situational responsiveness; as ECB develops over time, a consensus will no doubt emerge regarding the fruitful approaches and the cul-de-sacs in the field. The following definition of evaluation capacity building is well known and widely accepted (Preskill & Boyle, 2008), and it is used as the working definition of ECB in this Discussion Paper because it is readily applicable to the strategy we are looking to develop.

ECB is the intentional work to continuously create and sustain overall organizational processes that make quality evaluation and its uses routine (Stockdill, Baizerman & Compton, 2002).

A model developed by Preskill and Boyle (2008) has been used as a basis for conceptualising the ECB strategy being envisaged for implementation in Catholic Education in South Australia (CESA). The model is reproduced in Figure 1 and the sections of this Discussion Paper have been organised around its components.

1.3.  Purpose of the CESA ECB strategy

The ECB strategy aims to strengthen evaluation capabilities within Catholic Education in South Australia. Evaluation capabilities are not an end in themselves; they are being developed because they provide a means for educators to develop policies and programs based on evidence about what students need and what works for them in terms of their formation and learning. The purpose of the CESA ECB Strategy is to develop evaluation capabilities at all levels within the system: at the level of individual educators and at the school and system levels.

Figure 1: Evaluation Capacity Building Model - Preskill and Boyle (2008)

1.4.  Structure of the Discussion Paper

The evaluation capacity building model reproduced in Figure 1 provides a basis for organising this Discussion Paper. The outer rectangle of the model recognises that ECB strategies are not implemented in a vacuum: they are implemented in a particular environment, and it is this environment that shapes what is possible as the strategy unfolds. The first issue considered in the environmental scan (Section 2) is the organisation’s structure, its leadership and its learning capacity. Particular environmental factors, such as a system-wide strategic plan and the accountability requirements of government, are also considered in this section. Section 2.2 considers a major environmental issue for us at this time: the relationship between the Catholic Church and its schools. The Catholic Church is currently undergoing a process of radical and profound change, and sound evaluation practices will be invaluable as Catholic Education moves forward with the Church to establish policies and practices based on the real needs of those who experience our care. It is perhaps not surprising that I have chosen to include a sub-section on ‘Evaluation Politics’ at this point of the paper, because the debates within the Church and within the education sector are subject to powerful forces at this time.

Section 3 of the Discussion Paper moves from the outer rectangle of the model in Figure 1 to the circle on its left-hand side. Here, the motivations, assumptions and expectations that people have in relation to evaluation capacity building processes are considered under the following headings: the benefits of ECB, criticisms of ECB, and an ECB strategy that avoids the pitfalls. There are many different types of evaluation, and an ECB strategy needs to consider which forms of evaluation it is going to build a capacity for. These forms are considered briefly in Section 3.2. A number of design considerations for the ECB strategy are then considered in Section 3.3. Following Hanwright and Makinson (2008), the ECB strategy is viewed through a change management lens, and issues such as complexity and compatibility are discussed. The discussion of ECB design considers the range of ECB outcomes found in the literature, teaching and learning strategies for ECB, implementation issues, and the development of sustainable evaluation practice as the ultimate goal of the strategy.

The paper concludes in Section 4 with a brief snapshot of ECB in concrete practice. Sample ECB resource papers have been developed and included in Appendix A to give a feel for what the CESA ECB Strategy might look like in the field.

2.  Key environmental considerations

The evaluation capacity building project can be framed as an exercise in change management – a change from our current attitudes towards evaluation to a more sophisticated use of sound evaluation practice as a basis for continuous self-improvement. The challenges associated with cultural change such as this ought not to be underestimated. Schools and school systems have been described as having ‘complex behavioural ecologies in which well-established behavioural patterns have typically achieved some state of equilibrium and are, as a result, resistant to change’ (Noell & Gansle, 2009). The culture which holds this equilibrium together, and the environment in which it is situated, need careful attention before the strategy is drafted. Significant elements of this environment are discussed briefly here: the structure and culture of Catholic Education in South Australia; leadership and governance; learning capacity; and the benchmarking and accountability requirements that have become so prominent in education in recent years.

The CESA ECB strategy is not only being undertaken in a school setting; it is being undertaken in the context of a Catholic school, and so a key environmental consideration is the Catholic Church and its expectations regarding the identity and mission of its schools. Catholic schools exist only because they provide a means for the Church to educate students in a Catholic environment. A key feature of evaluation undertaken in a Catholic school is an understanding of the mission of the school and of what success looks like in the context of that mission. Section 2.2 considers some dynamics currently at play in the Church and their consequences for the ECB strategy being initiated.

2.1.  Organisational Structure, Leadership and Culture

As indicated above, schools are understood in this paper as being ‘complex behavioural ecologies’ which are held together in a state of equilibrium and are, as a result, resistant to change (Noell & Gansle, 2009). The ECB strategy needs to be cognisant of a variety of environmental factors and structures which provide the context for these behavioural ecologies and constrain and enable what is possible within them.

2.1.1.  Organisation’s Structure and Resources

Catholic Education in South Australia is a community of 102 schools spread across the State, most of which are governed by the bishops and some of which are governed by religious orders. The Catholic Education Office is a diocesan agency that provides a variety of services to the schools, and its staff are a potential resource for undertaking the ECB strategy at both the school and system levels. The strategy cannot, however, be mandated across the system: the various governing authorities play a legitimate role in determining what occurs in their schools, and it is not possible for any one authority to unilaterally implement a strategy across every school. One of the challenges of designing and executing an ECB strategy across the State is to determine the level of compulsion that is appropriate and possible given the governing structures that operate across the schools. Ascertaining the authorities that need to approve or sponsor aspects of an ECB strategy can be a challenging task (Noell & Gansle, 2008).

As has been mentioned, consultants who work in the Catholic Education Office are a potential resource for promoting the ECB strategy within schools, but there is a limit to what they can achieve given that they are neither professional evaluators nor members of staff contributing to professional practice in an ongoing way at any given school. Consultants have developed expertise in a variety of areas, but not necessarily as evaluators, and so would need to be equipped for that role if they were to take it up. Professional evaluators will play an important role in building the capacity of system and school leaders during the CESA ECB Strategy; some options are considered in Section 3.1 below.

Sharp (2005) has identified a number of levels of organisational evaluation maturity. In some dimensions of its operation, Catholic Education in South Australia is identified in this paper as being at the lowest of these levels, while other dimensions are characterised as being at higher levels. At the lowest level, evaluation is undertaken in an uncoordinated, ad hoc fashion, often superficially, to comply with measures imposed by those to whom the organisation is accountable. At the second level, the approach to evaluation is more coordinated and wide-scale, but the motivation is still compliance rather than a commitment to continuous improvement. At the third level, evaluation is integrated into all aspects of the organisation’s strategy and management. A more sophisticated and systematic approach is evident at the fourth level, where an evidence base is understood to be an integral and essential element of program and policy development. By the fifth and final level of the hierarchy, evaluation has become an essential element of the organisation’s culture and there is a genuine commitment to continuous self-improvement on the basis of evaluation findings.