Kevin Yao
Niki Walters
Jamos McAlester
Kanupriya Hehir
December 2017
Further information
This evaluation was conducted by Kevin Yao, Niki Walters, Jamos McAlester and Kanupriya Hehir.
For further information on this evaluation report please contact:
Dr Katherine Barnes
Evaluation Unit
Department of Industry, Innovation and Science
GPO Box 2013
Canberra ACT 2601
Phone: +61 2 6102 8901
Email:
For more information, please visit the Department’s website at: www.industry.gov.au/OCE
Acknowledgements
We would like to thank all the stakeholders, internal and external, and survey participants who contributed to this evaluation. We would also like to thank Adna Aliskovic, Claire Welsh, Dick Sims and Mathew Horne for their involvement in the interviews.
Disclaimer
The views expressed in this report are those of the author(s) and do not necessarily reflect those of the Australian Government or the Department of Industry, Innovation and Science.
Copyright
© Commonwealth of Australia 2017
This work is copyright. Apart from any use as permitted under the Copyright Act 1968, no part may be reproduced or altered by any process without prior written permission from the Australian Government. Requests and inquiries concerning reproduction and rights should be addressed to
For more information on Office of the Chief Economist research papers please access the Department’s website at: www.industry.gov.au/OCE
Creative Commons Licence
With the exception of the Coat of Arms, this publication is licensed under a Creative Commons Attribution 3.0 Australia Licence.
The Creative Commons Attribution 3.0 Australia Licence is a standard form licence agreement that allows you to copy, distribute, transmit and adapt this publication, provided that you attribute the work. A summary of the licence terms is available from http://creativecommons.org/licenses/by/3.0/au/deed.en. The full licence terms are available from http://creativecommons.org/licenses/by/3.0/au/legalcode
The Commonwealth’s preference is that you attribute this publication (and any material sourced from it) using the following wording:
Source: Licensed from the Commonwealth of Australia under a Creative Commons Attribution 3.0 Australia Licence. The Commonwealth of Australia does not necessarily endorse the content of this publication.
Contents
Abbreviations and acronyms
Introduction
Evaluation Strategy 2015-19
Authority to evaluate
Evaluation oversight
Evaluation scope
Methodology
Stakeholder interviews
Departmental survey
Desktop research
Discussion of findings and recommendations
To what extent are evaluation activities integrated within the department?
To what extent are the evaluative efforts of the department fit for purpose?
To what extent are evaluations evidence-based?
To what extent are evaluations scheduled to provide timely evidence for key decisions?
To what extent are the results of evaluations accessible and widely communicated within the department?
To what extent are evaluations independent from program delivery and policy areas?
Revising the governance of evaluations
Relaunching the Evaluation Strategy
Conclusion
Appendices
Abbreviations and acronyms
Abbreviation or acronym / Definition
APS / Australian Public Service
BLADE / Business Longitudinal Analysis Data Environment
department / Department of Industry, Innovation and Science
EB / Executive Board
EL / Executive level
EU / Evaluation Unit
IEB / Insights and Evaluation Branch
KPI / Key Performance Indicator
NPP / New Policy Proposal
OCE / Office of the Chief Economist
PAC / Program Assurance Committee
PGPA / Public Governance, Performance and Accountability
PM&C / Department of the Prime Minister and Cabinet
PSD / Program Summary Database
RG / Reference Group
RMS / Research Management System
SC / Steering Committee
SEM / Standard Error of the Mean
SES / Senior Executive Service
Unit / Evaluation Unit
Introduction
Evaluation Strategy 2015-19
The Department of Industry, Innovation and Science (the department) established its Evaluation Strategy 2015-19 (the Strategy) in response to the planning, performance and reporting requirements of the Public Governance, Performance and Accountability Act 2013 (PGPA Act). The Strategy provides a framework to guide the consistent, robust and transparent evaluation and performance measurement of programs and policies. It encourages a culture of evaluative thinking, intended to lead to better resource allocation and decision-making, and the evolution of programs.
Authority to evaluate
The Strategy called for a review 12 months post-commencement to determine whether it was meeting the needs of the department and assess the department’s level of evaluation maturity.[1] This review was conducted by the department’s Evaluation Unit.
Key findings:
¡ Overall, the department’s level of evaluation maturity is now between ‘developing’ and ‘embedded’.
¡ The evaluation approval process needs streamlining to ensure evaluations are completed faster, so that evidence is available when key decisions are made.
¡ Governance roles and responsibilities require clarification. Sign-off of reports by the General Manager, Insights and Evaluation Branch or the Chief Economist will improve independence.
¡ Relaunching a revised Evaluation Strategy will provide a timely opportunity to emphasise the importance of evaluative thinking across the department.
¡ The Evaluation Unit needs to explore channels to more broadly disseminate evaluation findings and recommendations.
¡ The Evaluation Unit needs to review and streamline internal templates and processes.
Evaluation oversight
Oversight of the evaluation was provided by a Steering Committee, whose members are listed below (Table 1).
Table 1: List of the Evaluation Strategy Post-Commencement Review Steering Committee members
Name / Role
Trevor Power, Head of Division, Industry Growth / Chair
Chris Butler, Head of Division, AusIndustry – Business Services / Member
David Turvey, General Manager, Insights and Evaluation Branch / Member
Evaluation scope
This evaluation examined the initial implementation of the Strategy to assess the extent to which the department has developed a culture of evaluative thinking, incorporated evaluation practices into its activities, and advanced its evaluation capabilities. This review utilises departmental input to determine current evaluation maturity levels.
The evaluation was guided by six core principles (Appendix A) established in the Strategy:
1. To what extent are evaluation activities integrated within the department? Is evaluation embedded in core business activities?
2. To what extent are the evaluative efforts of the department fit for purpose? Are evaluation activities proportional to the value, risk and impact of programs?
3. To what extent are evaluations evidence-based? How robust are research and analytical methods used to assess inputs and outcomes?
4. To what extent are evaluations scheduled to provide timely evidence for key decisions?
5. To what extent are the results of evaluations accessible and widely communicated within the department?
6. To what extent are evaluations independent from program delivery and policy areas? How is the independence of the Evaluation Unit currently achieved?
Evaluation sub-questions were based on the Evaluation Maturity Matrix at Appendix A.
Methodology
Stakeholder interviews
A total of 34 semi-structured interviews were conducted as part of this review. Interviewees included Deputy Secretaries, Heads of Division, General Managers, Managers, departmental staff, representatives from Commonwealth Agencies, external consultants and members of the Evaluation Unit. Where possible, interviews were recorded and confirmed by a second evaluator. Interview transcripts were analysed thematically, based on the department’s evaluation principles.
Departmental survey
All departmental employees were invited to participate in an online survey in early 2017. A total of 110 responses were received, a response rate of approximately four per cent, which is sufficient to provide a broad indication of trends. Responses were categorised by:
¡ engagement with the Evaluation Unit’s services
¡ staff classification (APS level, or Executive Level and above)
¡ whether respondents were from policy, program delivery or corporate areas.
Desktop research
Desktop research covered internal and external evaluation documentation including:
¡ Executive Board briefings and minutes
¡ evaluation guidance material and planning documents
¡ evaluation capacity building literature.
Discussion of findings and recommendations
The Strategy identified the department’s evaluation maturity as being at the second stage of a four-stage process from ‘beginning’ through ‘developing’, ‘embedded’ and ‘leading’. This review identifies that the department is now at a level between ‘developing’ and ‘embedded’ (Figure 1). While we have not achieved ‘embedded’ status overall, we meet one of the two criteria for ‘leading’ for the principle of ‘timeliness’, in that the department’s approach to evaluation and performance planning is seen as an exemplar. We are also close to, or at ‘leading’ in being ‘fit for purpose’ and ‘evidence-based’. In terms of ‘integration’ and ‘transparency’, however, we are still at the ‘developing’ stage.
Figure 1: Graphic depiction of the department’s overall evaluation maturity as progressing from ‘developing’ into ‘embedded’
Source: Department of Industry, Innovation and Science (2017)
The following section discusses the findings of this review and provides recommendations, organised around the six evaluation principles outlined in this evaluation’s scope. This is followed by recommendations for changes to the governance of evaluations and proposed revisions to the Evaluation Strategy. The evidence base for this review can be found in Appendices C (analysis of interview evidence) and D (analysis of survey evidence).
To what extent are evaluation activities integrated within the department?
The department’s evaluation processes are being promoted externally as best practice across the Commonwealth.[2] Several departments are implementing processes and policies based on, or similar to, ours, pointing to a leadership role in the Australian Public Service (APS). It must be acknowledged, however, that other Commonwealth agencies are similarly developing their evaluation capabilities.
Appreciation for evaluation across the department is improving, with staff widely reporting that evaluations are an important part of comprehensive program delivery and policy work. As yet, this perception has not extended to a general view of evaluation as central to core business activities and staff do not yet believe it is a part of their daily duties. As a result, evaluation activities are reportedly often under-resourced and given lower priority compared to tasks viewed as ‘core business’.
Despite this increased appreciation for evaluation, responses from interviews and the survey highlighted that most departmental employees are unaware of the Evaluation Unit. Those who have engaged with the Unit, however, are more likely to report improvement in evaluation skills within the last 18 months, and to be aware of the benefits of evaluation and the role of the Unit. Evaluation Unit staff are encouraged to hold formal qualifications: at the time of this review, one member held a Master of Evaluation and three were studying for the qualification. The expertise of the Unit is being effectively leveraged in conducting Evaluation Ready activities, providing advice on evaluation issues, and conducting evaluations.
In order to effect any real change in organisational evaluation capacity, the purpose of evaluation should be meaningfully integrated into organisational policies and procedures.[3] Thus it is of concern that very few executive-level survey respondents reported using evaluation reports to improve policies, services or programs, indicating that perhaps evaluation is not seen as a tool to inform program and policy design. Approximately 40 per cent of surveyed executives from program delivery and policy areas reported that evaluations are not used to secure additional funding or assess progress.
Recommendations for the Evaluation Unit
1. The Evaluation Unit should investigate ways to further embed evaluations in policy development, particularly before drafting new policy proposals (NPPs) or shortly after. The Evaluation Unit should provide guidance material for the Budget Policy Section to distribute to NPP drafters and continue to provide input as part of the Portfolio Budget Statement key performance indicators (KPIs).
2. The Evaluation Unit should communicate the Evaluation Strategy Post-Commencement Review findings and recommendations across the department when relaunching the revised Evaluation Strategy. This could take the form of an all-staff presentation demonstrating the impact of the stakeholder consultations on evaluation processes within the department.
To what extent are the evaluative efforts of the department fit for purpose?
The triaging of programs according to the tiering system and criteria set out in the Strategy is ensuring that evaluation activity is proportional to tiering level, with priority programs being evaluated according to the Evaluation Plan 2016-20. The tiering method was generally reported as an effective way of prioritising evaluation activities based on need. There is, however, uncertainty about the process of changing a program’s tier and the timing of evaluations.
Evaluations conducted by the Evaluation Unit and, more broadly, the Office of the Chief Economist (OCE) are viewed as using robust methodology and are considered fit for purpose to assess impact and outcomes. Program logics are seen favourably across the department and are actively championed by the Senior Executive. However, the later stages of the Evaluation Ready process, such as the development of a data matrix, are not as well understood within policy and program delivery areas. Stakeholders felt that general capability building within the Evaluation Unit for developing data matrices (including evaluation questions, indicators, KPIs and data source identification) was necessary.
Recommendations for the Evaluation Unit
3. The requirements and expectations of Evaluation Ready activities and evaluations should be further clarified and communicated during engagement with policy and program delivery areas. This includes building capability within the Evaluation Unit and streamlining templates and processes.
4. The steps to change a program’s evaluation timing and tier should be outlined in the revised Evaluation Strategy and on the Evaluate intranet page. The Assurance and Audit Committee should be provided with the Evaluation Plan and notified of subsequent revisions.
To what extent are evaluations evidence-based?
Staff across the department cited difficulties in collecting suitable data for meaningful analysis. In previous evaluations, this has presented a significant hindrance to assessing outcomes and impact. Widespread improvement in data collection and holdings requires a cultural shift that is expected to take some time to become evident. However, planning data collection at the beginning of program development in collaboration with the Data Management and Analytics Taskforce (DatMAT) is moving the department towards improved data collection and holdings.