Developing a stakeholder approach to the evaluation of doctoral training

UFHRD Research Honorarium – end of project report

Doctoral students are vital for developing research capability in universities. The movement of trained researchers into the labour market is also important for the development of innovative responses to global competitive challenges (RCUK, 2006; Sambrook and Stewart, 2008; Etzkowitz and Leydesdorff, 1997; Thune, 2010; Malfroy, 2011). In recent years awareness of the importance of doctoral students has focused attention on the training and support they receive, but frameworks for the evaluation of doctoral education remain under-developed. The project objectives were:

  1. To critically assess the current approach to evaluation of doctoral education in the UK
  2. To propose a revised framework
  3. To implement the proposed framework in one HEI and explore the challenges involved and the consequences for stakeholder engagement with the process.

This is a scholar-practitioner report which illuminates the interactivity of the scholarly-practice process. The author fulfilled a range of roles in carrying out this project, including course designer, tutor, scholar, evaluator, researcher and manager (Anderson, 2012b). Undertaking this scholarly practitioner project challenged many prevailing assumptions about the nature of a rational-linear relationship between knowledge construction, dissemination and the utilisation of new knowledge in practice (Nutley et al., 2007; Glasziou & Haynes, 2005; Starkey & Madan, 2001).

When the honorarium was awarded some progress had been made with objectives one and two (Anderson and Gilmore, 2012). This report briefly addresses the first two objectives to provide a background and context before considering in more detail the progress against the third objective.

  1. Assessment of the current approach to evaluation of doctoral education in the UK

The UK ‘standards’ for doctoral education are set out in the Researcher Development Framework (RDF), developed by Vitae, an organisation supported by the UK Research Councils to promote and develop researcher development processes. The RDF articulates extensive lists of knowledge, behaviours and attributes associated with research which doctoral students are expected to develop (Vitae, 2012a). Vitae has also developed an evaluation approach, the ‘Impact Framework’ (IF) (Vitae, 2012b), which is derived from the ‘levels of evaluation’ models devised by Kirkpatrick and Kirkpatrick (2006) and Kearns (2005). In our review of this approach we identified five conceptual and practical problems with the current IF:

a) The ‘one size fits all’ approach is unwieldy and time-consuming for evaluators. Measurement at the ‘higher levels’ of the framework is problematic (Guerci and Vinante, 2011; Nickols, 2005) and fails to highlight the factors underpinning the success (or failure) of aspects of doctoral education provision.

b) The assumption of a direct attribution of cause and effect is conceptually problematic (Sugrue et al., 2005; Deem and Brehony, 2005). Research activities and outcomes result from the collective effect of a range of interrelated factors, including the research environment, quality of supervision, ability, and motivational factors.

c) The ‘backwards’ time orientation, in which evaluators set out to ‘prove’ outcomes and value (Russ-Eft and Preskill, 2005), overlooks opportunities for forward-looking assessments that ‘improve’ learning and adapt provision to engender and support the development of professional, reflective, academic practice by doctoral students.

d) The focus on formal taught programmes overlooks the flexible, ‘informal’ and non-classroom-based learning and development processes of doctoral students, which are grounded in regular supervision and supplemented through activities such as mentoring, work placements and informal learning opportunities (Boud and Lee, 2005; Malfroy, 2011).

e) The separation between the ‘evaluator’ and the ‘subject’ of the evaluation fosters an unwelcome distance between those involved in measurement and evaluation and other stakeholders involved in doctoral learning (Jayanti, 2011; Edelenbos and van Buuren, 2005; Guerci and Vinante, 2011).

  2. Proposed revised framework

In place of the linear and hierarchical model of evaluation underpinning the Kirkpatrick and Kirkpatrick (2006) approach, an alternative model was proposed. The proposed model set out to be more inclusive, multi-dimensional and pluralistic, grounded in different stakeholder perspectives (Donaldson and Preston, 1995; Freeman, 1984; Nickols, 2005), and recognised that learning and evaluation are emergent, resulting from dynamic interaction between different interested parties or stakeholders. The framework developed at the start of this project is illustrated in Figure 1 and is grounded in four principles:

a) A recognition that student learning, at doctoral level as at any other, is an unpredictable process that may lead to unintended, often tacit, outcomes which may be at least as valuable as explicit expectations of behaviours, skills or knowledge.

b) A focus on longitudinal, aggregated institutional data in place of the ‘event by event’ evaluation approach of the IF.

c) Use of existing institutional data sources where possible, to avoid the time-consuming issues associated with the IF.

d) Analytical integration of qualitative information with quantitative ‘metrics’.

Figure 1: Proposed evaluation framework

  3. Implementation of the framework – challenges identified

The pilot process to implement this framework in one HEI, undertaken with the support of the UFHRD honorarium, highlighted a number of important issues.

3.1 Time and data constraints

Implementing the framework revealed that the HEI piloting it (in common with most HEIs in the UK) holds extensive data sets related to the student experience, but very little data relevant to other stakeholder perspectives. In piloting this approach it therefore became necessary to generate ‘new’ data in an appropriate form to underpin the establishment of efficiency and effectiveness measures as well as indicators of stakeholder engagement. Although a streamlined approach to data collection was a fundamental principle of the proposed new approach (principle ‘c’ above), obtaining the range of data types required proved very time-consuming. This is demonstrated by the month-by-month schedule of activity initially identified (see Table 1), which highlights the requirement for continuous activity and an acceptance of the perpetually provisional nature of the analytic outcomes that the proposed framework would generate.

Month / Framework ref / Collate and analyse / Resource requirement
April / Stakeholder / PRES data / 5 days
April / E & E / Workshop feedback forms / Half a day
May / Stakeholder / Supervisor survey / 5-10 days
June / Strategic contribution / Faculties consultation / 5 x Faculty meetings + admin time to arrange
June / Stakeholder / Tutor consultation / One meeting + arrangements = 2 days
July / E & E / Workshop feedback forms / Half a day
August / Strategic contribution / Review UoP Research strategy KPIs and GSDP implications / 1 day (but subsequent measurement implications)
September / E & E / QMD data on throughput, days per student, etc. / QMD – year 1: establish base data for year-on-year comparison and establish reporting system
September / E & E and Strategic comparison / Establish benchmarking club and measures / Vitae conference and follow-up: 2 days for conference + 5 days for follow-up
October / E & E / Major Review progression and reports / Half a day
November / E & E / ASQRs / Half a day
December / Stakeholder / Tutor consultation / One meeting + arrangements = 2 days
January / E & E / Workshop feedback forms / Half a day
March / E & E / Major Review progression and reports / Half a day

Table 1: Data collection and analysis schedule
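To give a rough sense of the cumulative effort implied by Table 1, the short sketch below (illustrative only, not part of the pilot's tooling) totals the explicitly quantified time estimates. Entries with no numeric estimate (the faculty consultations and the QMD reporting set-up) are omitted, and the supervisor survey is counted at the lower bound of its 5-10 day range, so the true annual commitment is higher than the figure produced.

```python
# Illustrative sketch only: a rough tally of the explicitly quantified time
# estimates from Table 1. Non-numeric entries are omitted and the supervisor
# survey uses the lower bound of its 5-10 day range, so this is a minimum.

schedule_days = {
    "PRES data analysis (April)": 5.0,
    "Workshop feedback forms (April, July, January)": 3 * 0.5,
    "Supervisor survey (May, lower bound)": 5.0,
    "Tutor consultations (June, December)": 2 * 2.0,
    "Research strategy KPI and GSDP review (August)": 1.0,
    "Benchmarking club and Vitae conference (September)": 2.0 + 5.0,
    "Major Review progression and reports (October, March)": 2 * 0.5,
    "ASQRs (November)": 0.5,
}

total_days = sum(schedule_days.values())
print(f"Quantified evaluator time across the year: at least {total_days:.1f} days")
```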

3.2 Stakeholder engagement

Nutley et al. (2007) highlight how different stakeholder groups have very different expectations of measures and metrics, and this proved to be the case for this project. Piloting the approach indicated how the motives of key individuals within the pilot site depended on their background and position. Powers (1997) argues that decision makers often ritualise the use of numerical indicators as a way of making complex issues and processes look manageable. This pilot study indicated that some stakeholders perceived evaluation measures as a basis to support a case for strategic change; others sought to use existing institutional metrics as ‘discrete, unambiguous facts’ to support the status quo. The project illuminated the multiple and often ambiguous purposes of evaluation when understood from the perspectives of different stakeholders, who were variously concerned to ‘prove’ that all is going well, demonstrate institutional ‘value’, ‘improve’ student satisfaction or ‘enhance’ service levels.

3.3 Dissemination and communication of evaluation outcomes

The project also highlighted how easy it is for evaluators to become so absorbed in, and ‘busy’ with, the data-gathering process that issues associated with dissemination are overlooked. The challenges of communicating the outcomes of evaluation processes in relevant and timely ways that would be most meaningful to different stakeholder groups receive scant attention in the evaluation literature. This raises interesting questions about the ‘ownership’ of evaluation. Traditional approaches to evaluation focus on the ‘mental models’ of trainers, providing data that are more meaningful to this group than to other stakeholders (Anderson, 2007). The ‘blurred’ distinction between development processes and evaluation on which this revised framework was predicated presented challenges of dissemination, both in terms of form and timing.

  4. Implementation of the framework – achievements

This section of the report reflects on the outcomes of the implementation process in each of the three elements of the piloted approach (illustrated in Figure 1).

4.1 Stakeholder engagement

The implementation of this evaluation approach enabled productive, ‘two-way’ engagement to be established with a range of stakeholders in relation to the development of doctoral education provision. As part of the implementation process, six-monthly progress review meetings were organised that enabled over fifty academic and other tutors involved in delivering doctoral education to meet and discuss issues and opportunities to enhance provision, both in detail (in terms of individual components of the doctoral education provision) and in general. Newly registering research students were also encouraged to become active and vocal in providing feedback on the development of the programme and processes. Their engagement, in turn, led to progress with the engagement of supervisors in personal development planning processes and in supporting students’ continuing development, something that had been difficult to achieve in the past. However, this remains one of the greatest challenges, and engagement data (see Table 2) indicate that students and supervisors who were registered before the project commenced remain less engaged.

Category of user / June 2012 (n) / June 2012 (% of category) / End May 2013 (n) / End May 2013 (% of category)
FT PhD Students / 170 / 55 / 274 / 64
Prof Doc Students / 16 / 9 / 60 / 28
PT PhD students / 98 / 39 / 145 / 46
M Res students / 1 / 92
MSc students / 5 / 100
First Supervisor / 55 / 19 / 109 / 35
Other Supervisors / 51 / 13 / 111 / 25

Table 2: Uptake of Personal Development Planning and other Doctoral Education processes using the Graduate School system

(Research students n = approx 700; Supervisors n = approx 725)
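For clarity about how the ‘% of category’ figures in Table 2 are derived, the minimal sketch below divides the number of users of the Graduate School system in each category by the size of that category. The denominators shown are back-calculated from the reported counts and percentages purely for illustration; they are not verified institutional figures.

```python
# Minimal illustration of the '% of category' calculation behind Table 2.
# Category totals are inferred for illustration only, not institutional data.

uptake_end_may_2013 = {   # users of the Graduate School system (Table 2, n)
    "FT PhD Students": 274,
    "PT PhD students": 145,
    "First Supervisor": 109,
}

category_totals = {       # assumed denominators, back-calculated for this example
    "FT PhD Students": 428,
    "PT PhD students": 315,
    "First Supervisor": 311,
}

for category, users in uptake_end_may_2013.items():
    share = 100 * users / category_totals[category]
    print(f"{category}: {users} users, approximately {share:.0f}% of category")
```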

Figure 2 also indicates the differing levels of engagement of students from within each of the five faculties of the University with the face-to-face workshop opportunities that form part of the doctoral education provision.

Figure 2: Proportion of research students attending doctoral education workshops

(n = 700)

In addition, whilst productive engagement by senior institutional figures with doctoral education has emerged as a result of this project, the involvement of many of those at middle management level within faculties has remained more ‘passive’ than ‘active’.

4.2 Efficiency and Effectiveness Measures

In developing this feature of the evaluation process, an ‘away-day’ was organised with stakeholders at faculty and institutional levels in order to identify and develop a range of performance indicators that would appropriately assess the effectiveness of doctoral education. The work to develop these indicators highlighted a lack of data in key areas, for example aggregated data relating to session satisfaction; the other outcome indicators are shown in Table 3.

Key Performance Indicator / 2011-12 / 2012-13 / Target
Graduate School Induction attendance (% of total) / 70% / 78% / > 80%
PGRs registered on Skills Forge (% of total) / 48% / 68% / > 80%
PGR students attending sessions (% of total) / 32% / 58% / > 50%
Session satisfaction rating / - / - / > 3.75
No. of academic departments contributing / 15 / 17 / 28

Table 3: Performance Indicators developed by stakeholders in May 2012
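As a simple illustration of how these indicators could be monitored year on year, the sketch below compares the 2012-13 values from Table 3 with the agreed targets. Treating each target as a minimum threshold is an assumption made for this sketch rather than part of the stakeholder agreement.

```python
# Illustrative check of 2012-13 performance against the Table 3 targets.
# Each target is treated as a minimum threshold (an assumption for this sketch).

kpis_2012_13 = [
    # (indicator, 2012-13 value, target)
    ("Graduate School Induction attendance (% of total)", 78, 80),
    ("PGRs registered on Skills Forge (% of total)", 68, 80),
    ("PGR students attending sessions (% of total)", 58, 50),
    ("Academic departments contributing (number)", 17, 28),
]

for indicator, value, target in kpis_2012_13:
    status = "target met" if value >= target else "below target"
    print(f"{indicator}: {value} against a target of {target} -> {status}")
```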

In particular, this project has highlighted the need for data relating to the following areas:

  • Student satisfaction at the end of the research degree (measured at graduation)
  • Supervisor satisfaction with Graduate School support and development
  • Funder, employer and sponsor satisfaction
  • External examiner comments from viva and assessment events
  • Student career path data

4.3 Strategic contribution measures

When this project commenced it was noticeable that aspirations for research at the strategic level were implicit and tacit, and it was not possible to find clear expression of them in institutional or faculty strategic documents. However, over the duration of this project, senior managers set out to develop an institutional research strategy, and three of the University research strategy objectives that were ultimately formulated have direct relevance to this project:

  • To recruit, develop and retain high-quality postgraduate research degree students and to provide them with a stimulating and supportive research environment and development opportunities.
  • To encourage all staff and research degree students to reach their full research potential and to support their career development.
  • To foster the development of research skills in students and to engage them in research, where appropriate, through taught programmes.

A key learning point from this project is that metrics and measures can only partially assess the achievement of objectives such as these. Therefore, in order to enable an assessment of contribution to these objectives, a strategic review group was formed to assess the contribution of doctoral education processes to the research strategy and to identify ways to enhance this contribution. In accordance with the stakeholder approach, the group comprised members of academic staff, research students, tutors and representatives of each faculty, as well as other important service providers within the institution, such as academic development, the library, and careers and employability. The conclusions of this group were that:

  • Dissemination - Evaluation data highlighting the robust quality of doctoral education and development at this HEI do not adequately feature in marketing communication strategies, whether directed internally or externally, with negative consequences for strategic objectives relating to research degree student recruitment.
  • Consistency - Although evaluation data suggest that doctoral education at the HEI makes a robust contribution to the research environment of the university, the data indicate that this contribution is unevenly distributed across different modes of study and departments; part-time and ‘mid-programme’ students are less involved than other student groups.
  • Stakeholder engagement - Effective doctoral education relies on the active participation of a range of stakeholders, and uneven patterns of involvement are evident in this HEI, leading to inconsistencies in the research environment.

  5. Conclusion

The concluding section to this report relates to the final project objective and examines the challenges involved in implementing the proposed framework.

First, although a pluralistic and multi-stakeholder perspective on evaluation is conceptually desirable, this project has indicated the multiple and often ambiguous perspectives that it can generate. Second, the project demonstrates the importance of environmental context in shaping approaches to evaluation. In spite of its explicit constructivist intent, the proposed approach had to be adapted in response to the institutional imperatives of operational efficiency, throughput and student satisfaction. Third, although a critique of the IF approach is its time-consuming nature, the framework piloted here proved to be equally (possibly more) time-consuming, requiring the commitment and ‘ownership’ of a wider variety of groups and the generation of new data to reflect a range of different ‘positions’.

This pilot process has illuminated the practical difficulties of moving beyond the hierarchical model of evaluation currently encapsulated by the Vitae Impact Framework. It highlights the complexities involved in trying to build and enact one inclusive, multi-dimensional and pluralistic model grounded in different stakeholder perspectives. In place of the single ‘one size fits all’ model to which this project aspired, it may be preferable to adopt an open systems approach, grounded in a critical realist perspective (Burgoyne, 2008), with a process-orientated approach to the development of causal explanation. Large- and small-scale empirically based evaluations are possible, making use of both qualitative and quantitative insights to better understand the initial conditions, sequences and combinations of processes that contribute to effective doctoral education and which, taken collectively, might explain variable outcomes in doctoral education (Syed et al., 2010).

  6. Dissemination

A number of conference papers and presentations have resulted from this project at various stages of its enactment. These include:

Anderson, V. (2012a) A research-led review of doctoral education in a UK Business School, presented to ‘Innovation in challenging times’: Association of Business Schools Learning and Teaching Conference, 24-25 April 2012, Manchester, UK.

Anderson, V. (2012b) Falling between the cracks: A scholarly-practitioner approach to the evaluation of doctoral education, presented to the 13th International Conference on HRD Research and Practice across Europe, 23-25 May 2012, Universidade Lusiada de Vila Nova de Famalicao.

Anderson, V. and Gilmore, S. (2012) Evaluating doctoral education: a conceptual paper, presented to 11th European Conference of Research Methods, 28-29 June 2012, University of Bolton, UK.

In addition, a journal article is currently being finalised which, it is hoped, will be of interest to one of the following journals: Studies in Higher Education, Management Learning or Higher Education Quarterly.

References

Anderson, V. (2007) The value of learning: from return on investment to return on expectation. London: Chartered Institute of Personnel and Development.

Anderson, V. (2012a) A research-led review of doctoral education in a UK Business School, presented to ‘Innovation in challenging times’: Association of Business Schools Learning and Teaching Conference, 24-25 April 2012, Manchester, UK.

Anderson, V. (2012b) Falling between the cracks: A scholarly-practitioner approach to the evaluation of doctoral education, presented to the 13th International Conference on HRD Research and Practice across Europe, 23-25 May 2012, Universidade Lusiada de Vila Nova de Famalicao.