Transforming Quality: 7th Annual Quality in Higher Education Seminar

Transforming the transformers: the impact of external quality audit on universities and ADUs.

Dr Calvin Smith and Dr Debra Herbert
The Teaching and Educational Development Institute, The University of Queensland.

Outline

History of University of Queensland responses to quality assessment

The response of the University of Queensland (UQ) to the advent of external quality audit is linked to the well-documented problems surrounding the use of the Course Experience Questionnaire (CEQ) across Australian universities. The early days of quality measurement in Australian higher education involved a government imperative based on nation-wide data gathering using the CEQ (Ramsden, 1991). This instrument became a central performance indicator of the quality of courses in universities, and its use led to the production of poorly defensible league tables that ranked university against university on a programme-by-programme basis. This procedure has attracted considerable dissent and consternation (Long & Johnson, 1997; Curtis, 1999).

From our own perspective, there were two main issues in the use of the CEQ as a performance indicator that we have addressed differently at UQ:

  • lack of curriculum specificity and time lag – solution being the continuous curriculum review model (see Smith, Herbert, Robinson, & Watt, 2001).
  • too narrow a focus on classroom experiences to represent the whole experience of students at a university – solution being the development of a comprehensive student experience survey (see Terry et al., 1999).

In recent times the government has responded to these same issues by changing the way it uses the CEQ and by improving the quality of the measurement it represents. Its response has been to fund research to enhance the instrument and broaden the scope of what it inquires into (see McInnis, Griffin, James, & Coates, 2001; McInnis, James, & Hartley, 2000), and to allow a small degree of customisation in the scales universities can choose to have applied to their graduates.

Our own internal response has developed into a comprehensive quality-monitoring approach with the advent of a substantial student survey, the UQ Student Experience Survey (UQSES), covering the quality of courses, teaching, and facilities and resources (physical, social and learning-supportive), the achievement of generic skills and attributes, and some benchmarking scales from the extensions to the CEQ (mentioned above). The Academic Development Unit (ADU) at UQ has played a central role in the development, administration and reporting of this survey.

External Quality Audit in Australia

Our response is consistent with the system of quality audit that has been adopted in Australia (Woodhouse, 2001), in that we have external quality audit, not external quality assessment. It is the university’s quality assessment and enhancement procedures that the external agency will audit, not the quality of the teaching and learning itself. Consequently, our response focuses on the quality of our quality assessment and enhancement procedures. Some of these in turn focus on the quality of the student experience, on student outcomes, or on the quality of teaching and learning. The aspect of this overall process that is undergoing development is the transformative aspect: the component of the process in which we respond to the data we gather on quality. It is in this part of the process that the university’s ADU begins to play a central role.

Role of the Academic Development Unit in transforming quality

So far, the University of Queensland’s ADU has been involved in the processes related to measuring quality. It has had significant input into the design of measurement devices and procedures for the collection of information about quality, at least in terms of student experiences, some student outcomes and the quality of teaching and learning (such as programme or curriculum review processes, student evaluation of teaching and courses, and the UQSES).

We have noted elsewhere (Smith & Herbert, 2001) that external quality audit will impact both directly and indirectly on the role and functions of ADUs in universities. The direct impact is the obvious one: through the mechanism of external quality audit, the university will have to respond to quality issues. Sometimes this will be done through the ADU; that is, the ADU will be involved in implementing measures designed to improve the quality of teaching and learning.

If, when auditors ask a university what its quality enhancement procedures are, part of the answer is that it has an ADU whose role is to improve the quality of teaching and learning, then the auditors may well ask the ADU to demonstrate that it achieves these ends. This is a fairly dramatic turn of events for ADUs: never before has there been the need to demonstrate so explicitly the positive impact an ADU has on curriculum development, on the quality of teaching, or on students’ experience of studying at the university.

Academic Development Units themselves will have to be effective and thoroughgoing self-evaluators of the quality of their own work, if that work is to form part of the university’s defence to auditors that it has quality enhancement procedures in place. That is, ADUs will have to become increasingly involved in measuring or monitoring the impact of their work on the quality of the curriculum and the student experience. This will require the ADU to gaze upon itself, a quite new pressure that involves conceptualising and measuring the various quality-enhancing strategies in which ADUs engage. We discuss this new development further in the paper.

References

Curtis, D., 1999, ‘The course experience questionnaire as an institutional performance indicator’, paper presented at the Cornerstones: The 1999 International HERDSA Conference, Melbourne, Australia.

Long, M., & Johnson, T., 1997, Influences on the Course Experience Questionnaire Scales (Canberra, Department of Employment, Education, Training and Youth Affairs).

McInnis, C., Griffin, P., James, R., & Coates, H., 2001, Development of the Course Experience Questionnaire (CEQ), (Canberra, Department of Employment, Education, Training and Youth Affairs).

McInnis, C., James, R., & Hartley, R., 2000, Trends in the First-Year Experience, (Canberra, Department of Employment, Education, Training and Youth Affairs).

Ramsden, P., 1991, ‘A performance indicator of teaching quality in higher education: The course experience questionnaire’, Studies in Higher Education, 16, 129–50.

Smith, C. D., & Herbert, D., 2001, ‘Creating a quality culture through partnerships: dancing with wolves or throwing a cat among the pigeons?’, paper presented at the Learning Partnerships: 24th International HERDSA Conference, Newcastle.

Smith, C. D., Herbert, D., Robinson, W., & Watt, K., 2001, ‘Quality assurance through a continuous curriculum review (CCR) strategy: reflections on a pilot project’, Assessment and Evaluation in Higher Education, 26(5), 489–502.

Terry, D., Lipp, O., Herbert, D., Smith, C. D., Martin, D., Alexander, H., & Carrick, P., 1999, ‘Investigating alternate performance indicators to assess teaching quality’, in S. Powell & M. Siksna (Eds.), A Year of Reflection: The 1999 Action Learning Program, pp. 60–68, (Brisbane: The Teaching and Educational Development Institute, The University of Queensland).

Woodhouse, D., 2001, Australian Universities Quality Agency: Audit Manual (Canberra, AUQA).