Talking about Quality: a case study of courses that have enhanced dialogue with students about course quality.

Andrew Lewis

Department of Continuing Education, City University.

Paper presented at Higher Education Close Up, an international conference held at the University of Central Lancashire, Preston, 6-8 July 1998. The conference was jointly hosted by the Department of Educational Research, Lancaster University, and the Department of Education Studies, University of Central Lancashire, and was supported by the Society for Research into Higher Education.

The Research

Feedback from students on course quality is now an established feature of the requirements for quality monitoring and quality enhancement in higher education in the UK (HEFCE, 1994; HEQC, 1993; QAA, 1998), and systematic surveys of student opinion, such as that developed at the University of Central England (Harvey et al, 1997), have become commonplace. These requirements can be seen as an expression of quality management springing from an understanding of higher education rooted in technical rationalism (Parker, 1997), and they have led most universities to be prescriptive about their arrangements for obtaining feedback from students.

Where feedback from students on course quality is to be used for quality enhancement, the shape and focus of the methods used to obtain that feedback will depend on the priorities of those designing the methods. The hypothesis tested by the research described here was that, where staff and students at course level are involved in designing the methods, a key influence on the way feedback on the student experience is obtained and structured will be the culture of their subject discipline, although other course characteristics, such as course level, mode of study, degree of vocational focus, size of cohort and course venue, as well as the university context, may also play a part.

The research project therefore investigated the variety of ways in which staff in higher education consult students about their experience of the quality of university provision within particular subject categories. The study, carried out over a period of two years between 1995 and 1997, focused on 100 courses in two universities. One of these had been a chartered university prior to the 1992 Further and Higher Education Act (a pre-1992 university, here called University A) and the other had been a polytechnic (a 1992 university, here called University B). A variety of disciplines were included in the sample in order to identify patterns in the way in which the methods of student consultation were designed and implemented. The course directors of all 100 courses in the study were interviewed, either by telephone or, where appropriate, in face-to-face sessions. Administrators and students were also interviewed, and as many examples of paper-based instruments as possible, drawn from courses within the sample, were analysed for comparative purposes. What emerged was a picture of diverse practice in which there were clear differences between academic disciplines in their responses to the formal requirements of the universities.

Describing and Classifying Courses in Two Universities

The sample of 100 courses was constructed on the basis of dimensions that the literature had suggested might be relevant to the selection and use of a feedback methodology. Various writers have noted that academic, professional and learning cultures appear to vary between subject disciplines. Becher (1984) defines culture for these purposes as a 'shared way of thinking and a collective way of behaving' and identifies distinct cultures within different academic disciplines (Becher, 1989). He also identifies a socialisation process in which not only academic staff but also students learn the system of values that has been adopted by the culture.

In interviews with 'arts' and 'science' students, Thomas (1990) found that the former emphasised freedom of opinion as an important characteristic of their subject, and that subjects in this category were able to tolerate dissent in a way that science subjects were not. While the sciences are often regarded as having a methodological integrity which separates them as disciplines, Pantin (1968) makes the further distinction between the restricted and the unrestricted sciences. The latter include, for example, biology, a subject in which inquiry may be pursued into any other science whatever.

Biglan (1973) attempted to go beyond the well-rehearsed distinction between the sciences and the arts by proposing a three-dimensional classification of subjects. In his investigation into the influence of disciplinary cultures, with particular reference to students' learning styles, Kolb (1981, 1984) was concerned to assess the fit between an individual student's learning style and a particular academic subject's sub-culture. Learning styles were grouped along the dimensions abstract/concrete and active/reflective, and this led to the identification of four prevalent types of learning style, each of which had particular strengths in the cultures of different subject areas. Experiential learning theory was thus used as a way of describing differences in the inquiry norms of academic disciplines, and to map disciplines onto a two-dimensional space, abstract/concrete and active/reflective, which showed great similarity to Biglan's classification on the hard/soft and pure/applied dimensions. There have, however, been criticisms of Kolb's Learning Styles Inventory. Newstead (1992), for example, reports a number of studies showing mixed results with the Inventory and found it relatively unreliable in a further experiment. The proposition that individuals can be associated with particular learning styles also ignores evidence that context can influence learning (Harvey and Knight, 1996).

There remains the problem of identifying precisely which features of feedback methods would mark them as associated with a particular discipline culture. Kolb's characterisation of individual learning styles and of the inquiry processes of discipline cultures is useful in this context. In his analysis, for disciplines in the first group, the science-based professions, the process of inquiry is basically analytical, involving the understanding of the whole through the identification of its component parts. The individual skills of the people Kolb describes as 'Convergers' are 'Abstract Conceptualisation' and 'Active Experimentation'. The focus is on things rather than people.

The second group of disciplines has quantitative model building as a typical inquiry method. People in this culture are termed 'Assimilators' and have 'Abstract Conceptualisation' and 'Reflective Observation' as their dominant learning abilities. There is less concern with the practical use of theories.

The third group includes the social professions where the approach is pragmatic and concerned with workability. Case studies are a feature of the method of inquiry. In individual terms people might be 'Divergers' with strong imaginative ability or 'Accommodators' who show flexibility in their handling of knowledge in relation to practical situations.

The fourth group, the social humanistic fields, strives for understanding through the comprehension of the totality of the object of study. The learning style tends to be that of 'Divergers'.

In more specific terms one would expect to find some concern with feedback from students on the way in which the course has tackled professional and field practice in discipline groups one and three. For discipline groups two and four there should be a concern with success in teaching the conceptual principles which underlie the subject. Furthermore, groups one and two fall within Biglan's original area of the physical sciences where Biglan, referring to Kuhn, expected to find

... the existence of paradigms that specify the appropriate problems for study and the appropriate methods to be used.

(Biglan, 1973:195)

We would expect to find some concern in the feedback mechanism with the teaching of these paradigms.

The courses in the sample were distributed among the four Biglan/Kolb dimensions as follows (Table 1):

Biglan/Kolb Category / University A (% of sample) / University B (% of sample)
1) Hard (Abstract)-Applied (Active) / 46 / 24
2) Hard (Abstract)-Pure (Reflective) / 12 / 8
3) Soft (Concrete)-Applied (Active) / 24 / 44
4) Soft (Concrete)-Pure (Reflective) / 18 / 24
Total / 100 / 100

Table 1 Distribution of courses across the Biglan/Kolb dimensions.

Note: Each of the categories of the sampling criteria used is shown above as a percentage of the total sample.

There is a need for caution in the analysis of data resulting from this sampling. The sample was constructed to represent the courses in the Biglan and Kolb dimensions in roughly the proportions in which they occurred in the two universities, although the distribution is skewed by the need to represent the other variables discussed below. Any comparison of practice between the two universities would have to take account of this. The Biglan/Kolb dimension Soft (Concrete)-Applied (Active) is the largest category overall, but it is more strongly represented at University B than at University A. There is therefore a danger, for example, that any aspect of feedback practice strongly represented at University B will appear disproportionately among courses in this Biglan/Kolb dimension, and vice versa.

Since criticisms of the Learning Styles Inventory cast doubt on the theoretical position that supports it, the current research used other classifications of subject areas alongside it. One of these was the broad Organic/Non-Organic dimension also proposed by Biglan, and courses in the sample were classified in terms of this (Table 2).

Biglan Dimension / University A (% of sample) / University B (% of sample)
A) Organic / 38 / 56
B) Non-Organic / 62 / 44
Total / 100 / 100

Table 2 Distribution of courses across the Biglan dimensions.

Courses were also classified into the 16 discipline categories used by the Higher Education Statistics Agency (HESA) (Table 3). It should be noted that agriculture and the physical sciences were not represented in the two universities in the study, and certain other disciplines were not strongly represented.

HESA Category / University A (% of sample) / University B (% of sample)
A Medicine and Dentistry / 0 / 0
B Subjects allied to medicine / 20 / 12
C Biological sciences / 0 / 8
D Agriculture and related subjects / 0 / 0
F Physical Sciences / 0 / 0
G Mathematical sciences and informatics / 26 / 6
H, J Engineering and technology / 14 / 18
K Architecture / 0 / 4
L, M Social Studies / 14 / 24
N Business and administrative studies / 14 / 8
P Mass communication and documentation / 4 / 2
Q, R, T Languages and related disciplines / 0 / 2
V Humanities / 0 / 4
W Creative arts / 6 / 2
X Education and leisure / 0 / 6
Y Combined or general other courses / 2 / 4
Total / 100 / 100

Table 3 Distribution of courses across the HESA categories.

Gathering Feedback from Students

Research into the influence of course-related factors on the choice of methods to obtain feedback from students was made possible because neither of the two universities in the study had opted to rely solely on a university-wide, centrally designed procedure. Both left the design of methods for obtaining feedback to faculties, and in many cases this responsibility was further delegated to departments or to the course team. The only central requirement in both universities was that the methods adopted should include student representation on the course committee and that some form of survey instrument should be used to gather students' views on course quality systematically. It is notable that in only one of the 100 courses in the sample were students actively consulted about the design of a questionnaire for surveying student opinion, though students did influence the evolution of face-to-face methods of obtaining feedback.

The overall mix of methods for gathering feedback that course directors identified for their courses was recorded using response options that allowed courses to be grouped by style of feedback. The options listed covered all of the varieties of feedback organisation adopted in the sample courses:

• student representatives on course committee

• questionnaire and student representatives

• student representatives and enhanced methods

• questionnaire, representatives and enhanced methods

• questionnaire only.

Both universities had established procedures for obtaining feedback from students. These consisted of student representatives attending course committees and some form of locally designed written instrument for surveying student opinion. 'Enhanced' methods were taken to include any procedure that extended feedback beyond these minimum requirements set out by the university, such as group consultations, innovative assessment procedures which incorporated feedback on the student experience, and year-group meetings.

Enhancing Feedback

The courses in the two universities that had incorporated enhanced methods for obtaining feedback are listed by subject and Biglan/Kolb dimension in Table 4. In the table the Biglan/Kolb dimensions were coded as follows:

1) Hard (Abstract)-Applied (Active)

2) Hard (Abstract)-Pure (Reflective)

3) Soft (Concrete)-Applied (Active)

4) Soft (Concrete)-Pure (Reflective)

Course Name / Biglan/Kolb Dimension / Organic/Non-Organic Dimension / Level of Course
UDS Business Data Processing / 1 / B / UG
MSc Social Research Methods and Statistics / 2 / B / PG
PGCE Further and Higher Education / 3 / A / PG
PGCE (Primary) / 3 / A / PG
BSc Psychology / 3 / A / UG
MSc Counselling Psychology / 3 / A / PG
MSc Educational Psychology / 3 / A / PG
MSc Clinical Psychology / 3 / A / PG
BSc Architecture / 3 / B / UG
MSc/Dip Disability Management in Work / 3 / A / PG
BSc Clinical Communication / 3 / A / UG
MSc Counselling Psychology / 3 / A / PG
MSc Health Psychology / 3 / A / PG
Clinical Communication Studies - Prep year / 3 / A / UG
MSc Radiography / 3 / A / PG
UDS Fashion and Marketing / 3 / B / UG
Modular Postgraduate Programme / 4 / A / PG
UDS Fine Art / 4 / B / UG
BA Dance Theatre / 4 / A / UG
RM Diploma in Midwifery Studies / 4 / A / UG
Diploma in Nursing Studies / 4 / A / UG
UDS Languages / 4 / B / UG
BSc Women's Studies / 4 / A / UG
Combined Social Science / 4 / A / UG
BSc Sociology and Philosophy / 4 / B / UG
BSc/UG Dip Nursing (ENB Award) / 4 / A / UG
BA Acting / 4 / A / UG
BA Journalism / 4 / A / UG
UDS Cultural Studies / 4 / A / UG

Table 4 Courses with enhanced methods for obtaining feedback.

This indicated that teachers in some subject areas, particularly those identified by the Biglan/Kolb classification as Soft (Concrete), were more likely to supplement the prescribed, 'formal', methods for obtaining feedback from students with additional methods. The latter obtained a wider range of feedback and facilitated an enhanced dialogue with students. The differences here are significant. The chi-square test in Crosstabs was used to indicate this, although the small number of cases in some of the cells meant that the results had to be treated with caution. Placing courses in the Biglan/Kolb dimensions, 41% of courses that shared the Soft (Concrete) dimension had chosen to enhance the methods by which they sought feedback; only 5% of courses in the Hard (Abstract) dimension had done so (p = .00). Using Biglan's Organic/Non-Organic dimensions, 41% of courses in the Organic dimension chose enhanced methods; only 11% of those in the Non-Organic dimension did so (p = .00).
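For readers who wish to see how such a cross-tabulation test is computed, the short sketch below reproduces a chi-square test of independence in Python. It is an illustration only: the study itself used the Crosstabs procedure, and the cell counts shown are assumptions chosen to be consistent with the reported percentages, since the raw frequencies are not given in this paper.

from scipy.stats import chi2_contingency

# Illustrative 2 x 2 table: rows are Soft (Concrete) and Hard (Abstract) courses,
# columns are 'enhanced feedback' and 'no enhanced feedback'.
# NOTE: these counts are assumed for the example; the paper reports only
# percentages (41% of Soft courses, 5% of Hard courses, n = 100).
observed = [[23, 33],   # Soft (Concrete): 23 of 56 courses enhanced (about 41%)
            [2, 42]]    # Hard (Abstract): 2 of 44 courses enhanced (about 5%)

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")

The same procedure extends directly to the Organic/Non-Organic comparison and to the four-category HESA comparison reported below, by substituting the appropriate contingency table.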

The degree to which course directors in engineering subjects in both universities were uninterested in the student experience was quite striking. Although, like many in the human sciences, they were focused on a syllabus largely dictated by the need to meet the standards of a qualification for professional practice, unlike their colleagues in the human sciences they regarded the meaning the course would have for each enrolling cohort as peripheral:

There are no reps since this is only a one year course with only two teaching terms for full time students. The student body is very transient.

(Course Director, MSc Civil Engineering Structures, University A)

Students whose courses contained a substantial amount of laboratory and workshop practice may have found themselves working in isolation compared with other students and this would have meant fewer opportunities for the formation of a learning community. At the same time, however, a number of course directors were able to mention the opportunities that laboratories and workshops created for gaining informal feedback from students.

In many of the hard science, technology and mathematics courses that had no additional methods for obtaining feedback, there was a lack of appreciation of the potential of staff/student dialogue generally:

The value of informal feedback is questionable. Is it representative? Students have to come to the Subject Area Coordinator's office and they come with 'gripes' ie they have to identify an issue and then come to see the tutor. Students are very ready to complain.

(Course Director, BSc Business Information Systems, University B)

Sometimes even a lack of skill in facilitating group work seemed to be a problem:

We have tried going to the individual lectures to get feedback but this is not satisfactory because everyone speaks at once.

(Course Director, BSc Information Technology, University B).

The numbers of cases in some of the HESA categories were very small, so a comparison was made between the four categories with the largest numbers of cases: subjects allied to medicine; mathematical sciences and informatics; engineering and technology; and social studies. As with the Biglan/Kolb dimensions, there is a relationship between academic area and the methodology selected (p = .05). Courses in the subjects allied to medicine and social studies categories were more likely to have enhanced feedback methods.

The majority of those courses in both universities which fell within the HESA category of subjects allied to medicine and nursing, and which were placed in the Biglan/Kolb dimension Soft (Concrete), had supplemented the requirements of the university with locally designed, enhanced methods. One non-epistemological explanation for this could be the greater and earlier exposure that health professionals have had to quality assurance methodologies. As far back as the early 1980s, health care became one of the first public service sectors in which quality schemes were initiated, and non-physician health practice professionals have been particularly active in encouraging quality assurance (Whittington and Ellis, 1993). Furthermore, there is a long tradition of rigorous professional training for health professionals, so that the maintenance of standards, which is part of the professional culture, is built into the training process. Above all, Whittington and Ellis point to the increasing use of cyclical models for quality improvement in which the process moves through standard setting, task, review and reform. These techniques, incorporating patient satisfaction surveys, can be transferred more or less directly to the teaching of professionals in higher education. This observation would also agree with the claim by Sarvimäki (1988) that health practice professionals have a particular concern to transmit their common culture in order to ensure the cohesiveness of a subject area that embraces more than one discipline.