Access course feedback: the interactive effects of research
Kate Day, Centre for Teaching, Learning and Assessment and Joanna Highton, Centre for Continuing Education, University of Edinburgh
Overview of the research
Our collaborative research concerns the monitoring of an established Access Course, run jointly by the University of Edinburgh and Stevenson College, for entry to university studies in the arts and social sciences. The original focus of the monitoring project, which was additional to routine course evaluation, was to explore how effectively the course was fulfilling its transitional function. How well was it preparing adult returners from various backgrounds for what they would experience as undergraduates, academically and otherwise? Undergraduate discontinuation rates and course results were crude measures, and other information was anecdotal. We therefore wanted to take a closer look at the workings of the course from the student perspective, and also to test the viability of tracking the experiences of access-route undergraduates. The aim was to clarify the strengths and weaknesses of the Access Course and so generate information and ideas useful to its further development.

We went into the research enthusiastically. JH welcomed the opportunity to learn more about doing research, especially if the course which was her mainstream responsibility would thereby benefit. For KD it was a chance to re-engage in practical research in an interesting area. But we were rather less well placed as regards resources, and it was clear that the work would have to be done in marginal time.[1]
In October 1990, the twenty-six students from the ’89-’90 Access Course cohort about to embark on studying at Edinburgh were invited to a reunion meeting. Twenty-three attended and filled out a self-description sheet indicating how, on a number of dimensions, they had changed during the access year. They also took away for completion a background information sheet and quite a lengthy questionnaire, which looked back over the course and took stock of their current intentions and expectations. In January ’91, the students were invited back to share and discuss the pilot phase findings, which were more widely disseminated in March at an Access-Related Research Forum.

The main research phase involved the ’91-’92 Access Course cohort (n=73). In November ’91 we briefed students on the monitoring project and gave out a somewhat shorter questionnaire (Q1). Queries included reasons for taking the course, any off-putting factors, family and friends’ attitudes, outside responsibilities, financial concerns, confidence in academic skills, time spent, sources of help and support, and likes and dislikes about the course so far. 68% of students responded (13/25 men, 37/48 women). In February ’92 a probe questionnaire sent to selected Directors of Studies asked about the preparedness of former Access Course students for university study and any particular strengths or difficulties. The second student questionnaire (Q2) was administered in May ’92. It asked about workloads over the two terms, reactions to coursework extensions, availability of books, self-assessment capabilities, and the students’ two chosen subjects. The response rate dropped slightly to 63% (14 men, 31 women). The same self-description sheet used in the pilot was distributed in October ’92 and produced thirty-five replies.
Introduction to the paper
We wish to discuss the interactive effects of the research just outlined. This is not altogether easy because our research was only one of several sources of information about a course which views teaching and learning as a joint enterprise and so encourages open communication amongst all concerned. Our findings were thus set alongside and triangulated with what was already coming through from students, staff tutors and external examiners. Nevertheless, this research did exert specific influences which affected what happened on the course, the nature of the student experience and us as researchers. At the same time as the research was having an impact on the inter-connected web of course structures and processes, the learning opportunities available to students, and the confidence and commitment of staff as managers and teachers, the research was itself also being shaped. It is this dynamic and reciprocal relationship between research, the researchers and the researched that will now be illustrated.
Course design and development
A. The focus in all three questionnaires on students’ workload and acquisition of academic skills, and in the self-description sheets on the personal impact of the course, arose from concerns central to the transition function of the Access Course. How well was the course striking the difficult balance between, on the one hand, being sufficiently challenging and enough of a preparation for the demands of university work and, on the other, not leaving students who were still developing and unsure of their higher-order abilities feeling threatened and incapacitated by the amount and level of work required?
This was a live issue for actual and prospective students, the course management, and staff in the receiving institution, including Directors of Studies. And the research findings were able to give several useful pointers about relevant aspects of the student experience.
- Clusters of items from the self-description sheets showed that students perceived the course as moving them towards being more confident (‘confident’-‘determined’-‘clear-sighted’-‘interested’) and competent (‘knowledgeable’-‘organised’-‘resourceful’-‘analytical’). At the same time, half felt more or much more stressed, over forty per cent more or much more anxious, and almost a quarter less or much less healthy.
- As regards the development of confidence in specific competencies and the attendant support needs, the students indicated having less confidence in the more narrowly academic tasks, such as taking exams and writing essays, and rather more in areas, such as studying on their own and reading effectively, where they could presumably draw on more transferable skills.
- In the matter of how much time per week students spent on their coursework and preparation outwith classes, the consistent finding was that for a clear majority of students the load was close to or just under the target twenty-hour mark. This confirmation was reassuring to staff running the course and advising applicants, and useful in on-course guidance.
- The workload findings were also influential in helping to determine how much writing experience was appropriate during term 1 in order to equip students for the raised expectations and specialist demands of term 2. The original sixteen short assignments had resulted in students performing well subsequently, but feeling overstretched in the first term. Halving the number of assignments led to students experiencing difficulties in meeting the second-term standards, and did not reduce the sense of overload. Since the amount of time actually put in by most people remained close to twenty hours, the compromise solution was to revert to setting sixteen assignments, but to require completion of only twelve. This has raised performance levels and, by allowing for choice, enhanced the course’s promotion of student autonomy. The changes have also become part of the course’s folk history, demonstrating responsiveness to student needs and commitment to course aims. When students were asked in Q2 to compare how challenging the two terms were for coping with the total workload, opinion split nearly evenly (55%/45%). The rating of term 2 as more challenging for ‘planning out study time’ and ‘completing written work in time’ accords well with the intended progressive nature of the course.
B. A second prime concern, which stemmed from the involvement of a number of tutors and the multi-disciplinary nature of the course, was the harmonising of standards and practice so as to provide comparable sets of learning opportunities for students.
- One obvious tactic was to ask the same questions about each of the different courses, which was done in Q2. This produced rich data about the difficulty and amount of course material, presentational pace and style, the usefulness of handouts, tutor feedback and availability, whether the course was as interesting as expected, and how students felt they were getting on. It was a useful check on, and sometimes provided explanations for, impressions gained by other means. The information helps receptive individuals to improve particular courses, and for newcomers it highlights ‘good practice’ (such as the blending of tutor input with opportunities for student participation). Knowing what subjects students took also made it possible to unpack, by means of cross-tabulations, general responses which otherwise were not very informative. A good example concerned ‘access to books and other resource materials’, which produced a 49% yes/49% no answer that concealed important subject variations (the sketch after this list illustrates this kind of breakdown).
- The issue of tutor feedback on student work, which was associated with efforts made to improve essay-writing skills, was addressed in both Q1 and Q2. Students rated comments on scripts, the assessment grid, and tutorial and individual discussion as useful. But their open-ended responses revealed that what had been intended as standard practice across the course was in some cases not happening or not working as envisaged. It became clear that students wanted individual sessions but were reluctant to take up offers available on request, however approachable the tutor was and however much they desired more feedback on their intellectual development and potential. With this knowledge, JH as course coordinator was in a strong position to recommend changes, which included requiring the use in term 1 of the summary assessment grid (revised to encourage more explicit tutor feedback and general progress comments), and the submission part way through term 2 of brief reports on individual students. In addition, the lecture sequence in the first term was halted at two strategic points for formally timetabled individual tutor-student meetings, instead of relying on informal arrangements.
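By way of illustration, the following is a minimal sketch of the kind of subject-by-subject cross-tabulation described above, written in Python with pandas purely as a modern analogue. The original analysis was carried out in Statview, and the subject names and yes/no responses below are invented for illustration rather than drawn from the study’s data.

    # Hypothetical illustration of unpacking an overall yes/no response by
    # students' chosen subjects. The data are invented; the study's own
    # analysis used Statview, not Python.
    import pandas as pd

    # One row per respondent: chosen subject and whether they reported
    # adequate access to books and other resource materials.
    responses = pd.DataFrame({
        "subject": ["History", "History", "History", "Sociology",
                    "Sociology", "Sociology", "Economics", "Economics"],
        "books_available": ["yes", "no", "no", "yes",
                            "yes", "yes", "no", "no"],
    })

    # The headline figure: a near-even split that is not very informative.
    print(responses["books_available"].value_counts(normalize=True))

    # Cross-tabulating by subject exposes the variation the headline conceals.
    print(pd.crosstab(responses["subject"], responses["books_available"],
                      normalize="index"))

In the study itself, the equivalent breakdown was produced from the Q2 responses together with the record of each student’s two chosen subjects.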
Students and the research process
From the outset we wanted students as collaborators whose reactions and responses were important in helping to shape the evolution of the course and the research. Accordingly we have consciously fed findings back to them, kept them in the picture about any consequential action contemplated or taken, and been alert for signs that participation in the research process might be having detrimental rather than positive effects.
- One side benefit has been the contribution made by the research to building up a sense of involvement in the course and of continuity among Access students. They know that what they have to say is listened to and gets taken into account. They also know something of how their contemporaries and predecessors have reacted and can take comfort from realising the ‘normality’ of, for instance, having a very mixed bunch of reasons for doing the course, finding it difficult to make UCCA choices, worrying about finances or personal relationships, doubting whether they are ‘really up to it’ and considering leaving the course.
- Effects in the reverse direction resulted from our growing awareness that once people become undergraduates they want the freedom to choose whether and when to retain or shed their access identity. A further questionnaire already drawn up began to feel unduly intrusive. It was important to keep the interests of students centre stage and so, for ethical as well as pragmatic reasons, we abandoned ideas of trying to keep close tabs on their undergraduate experiences and re-scoped the research.
The research process and us
The collaboration presented us each with different problems. For JH there was the tension between running the course and monitoring it, plus the danger of the technical aspects overwhelming and alienating her. For KD the project was at the periphery of formal professional responsibilities so that sustaining sufficiently close contact and doing the necessary ‘busy’ work was sometimes hard.
Yet there were several ways in which our different backgrounds and motivations turned out, as we had hoped, to be very productive.
- One was the welcome curb put on the temptation, which we both experienced, to pursue what was ‘interesting’ – from either a ‘knowledge about access students’ viewpoint or from a methodological perspective. The constraint operated because we effectively obliged one another to weigh up and to justify the practical costs/benefits likely to accompany particular lines of enquiry.
- A second gain was the sharing and exchange of expertise. This meant that as we grew more knowledgeable, about the course and research respectively, we became more effective in focusing and framing questions. Thus the number of areas which we failed to tap satisfactorily definitely declined and the questionnaires got shorter. They also became more collaborative in tone as we felt more able to preface specific questions with contextual information that helped respondents appreciate why certain aspects were being targeted.
- The third benefit was the holding up of mirrors to one another’s practice. ‘Naive’ questions were not only challenging but had the effect of causing us to articulate fundamental assumptions about the course or about doing research. During the construction of Q2, for example, JH wanted to include a question to gauge perceptions of ‘locus of control’ in the learning process. KD’s attempt to link this general notion to some concrete feature of the course led to the identification of a proxy measure (students’ self-rating of their ability to evaluate their own written work before submission for formal assessment). But it also set JH wondering whether the course was being sufficiently proactive in fostering the development of autonomy in learning. One of the practical results was further encouragement of self-help groups, and the inclusion of guided study groups as an integral part of the new evening Access Course.
In conclusion, we have certainly learned a lot – about the course, about Access students, about research and about ourselves. Even this paper’s attempt to identify and trace through some of the elements within the powerful web of interactive effects has been instructive. Moreover it has been good to see the benefits which have accrued to the course and its participants as a result of the research.
[1] Statview on an Apple Macintosh was used for the quantitative analysis. We are grateful for the processing work done in the later stages.