Office of Student Affairs
Corporate Management Team
Student Experience Committee
What our students are telling us – and what we are doing about it
2006 was the second year in which we received feedback from two large-scale surveys of student opinion: 1,664 final year undergraduate students (including students at regional partner colleges) took part in the National Student Survey. 1,253 students drawn from all levels of study (but only from “core” Anglia Ruskin) responded to our own Student Experience Survey.
The Student Experience Survey has been widely circulated and discussed in a number of university bodies. An outline paper on the results of the NSS has similarly been discussed, and Faculty colleagues have analysed the data published by subject on the TQI website. The two surveys have been discussed by the then Senior Management Group, by the Student Experience Committee, by all Faculties and by Support Services. The Students’ Union has been consulted both formally and informally. Our evaluation of the outcomes has been in two phases this year: initially, nine preliminary action points were identified in discussions with Faculties and University Registry with the intention of making immediate improvements which would have an impact in semester two. The second phase has revisited Faculties but has also included the Students’ Union and relevant Support Services.
The Vice Chancellor has asked Faculties and Support Services to report back on matters of significance to them and to propose improvements in areas identified by the surveys. Action plans based on these proposals will be built into next year’s budgets so that any necessary action will have begun within a year of the two survey dates.
Dr Paul McHugh
Director of Student Affairs
5th February 2007
Student Experience Survey 2006
1. The 2006 survey was again carried out by the Centre for Research and Evaluation at Sheffield Hallam University and uses a well-tried methodology developed by its Director, Prof. Lee Harvey. Students were asked to express their satisfaction (or not) with an aspect of our service and then were asked to say how important this aspect was to them. The results are tabulated in an A to E range but with a further refinement in that capital letters are used for very important aspects, lower case for important and bracketed lower case for not so important ones. While all returns are considered, it is those in capitals which tell us where we are doing significantly well or (fortunately less frequently) significantly badly.
2. Immediate improvement has been required in the case of any aspect of service scoring:
E: Urgent need for immediate action
D: Action in these areas has high priority
C: This area to be targeted for future improvement
There are, in fact, very few scores in this range. The overall experience is a positive one: over a quarter (27%) of students responding to the survey said they would definitely recommend APU to a friend (though this is down from a third in 2005), and a further 52% said that they probably would. Against this 79% satisfaction rating, only 6% would definitely not recommend APU and a further 16% would probably not. The trend is very slightly in the “wrong” direction and we should not ignore this. Another useful test of how well we are doing is to look at the aspects of service in the order of importance which students attach to them. As last year, of the top ten aspects, 4 are scored at A and 6 at B. The highest aspect with a dissatisfaction score (and it is an E for car parking again) comes in at 31, but this is lower down the scale than last year (22), which suggests that it is becoming a little less significant for our students.
3. Students were also invited to offer additional written comments and many did so. These comments are of course anonymous so that no particular student can be identified. 953 students commented and there were 1,759 separate comments, all of which have been read and analysed. The largest group were, as last year, about car parking (138), with comments on teaching quality (122) and the University Library (119) not far behind. Interestingly, the next highest category was communication (111), which is surely significant. We invited students to suggest one positive improvement which they would like to see and we received plenty of suggestions, some of which we should be able to implement. The additional comments help us better understand the nature and urgency of issues; while some of them are exasperating or plain wrong-headed, the majority are thoughtful, pained and even humbling. University staff would do well to read them and I would suggest wide dissemination within Faculties.
4. There now follows an analysis of E, D and C scores together with the responses of the relevant Faculties and Support Services. A brief statement of the nature of the aspect is followed by comment from the Faculty or Unit. Attention is occasionally drawn to aspects where sub-sets of students have scored at C, D or E, though these should be treated carefully as the size of respondent groups may, in some cases, be very small.
5. Learning and Teaching
Learning & teaching scores were generally good though slightly down on 2005. We do very well on two questions – development of communication skills and starting classes on time; otherwise scores are B, i.e. students regard L&T as very important and are satisfied.
Questions scoring C [very important but only adequate] and below were:
5.1 Quality of placement/work experience, where AIBS and S&T both score C; IHSC, which scored C in 2005, has moved to A as a result of a successful action plan following last year’s survey.
5.2 Expenses associated with placement/work experience. AIBS and IHSC both score C; is this simply a function of extra costs which are understandably resented? The IHSC placements action plan shows that a good deal of effort goes into minimising cost for students.
5.3 Quality of feedback on assignments. AIBS and S&T both score C. This is disappointing because S&T has come down from B and AIBS remains at C despite its 2005 response: “AIBS undertakes to maintain its efforts and will work to disseminate and secure the adoption of good practice. Heads of Department have already taken up with relevant staff those instances where feedback has been deemed to be in need of improvement”. [SES feedback 2005]. The NSS also picks up quality of feedback.
5.4 Promptness of feedback. Our worst L&T question. AIBS and ALSS score C and S&T has come down from C to D [very important but unsatisfactory]. This cause for concern was reflected in the NSS too. In semester one, 2006-07, we therefore re-affirmed our commitment to returning “in-semester” assessments within twenty working days and published (on student ANET and noticeboards) details of where to collect marked work (S&T at Cambridge were disappointingly unable to agree a simple process for this). We hope that this emphasis will be reflected in higher scores in 2007.
5.5 Cost of course materials. ALSS scores C, presumably in the School of Art at Cambridge? Young, female 2nd year undergraduates appear to be the most concerned.
5.6 Value for money. AIBS as last year, though the emphasis has shifted from postgraduates to 3rd year full-time undergraduates. Scores for this crucial question are otherwise B.
6. Student feedback
We asked questions about feedback from students in relation to pathways and in relation to modules. There was a clear campus differential with Essex students scoring at B and Cambridge students scoring at C. Younger students were less satisfied than older and this also correlates with the age profiles of the two campuses.
There are also clear differences between Faculties. IHSC and Education score B while AIBS and S&T score C. Associate Deans have agreed to review their Faculty approaches to the use of feedback in the light of this evidence.
A further question about student representation at pathway level had a similar response but students said that this was not as important to them as feedback.
7. University Library
The University Library’s ratings remain overwhelmingly positive. Students in all Faculties rated the Library overall at A and its staff at A for both competence and attitude towards customers. Almost all its aspects of service were regarded as very important and a high proportion of these were scored at A.
There is a striking difference between the preponderance of A scores at Essex and B scores at Cambridge. The library recognises this and explains it by noting that the Cambridge library was in the throes of a large-scale improvement programme during the fieldwork period in Spring 2006. The improvements as a result of this are:
- An improved study environment
- New open access PCs in an expanded Learning Zone
- Complete re-shelving of the Library
- Modified opening hours
The Library expects that these improvements will have an impact on scores in SES 2007.
Although “access to recommended reading” scores B overall and over all Faculties, some 2nd and 3rd year undergraduate students reported this as C. Academic Liaison Librarians will be working with faculties on this; the key to availability lies in the Library receiving module guides in good time so as to be able to respond. Loan periods are periodically reviewed and the Library believes that it is sensitive to its readers’ opinions on this.
The Library website is actively managed so as to serve readers’ needs. There is an opportunity to make comments on every page of the website and changes are made in response to these comments; for instance an inappropriate search engine has recently been replaced.
8. Information Technology facilities
There has been no significant change since last year. Ratings for IT provision were good, though usually at B rather than A. The only aspect which concerns some students is the cost of printing, which is rated at C overall, and at C in all Faculties except IHSC. However, the latest comparative review of charges across universities (for the period August 2004 to July 2005) shows our charges (currently 5p per B&W A4 and 30p per colour A4) were in line with other universities, and C&ITS believe that this remains the case. It is possible that students feel that charges are high because they are printing much more than they used to; the average number of pages printed per FTE student has doubled over an eight-year period.
9. Processes and Procedures
Students continue to tell us that they are unsure about how to complain. The position remains unchanged from last year: students in AIBS, ALSS and S&T rated this aspect at C and were likely to be full-time or sandwich undergraduates. Last year we revised and re-issued the standard complaints notice, and Student ANET carries information on the procedure prominently. This procedure seems less clear to students than those for extensions, mitigation and submission of assignments, but perhaps this is because it is much less used than the others.
Students know where and how to submit assignments, but there was some uncertainty about where and how to collect marked work amongst Cambridge students and particularly S&T students. We have therefore publicised more widely all information on collection and we intend to move to a position where marked work is collected from fewer offices which will remain open for longer periods. It is worrying, though, that it has proved difficult to collate accurate information and, as already noted, some arrangements are complex, and unfortunately this is particularly the case for S&T Cambridge.
Opening hours of administrative offices are now generally acceptable, but some students tell us that they are unsure as to which office might handle a particular issue and there is still some confusion about notification of results. The opening of the Student Information Centre in Cambridge provides a first port-of-call for students on that campus; staff in the SIC are effective at referring students to other offices. An SIC for Chelmsford will open in 2008. Notification of results is principally through e-vision which we believe students have no problems in using. These returns are therefore a puzzle which will be further investigated.
It is worth noting that although there are more C scores in this category than elsewhere, students have a generally good opinion (B scores) of our administrators and of our systems.
10. Information and Guidance for Students
Scores are good for most aspects of Information and Guidance, except for Wise up!, the Anglia Ruskin newspaper, which is liked but not thought important, and for the availability of information about the new 15-30 curriculum, which had poor scores and an unsatisfactory D from Cambridge students. We are reviewing the future of Wise up! as part of a wider review of communicating with students. The 15-30 score was very disappointing as a lot of effort had gone into communicating this. We now believe that this reflects the difficulty of interesting students in a future development which seems remote and which will not affect something like 25% of them. A curriculum website is being developed to carry messages to students about the implementation of the new curriculum.
11. Support Services for students
These services are not rated for importance since only those who have used them are prompted to evaluate. There has been a slight increase in satisfaction overall and many services score As amongst all categories of student. Childcare at Essex, however, scores a C, reflecting the recent closure of the campus nursery. We are supporting students to find appropriate childcare places at other nurseries and have retained a Childcare Adviser specifically to help with this. The Accommodation Service’s scores have regressed from B to C overall, with AIBS and IHSC students now rating at D. The service is examining its counter service provision and has upgraded kitchens and accommodation during 2006 as a response to this. It notes that an insufficiency of accommodation will usually result in lower scores for the service as a whole; thus a D from full-time undergraduates is unsurprising when we are unable to offer university accommodation to all who request it.
12. University and Students’ Union facilities
As last year, students are less concerned about facilities which they regard as only “important” or “not so important” as against “very important”. They see common rooms as being a weakness, particularly in Cambridge; free text and anecdotal comment suggests that investment in pleasant places to sit and chat, without necessarily having to enter a bar, is what is required. We have re-organised the entrance and restaurant area at Rivermead in response to this. The balcony coffee bar and adjacent seating area in Cambridge are also highly rated. We need to make similar provision as we further develop our estate.
SU and sports facilities are generally seen as satisfactory but are not important for all students. This and the campus differential are probably reflections of our student profile.
The two aspects which are regarded as “very important” are also well regarded: Eating and toilet facilities are generally scored at B; the Cambridge toilets are at A.
13. Your environment
Students think that our campus environments are good, though this isn’t a key issue for them. They tell us that they can get to campus easily enough and this is an important consideration at B. However, as last year, they rated Car Parking at E – that is to say very important and very unsatisfactory. In virtually all sub-sets it was also rated at E (Education students were a little less irritated at D – they also rated campus environment highly at A!).
The position at Cambridge remains as it did last year when we said:
“… it is important for students to realise and be reconciled to the fact that there is no possibility that parking can be provided at East Road. There is obviously no space… We need to publicise the reality of the Cambridge parking situation in an honest and upfront manner, and the management of disabled parking needs to be overhauled and, again, better publicised since the position is more satisfactory than students appear to realise.”
We do not appear to have persuaded disabled students that they will be able to park on campus and we will increase our efforts to alert them to this.
The position at the Rivermead campus has improved since last year. We have opened a new car-park next to the Tindal building and have just completed phase A of the North car park. Together, these developments have added 374 spaces and, with the completion of the Health building, it should be possible to reclaim some construction site space for parking. However, we need to stress that the new campus will never be able to meet total demand, even if we were to provide spaces up to our planning limit of 901. Introducing a modest charge which has been set so as to be competitive with Chelmsford long-stay parking has had the desirable effect of reducing demand to some extent – which helps students who have to drive in, say for distance or family reasons, to find spaces.
National Student Survey 2006
1. Background
Fieldwork for the 2006 NSS was carried out between February and April 2006. Anglia Ruskin’s eligible sample consisted of 3,064 final year/final phase undergraduate students in all HEFCE and TDA funded programmes. NHS-funded students are not included. Regional partner college students are included.