Draft Response to the HEFCE Consultation on the RAE

from the University of Surrey

  1. Role, Purpose and Principles
  • There is a need for the RAE, as it is difficult to see how the funding council could distribute research funds selectively without some form of research assessment. This underpins the dual support system of funding, which we also support.
  • The RAE has been based upon grading research quality. This grading, together with a volume measure, has been used to determine research funding from the funding councils. By and large the process of determining research quality has been fair and is accepted by the academic community. Most of the argument and criticism has arisen from the mapping of quality into resources. We recommend much greater transparency in the process of allocating QR (illustrated at the end of this section).
  • HEFCE has not, in the past, entered the policy arena in mapping quality to resources, but we feel that the state of research in some areas critical to the UK economy now warrants a review of this policy.
  • We would fully endorse the fundamental principles on which a selective research funding methodology should be based, as adumbrated in the UUK preliminary response (para 21 – 1/02/126(a)). We would further recommend that these be adopted by the Committee as the steering criteria in producing a new process.
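  • For illustration only (the precise formula is a matter for HEFCE, and the symbols below are hypothetical), the mapping of quality into resources whose transparency we seek has broadly taken the form

      QR_u ∝ w(g_u) × V_u × c_s

    where g_u is the grade awarded to a submission u, w(g) the funding weight attached to that grade, V_u the submission's volume of research-active staff, and c_s a subject cost weight. It is the values of w(g) and c_s, and the way they change between exercises, that most need to be published and explained.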
  2. Critique of RAE 2001
  • The introduction of the RAE process has had positive effects on the quality and quantity of research in the UK. There are many metrics to support this, including citation analyses, journal impact analyses and international comparative league tables.
  • The fact that there have been movements in both directions (up and down) in virtually all of the UoAs indicates that the process is dynamic and reflects change.
  • There is evidence of non-uniformity across subject groups, and a suspicion that the preservation of particular disciplines played a part in the 2001 gradings.
  • Other areas that come under criticism are:
    – games played with volume (these can be eradicated by insisting on all staff being submitted)
    – difficulties in assessing multidisciplinary activities
    – UoAs not well matched to current institution groupings
    – insufficient definition of criteria from individual panels
    – insufficient precision in definition of ‘quality’ for output
  • We feel that the rankings were in the main fair, but HEFCE's subsequent inability to fund the outcome fully has blurred any objective assessment of the process.
  3. The Process
  • We do not feel that a system which attempts to combine the assessment of teaching and research would be feasible or desirable.
  • Our first reaction is a preference for a mix of expert (subjective) assessment and a clearly specified range of objective data. The weighting of subjective versus objective data, and the categories of objective data used, will vary from UoA to UoA (research income, different ‘types’ of publication, citations/citation indices, PGR completions etc., and perhaps others).
  • The above is very similar to the process employed in the 2001 exercise, but its application was very ad hoc across the panels. We would suggest that the process could be improved by prior publication by each panel of (i) the set of auditable objective measures and (ii) the weightings given to them (an illustrative scheme is sketched at the end of this response). In the last exercise the ‘instruction book’ was far too generic to be useful.
  • We are not in favour of adopting other forms of assessment based solely on metrics/historical data or on self-assessment; both are inaccurate and open to manipulation.
  • We favour a scheme based mainly upon retrospective assessment but with some weighting to prospective assessment, as operated in the 2001 exercise.
  • We feel that the assessment of multidisciplinary research will need special attention, with small groups of expert assessors appointed for it rather than the cross-referral approach between panels, which did not work.
  • We support the proposal to review the UoAs so that they map more closely onto the current organisation of HEIs, even if this leads to more UoAs. We would not support a reduction in UoAs. Assessment on the basis of disciplines (UoAs) should be retained; we would not support wider groupings, and especially not whole-institution assessments.
  • There needs to be a more precise definition of ‘quality’ from the panels. In particular, in Science and Engineering we would support the recognition of quality in outputs other than published papers, for instance in enterprise areas. Even with published papers there needs to be a qualification as to impact (‘what has resulted from the work’), although we recognise that there can be a significant time lag before some impacts emerge.
  • We are in favour of using some external metrics, such as EU income and involvement in national and international programmes.
  • As already noted, most of the ‘games’ have been played around volume measures, and we would support a move to the submission of all staff, with the panels determining the percentage who are research active.
  • A system conducted on an approximately five-year cycle by a single national body at a single point in time removes many of the potential objections and is much more acceptable to the community at large. Our Arts colleagues would prefer a longer cycle.
  • If one were to look at reducing the time and effort the RAE consumes, we would suggest a light touch for those gaining 5* in the last two exercises; this assessment could be purely metrics-based. In addition, one might consider a lower cut-off limit to reduce the number of submissions.
  • We would support the retention of a grading system similar to that currently employed, but perhaps modified slightly in its descriptive terms for better clarity and stretched at the top end to differentiate the real international stars. We consider that panels can rank-order submissions within a UoA fairly accurately, but more thought needs to be given to grade assignments in terms of national/international excellence, if indeed grades are to be retained.
  • We recognise the difficulties that ensue from hard grade boundaries and the transitional funding problems for those moving across them. Consideration could be given to phasing in such transitional funding.
  • Moves to encourage joint submissions across institutional boundaries where there is genuine collaborative working (say on a regional basis) would be welcomed.
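  • As an illustration of the prior publication of measures and weightings recommended above (the formula and categories here are hypothetical, not proposed values), a panel's published scheme might take the form

      S = α·E + (1 − α) · Σ_i w_i·m_i,   with Σ_i w_i = 1

    where E is the expert (subjective) assessment, the m_i are normalised objective measures (for example research income, publication quality, citation indices and PGR completions), and α and the w_i are fixed and published by the panel before submissions are made. It is the advance publication of α and the w_i that would remove the ad hoc character of the 2001 application.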