Talking Each Other's Language: Are OfSTED and the LSC Really Singing From the Same Hymn Sheet?

Michael Hammond

Paper presented at Directions in Educational Research: Postgraduate Perspectives, University of Leicester, 23-24 July 2003

Abstract

This paper examines the effects of OfSTED inspections on Further Education (FE) colleges, and the Learning and Skills Council's (LSC) possible use of them within the FE sector. It considers the ethos of teaching and learning in relation to OfSTED, and how this may shape the way managers within FE colleges manage. It then compares the OfSTED ethos with the LSC's apparent ideas about inspections and their potential aftermath, particularly in relation to the rationalisation of the FE curriculum, and the way those LSC requirements may impact on college managers. The paper seeks to identify a dichotomy between the needs of OfSTED and the needs of the LSC for managers working in FE colleges, and suggests that although OfSTED and the LSC may think that they are singing from the same hymn sheet, the reality may be somewhat different. The paper suggests that this may cause managers in FE colleges difficulties in the management of objectives and staff.

Introduction

The LSC was given responsibility for the funding, planning and control of all FE colleges and private training providers offering training in return for Government funding. The LSC took over these responsibilities from the Further Education Funding Council (FEFC) and the Training and Enterprise Councils (TECs) in April 2001. Prior to the creation of the LSC, the FEFC had funded FE colleges, whereas the TECs had funded all private training providers (Ainley and Bailey, 1997). In composition, the LSC resembles both the FEFC and the TECs in that it has a national centre in Coventry (like the FEFC) and forty-seven regional arms called Local Learning and Skills Councils (LLSCs), which resemble the former TEC structure. These LLSCs have responsibility for implementing and managing regional policy, subject to national policy directives determined by central Government through the national LSC.

Methodology

When undertaking a piece of policy research, the concept of ‘methodological eclecticism’ has been held to reign supreme (Finch, 1985; Troyna, 1994). Published policy research has, for example, been based on a methodology consisting entirely of interviews (MacPherson and Rabb, 1988), or has used primary source documentation as the sole source of data (Slater and Tapper, 1981). A ‘partial’ ethnographic approach to policy research has also been used by other researchers (Bowe, Ball and Gold, 1992; Walford and Miller, 1991). In designing the methodology for this research, the researcher used interviews and primary literature. The interviews were semi-structured, and involved a DfES Executive, an LLSC Chief Executive, an OfSTED inspector and an FE college manager. The primary literature took the form of Government DfEE/DfES publications, and LSC and OfSTED circulars (Cohen and Manion, 1994).

Setting the Political Agenda for OfSTED and ALI Inspections?

The Learning and Skills Act (2000) made the LSC responsible for managing the quality of provision in the new learning sector through the Office for Standards in Education (OfSTED) and a new Adult Learning Inspectorate (ALI), which was tasked with inspecting all other adult learning in the sector, excluding full-time sixteen to nineteen year old students. Initially, as part of developing a national programme of inspection for FE colleges, OfSTED and ALI engaged in five pilot inspections of FE colleges (OfSTED, 2001b; 2001c; 2001d; 2001e; 2001f). The FE colleges inspected generally failed to impress the new inspectors, and two of the five colleges were deemed to be offering inadequate quality of provision in respect of teaching and learning, as well as weak management (OfSTED, 2001c; 2001e). As a result of these inspections, the Principals at two of the colleges took early retirement soon afterwards.

Only one college was deemed to be offering a programme that was outstanding (grade 1) (OfSTED, 2001f). In addition, across all the colleges only twelve curriculum areas were offering curriculum that was deemed good (grade 2), with the rest being satisfactory (thirty-three, grade 3) or unsatisfactory (fourteen, grade 4). A comparison with the former FEFC inspections of those colleges would probably show that the number of grades awarded at one and two was much higher than under the new OfSTED regime. Politically, the LSC exploited the issue of the quality of curriculum provision in FE through a Radio Four ‘Today’ programme on quality standards in FE, in which both OfSTED’s Chief Inspector and the LSC Chief Executive attacked FE colleges. The OfSTED Chief Inspector went on record as saying:

“We’ve inspected five colleges. Of those five, two have been given the tag of “inadequate”. In three of the five, the management was considered to be less than satisfactory and across all five colleges we found a rather large and in some cases disturbing amount of teaching which is less than satisfactory.” (Tomlinson, 2001)

The Chief Executive of the LSC, however, went much further. He said:

“We reckon about 40 per cent of the provision across the whole sector is just unacceptable in terms of the quality of the learning and the provision which takes place. And of that, we think about five per cent of the sector is appalling.” (Harwood, 2001a).

There was an almost instantaneous backlash against these comments from FE colleges, with most of the colleges' wrath being directed at the Chief Executive of the LSC (Nash, 2001). The Association of Colleges went straight on to the offensive against Harwood in an open letter:

“We are currently, as you know, preparing our bid for the spending review. This is, of course, taking place at an extremely sensitive time, given all the strains and pressures caused by terrorism around the world. An inaccurately critical review of the sector has, therefore, an even greater potential for damage at this time.” (Gibson, 2001)

The LSC claimed, however, that the figures used by Harwood were justified because they came from the FEFC’s own figures, which showed that eleven per cent of provision was outstanding, fifty-one per cent good, thirty-two per cent satisfactory, and six per cent unsatisfactory or very unsatisfactory. It has been suggested that, as the FEFC used a 1-5 scale whereas OfSTED uses a 1-7 scale when undertaking classroom observation, the LSC was confusing OfSTED grade 4, which is satisfactory, with FEFC grade 4, which is unsatisfactory (Nash, 2001).

Given the political fallout, an apology from Harwood was probably inevitable, and he duly wrote to 400 college heads both to apologise and to imply that he had been quoted out of context. He said:

“The effect of my words on Radio 4, and their interpretation were neither what I intended nor wish to allow to persist. They were however, my words. I need not only make it clear what I meant but to say sorry for the effects they have had on you and your colleagues.” (Harwood, 2001b).

Pressure continued to bear down on Harwood for a time when, in a letter to the AOC, the BBC rejected Harwood's implicit suggestion that it had misrepresented him or taken his words out of context. For a short time, Harwood again faced calls for his resignation (Hook, 2001). Sanderson, the LSC National Chairman, confined himself to a public rebuke of his embattled Chief Executive in an interview with The Guardian. He said:

“The words chosen weren’t very smart and don’t give due credit to what I think is a very under funded part of the education sector.” (Kingston, 2001)

In the same article, however, it was alleged that the LSC national press office had contacted Local Learning and Skills Councils (LLSCs) by e-mail on 29 September 2001, saying that it wanted the ‘forty per cent of provision in FE colleges is inadequate’ story in the press (Kingston, 2001).

In focussing on the quality of curriculum within the FE college sector, the LSC was following a path already trodden by central Government. David Blunkett, the then Secretary of State for Education and Employment, stated what he perceived to be the issues surrounding quality in FE. He said:

“In general further education too, there is excellence of which we can be proud. But there are weaknesses that result in too much variation in standards that we must tackle. Some relate to poor advice and guidance for young people about choices at 16, which I will touch on elsewhere, others relate to the standard of provision young people receive. The Government will continue to emphasise raising standards, and the Learning and Skills Council will intervene to support the weakest providers and offer constant challenge to the rest.” (DfEE, 2000, p1)

The Government then went on to say:

“We must continue to address deficiencies in colleges causing concern, those that independent inspection identifies as failing their students and the wider community. But those represent a small and declining minority. We have a much wider challenge to lever up standards across the great bulk of the colleges that have middling inspection grades, retention and achievement rates against the benchmarks that the FEFC has now established for them. We need together to embed a culture of continuous improvement. To recognise that what was ‘satisfactory’ last year will be ‘barely satisfactory’ next, not to discourage our staff by always complaining that the glass is half-empty, but to challenge and support them to do better. The fact that this is the general message across the UK economy does not make it any less applicable to those of us with responsibility for further education.” (DfEE, 2000, p24)

To address these issues, the LSC has instructed colleges to implement an annual self-assessment exercise and, from that, to agree areas for improvement and targets with their LLSC. Colleges are required to produce a development plan that sets out how the provider aims to achieve excellence, and the provider is also required to monitor continuously the implementation of that plan (LSC, 2001a). The self-assessment and the quality of classroom delivery are then inspected by OfSTED and/or ALI to determine the quality of provision in relation to the plan. Inspection under the new arrangements is governed by the Common Inspection Framework (OfSTED, 2001a), which describes the main purposes of inspection as being to:

“Give an independent public account of the quality of education and training, the standards achieved and the efficiency with which resources are managed. Help bring about improvement by identifying strengths and weaknesses and highlighting good practice. Keep the Secretary of State, the Learning and Skills Council for England and the Employment Services informed about the quality and standards of education and training. Promote a culture of self-assessment among providers, leading to continuous improvement or maintenance of a very high quality and standards.” (OfSTED, 2001a)

In terms of what is evaluated and reported:

“Inspectors will focus primarily on the experiences and expectations of individual learners through the evaluation, as applicable, of what is achieved, the standards reached and learners’ achievements, taking into account their prior attainment and intended learning goals. The quality of teaching, training and learning. Other aspects that contribute to the standards achieved, such as the range, planning and content of courses or programmes; resources and the support for individual learners. The effectiveness with which the provision is managed, its quality assured and improved, and how effectively resources are used to ensure that the provision gives value for money. The extent to which provision is educationally and socially inclusive, and promotes equality of access to education and training, including provision for learners with learning difficulties or disabilities.” (OfSTED, 2001a)

The response of the Government and the LSC to poor inspection grades, like some of those recorded in the pilot inspections, was that providers identified as causing concern would be subject to rigorous performance reviews (LSC, 2001b). The aims of performance review would be to:

“Assess provision and help ensure that all provision is of at least an acceptable standard. Confirm that there is appropriate observance of the LSC’s responsibility in respect of Health and Safety and equality and diversity legislation. Take regular stock of provider’s performance and identify any action required to bring about improvement. Promote continuous improvement and raise standards. Identify excellence so it can be recognised and disseminated across post 16 provision. Identify concerns, take action to strengthen provision in difficulty and ultimately apply sanctions if necessary. Inform the process of strategic planning across post 16 provision.” (LSC, 2001b).

The sanctions that might be applied to failing curriculum have been suggested to be the removal of funding for that curriculum and the transfer of students to other providers, or the replacement of management with new managers (DfEE, 1999). The literature would suggest that the LSC, OfSTED and ALI intend to take an aggressive stance towards the measurement of curriculum quality, and this was confirmed by the Chief Executive of an LLSC. He said:

“Well I think the first thing I would like to say is that ultimately it is for the colleges to manage quality. I think it’s very important that the LSC doesn’t take away that responsibility from colleges at local level. We’re close, but we are not the colleges themselves. They have to be concerned about quality, not least since we cannot afford for people to be turned off education, by not getting a good experience, because often if people have a poor experience, then they don’t come back to learning. We’ve worked hard at trying to get people back into learning, who perhaps haven’t been in learning for a long time. It is therefore important, that when they do get that experience, that it’s a good one, a powerful one, a positive one, that will make them want to be part of lifelong learning and not make them want to run a mile.”

He continued:

“Our approach, so that’s the first point I want to make about quality. The Second point I’d make about quality is that it is absolutely the heart of everything that we do, and the discussion we’ve already had this morning. I think I’ve mentioned the importance of quality and the different ways quite frequently. And we have this quarterly process, of assessing how colleges and providers, all colleges and providers are doing, and it is proving quite a powerful tool. Letters went out from here last week, to all of the providers for whom there was any concern at all, and is therefore making good quality much, much higher up the agenda. But as the letters have gone out say: “Look, this is our assessment, we want to work with you, in order to remedy these problems, by the next time we have a report”. I think therefore quality really underpins the relationship that we have with the providers and colleges; it feels like, well yes how we actually manage. Well, we’ll manage it by putting quality on every agenda, at every meeting we have with them really.”

From the point of view of the Government, however, in addition to the quality of FE provision, the concept of cost effectiveness is also an important factor, a point that was emphasised by the DfES Executive:

“Well the absolute basis for quality initiatives is cost effectiveness. Now this sounds like the opposite and in some ways it is, but if you’re actually looking to save money, or to be more cost effective, the easiest way for any provider of anything, is to save money, is to reduce the quality of the product. So if you’re actually intent on being cost effective, you have to introduce quality controls simultaneously, to make sure one is not traded off against another. So when, and that’s been established now in education for some time, and the development of the inspectorate hand in hand with the development of the funding methodology. If you look at a situation where a new Government provides further money for education. What they don’t want to do, is to simply say, well, we’ll increase everything by five per cent pro rata when quality is a major issue, and the money is therefore parcelled up, based on an improvement in quality.”

In measuring cost effectiveness and quality initiatives, the Government would appear to favour quite mechanistic measurement tools, with an emphasis on statistics rather than on the individual stories of the students. A college manager described his experiences of the tension between quantitative measurement of quality and a more qualitative approach, which OfSTED appeared (at least on paper) to prefer (OfSTED, 2001a). The college manager concluded:

“I think one of the problems is that the whole of the sector has just come back to the curriculum thing, [it’s] lost curriculum focus I think and it’s got really, where it is driven by mechanistic systems really. It’s driven by all sorts of measuring devices and fairly crude indicators. I think there’s a tension. There seems to be tensions all throughout the system. I think that’s going to continue in the kind of LSC mode, LSC philosophy. But on the other hand, there is an increasing emphasis through things like QAA, and the student experience.”