How do teachers respond to data?

Pete Dudley, Senior Adviser (School Development)

Essex Learning Services Directorate

Paper presented at the British Educational Research Association Annual Conference

(September 11-14 1997: University of York)

Draft not to be quoted without the author’s permission

Correspondence:

Pete Dudley

Senior Adviser, School Development,

Essex School Improvement and Learning Support Service,

PO Box 47, A Block, County Hall,

Chelmsford, CM1 1DT

If primary schools ever were data-free zones, the recent proliferation of data collection, imposed by national curriculum requirements or self-generated in the earnest pursuit of improvement, has ensured that many schools are rapidly approaching data saturation point. Understanding how teachers respond to data, and the possible messages data convey, is vital if programmes such as EPSI are to be carried forward.

Why can pertinent, well analysed and effectively presented school data trigger sustained self improvement in one school and result in indifference, paralysis or burial in another? This paper examines the initial responses of teachers to pupil perception data and identifies ways in which those working with schools towards self improvement can help promote reflection and action.

Central to the success of the strategies proposed in ‘Excellence in Schools’ (DfEE, 1997) is the ability of schools to use pupil data to boost achievement.

‘The use within a school of reliable and consistent performance analyses enables teachers to assess progress by their pupils and to change their teaching strategies accordingly.’ (p. 27)

The white paper indicates that a key measure of an LEA’s effectiveness will be its ability to:

‘provide clear performance data that can be readily used by schools’ (p. 27)

to set targets for improvement in pupil achievement.

This paper draws on experience of work with schools in the EPSI programme, which underscores the importance of pupil data and local comparative benchmarks in promoting change in teaching strategies. However, it challenges two major assumptions made by the white paper: firstly, that ‘performance’ data are always the most effective data for impacting upon teaching strategies; secondly, and most critically, that provision of ‘clear performance data’ alone will be sufficient to trigger changes in teaching.

Are performance data always the most effective data in promoting changes in teaching strategies?

It is already clear from the EPSI programme that primary schools are making increased use of pupil data in planning evidence-based improvement (Southworth, 1997). It is clear, furthermore, that pupil perception data have played a critical role in challenging and changing teacher perspectives of their pupils’ learning experiences (Sebba and Loose, 1997). One EPSI research school warned the EPSI programme pair at the outset that attitudes would be hard to change:

‘..most of us have been here since the school opened 30 years ago. .we’re a very experienced staff .. you’ll find people quite resistant to change..’.

But some time after encountering pupil perceptions gathered through interview (See appendix) by the programme pair, a senior member of staff reflected:

‘I was professionally hurt by their perception that they have to get through the boring work in order to get on to the interesting stuff’.

This senior member of staff subsequently played a lead role in attempting to challenge attitudes to change in classroom practice.

It is not clear whether pupils’ perceptions of their learning have played such an important role in triggering change in EPSI schools because they are a new form of ‘data’ which many teachers have never before confronted in such a formal way, because in viewing the programme as ‘research’ teachers feel they will be involved and beneficiaries[1], or because teachers feel particularly close to pupil perception data. Instinctively aware of the power of affective factors in motivating learning and achievement, teachers perceive such data as something they can and need to respond to readily, changing the way pupils feel about their learning in order to increase motivation through what Pollard (1996) has described as better

‘structuring of affective and intellectual support in the zone of proximal development’ (Pollard, 1996 p.97). (My underline)

It is clear that pupil perception information is an important data set. Where such data can be gathered and fed back to schools by external consultants, the data seem to play an important role in triggering change. This is, however, an expensive method of gathering such data and often gathers only the perceptions of a small sample of pupils.

Currently the only formal processes which systematically require the gathering and use of pupil perception data are Ofsted inspections (Ofsted, 1995, p. 60) and Special Educational Needs annual reviews (DfE, 1994). Ofsted inspections happen only once every four to six years; yet if pupil perceptions can provide such triggers and insights into school improvement, they need monitoring more frequently and in line with school improvement and development cycles. Inspectors use pupil perceptions to inform their judgements but are not required to report on pupil perception data as a distinct category in the way parent perceptions are gathered, analysed and reported.

The Special Educational Needs annual review process samples the perceptions of only a small percentage of pupils. Neither of these processes provides any comparative data for schools.

The question must be, then, whether LEAs, whose task it is ‘to challenge schools to raise standards continuously and to apply pressure where they do not’ (DfEE, 1997, p. 27), need to find ways of gathering, analysing and providing pupil perception data as one strand of the ‘clear performance data that can be readily used by schools’ (p. 27).

Other approaches to pupil perception data

Schools in the wider EPSI research network have not had the same level of programme pair support as have the research partner schools. Consequently, they have gathered pupil perceptions from whole cohorts using a survey method (Dudley, 1996; see appendix 2). The survey is similar to those used by Keele University (Keele, 1994) and the ISEP project (ISEP, 1996; Thomas, 1996), although the Learning Perception Survey is played to Year 2 pupils on an audio tape in order to give pupils at an early stage of reading development access to the written items.

Such surveys are becoming widely used in the secondary phase:

‘These instruments have now been used in a wide range of schools nationwide.. There is a growing demand for them across the country..’ (Barber et al., 1994, p. 8).

The Learning Perception Survey provides schools with comparative data from other schools whose identities are coded. Thomas (1996) has indicated that provision of certain comparative pupil perception data may prove fruitful (p. 8). The intention in Essex has been to promote the potential for schools to ask the question ‘How are we doing in comparison with similar schools?’ (DfEE, 1996) in terms of pupil perceptions, and consequently to promote use of the data to promote change, as Barber et al. observed of the Keele Survey of Pupil Perceptions of School Life (Keele, 1994):

‘Schools which participate in the surveys are able to see how their school relates to national averages....These results enable schools to refine their improvement strategies and target resources appropriately.’ (Barber et al., 1994, p. 8)

However, a study into how teachers initially respond to such survey information (Dudley, 1997) highlights the difference in teacher responses where the perception data are generated by an impersonal instrument such as a survey. It also raises more general issues over how teachers respond to quantitative data analyses which may prove important in developing the capacity of LEAs and schools to use pupil data effectively in school improvement.

Will provision of clear performance data alone be sufficient to trigger changes in teaching?

First impressions are powerful. First impression reactions of groups of teachers to Learning Perception Survey data in four schools were recorded as they read the data in groups for the first time. Transcripts were analysed to ascertain the degree to which the data might begin to trigger speculation or reflection which might later result in further investigation of an issue and changes in practice.

Good news data and bad news data

Although there was often a clear pattern in pupil responses to items, each school had positive responses by the pupils (good news) as well as negative responses (bad news), both in terms of the percentage of pupils responding negatively to an item and in terms of how the responses of pupils in one school to an item compared with those of pupils in other schools to the same item.

The teachers’ responses to the data from each item fell into three phases:

1. Initial reaction (emotional)

‘Now look at that, that’s great, and again we’ve come out tops’

‘Oh dear’..’ I mean I feel from my point of view that that’s quite devastating’

2. Judging (sizing up) reaction

‘Mmm.. let’s compare it to..’

‘It’s interesting, you know, the differences between the females and the males..’

3. Initial determination

‘and also, I mean, very strongly the boys feel we don’t listen to them’

‘because if you’re going to do this then it’s got to inform how you behave with them’

My original assumption had been that ‘good news’ prompting a positive emotional reaction would be much more likely to prompt an acceptance of the issue, and that bad news would tend to prompt a rejection. In following this line of enquiry, four response categories were initially identified from the data. They were characterised by the ways in which they combined critical or uncritical acceptance or rejection of the issue behind the data with an action-oriented outcome to the discussion.

1. active, critical acceptance of the issue behind the data

2. passive, uncritical acceptance of the issue in the data

3. passive, uncritical rejection of the issue within the data

4. active, critical rejection of the issue within the data

A fifth category (see ‘Pending’ below) was added to further clarify active but less critical acceptance.

For the provision of data to schools to be a valid activity it must above all have validity of ‘consequence’ (Messick, 1989, p. 10): it must be valid in the sense that people should, and do, act upon the basis of the activity. The initial responses of the teachers to the pupil data, in terms of the degree to which they provoked action-oriented responses, fell into five categories. The categories were given names which likened them to ways in which items of mail, good news or bad news, might initially be sorted. They were:

1. ACTION RESPONSE

2. PENDING

3. FILE

4. BIN

5. RETURN TO SENDER

1. Action response (19 examples)

This category is associated almost equally with ‘good news’ data and ‘bad news’ data. Items which fell into this category were often discussed at length. The implication behind the data is put under pressure and challenged. Past and current experience are brought to bear on the discussion and hypotheses are raised as to its meaning.

Consider, debate, reflect

‘I would have expected more to say ‘Yes I do worry’ than have come out here...I had quite a few children who said ‘Oh they’re being nasty because my work was too good’

Compare with other information and suggest strategies

‘Where I was before they used to have a sharing book as well as a reading book so that every night each child had a book to read in bed and that’s something we can..’

‘it would be interesting to find out what strategy [another school in the chart is using] .... it would be interesting to find out exactly what they were doing next wouldn’t it’

‘it would be very interesting to compare it to..I wonder what they’re offering that we’re not?’

The eventual outcome gives an indication that the matter will not rest there, and some tangible notion of how the matter may be taken up will emerge through the discussion. The decision may be to make further investigation in school, to find out about another school or to make changes in their own school. Clearly, it cannot be assumed that indicating an intention at this stage will necessarily result in action, but it is more likely to do so than where no intention has resulted.

2. Pending (19 examples)

Items in this category are also discussed at some length and subjected to similar reflection, pressure and challenge. This category results three times more frequently from good news data than from bad news data.

Consider, debate, reflect

‘Quite a few of the ones I picked up, they’d put that they didn’t like their work being shown because they felt nervous I suppose of the reaction of children. So I mean it’s something like with the ethos of the school you can really work on can’t you....I suppose really you’re never going to get the situation where they feel totally comfortable. I mean if you imagine yourself in a situation with a group of people you knew really well you’d still have that little feeling that maybe someone is thinking this is a bit silly or whatever..’

Compare - but stop short of suggesting strategies

‘I wonder what impression they do get then. Do we seem terribly busy to them? If they say that only half of them - about 50% of them that we don’t have time to listen to their questions ...... J What’s on the yellow (comparative) sheet. That’s interesting isn’t it. Makes you stop and think doesn’t it.’

The issue behind the result is accepted but there is a less obviously action-oriented outcome to the discussion. The matter is left unresolved but definitely features as an issue in the context of the whole discussion.

3. File (29 examples)

The file category is by far the most common, and is characterised by acceptance of the issue behind the result without any real debate. In around 80% of examples it results from ‘good news’ data, and it is associated with bad news data in only 10%. There is a very small percentage of low scores associated with this category. (See Chart 1)

Welcome or accept without reflection, debate or question

- ‘That’s good’.

- ‘and look at the one at the back as well’. (comparative page)

- ‘Mmmmm’.

The item may only be discussed briefly or cursory attention be paid to it. It does not provoke much reflection, hypothesis or explanation. In some instances the discussion does not progress beyond the judging reaction phase when the group begin to ‘flick’ through the data quite quickly. On other occasions when the news has not been so welcome, the data may come close to being rejected or only grudgingly accepted with a rationalising caveat. Another feature of the ‘file’ category is the potential to let the data mould one’s view of the situation - to acclimatise to it - rather than to view it critically.

4. Bin (8 examples)

There is a greater association with bad news data than in the earlier three categories. The discussion tends to be short. The message behind the data is not accepted. The emotional reaction tends to be neutral or ‘not surprised’ - finding the outcome predictable.

Generalisation or rationalisation

- ‘That doesn’t surprise me really. Does it you?

- ‘No.

- It doesn’t really. I don't imagine they think ‘Oh that’s interesting. When I get to school tomorrow I will tell...’

- No

- I don’t think they would’

Rejection of method

- It just reflects the time of year that it [the survey] was done.

The discussion may result in generalisation, rationalisation or a rejection of the method used to gather the data.

5. Return to sender (9 examples)

This occurred infrequently. It is also associated with bad news data more than good news data.

Questioning the value of pupil perceptions (even though they have been accepted on most other items)

- Because I wonder if children can really be objective about when they get told off or when somebody else is getting told off? I know, you know..they’re possibly..

- You think they’re just a bit too young maybe?

The category is characterised by a more sustained debate of the issue behind the data than in either the bin or the file category, but results in either an active dismissal or denial of the issue. This can be attributed to:

- factors perceived to be beyond the range of the teachers’ or school’s control,

- a questioning of the value of pupil perceptions or feelings on the matter,

- or some aspect of the method of the survey.

Chart 1 above sets out the incidence of emotional responses in the emotional-reaction discussion phase for each of the five response categories. ‘Happy’ and ‘unhappy’ were mutually exclusive, as were ‘surprised’ and feeling that the data were predictable.

Significantly:

1. an action response is only ten percent more likely to be generated by good news data than by bad news data;

2. action responses are not associated particularly strongly either with surprises or with data felt to be predictable;

3. the message behind the data is ‘filed away’ when it is perceived as good news, and is not challenged or reflected upon;

4. data are likely to be ‘binned’ when they are seen as predictable bad news.

Where points one and two perhaps have positive implications for the development of the use of pupil data, points three and four present problems. Point four is unsurprising but highlights the need for strategies to help create a positive action response to predictable bad news data. Point three raises the difficulty of generating a positive, action-oriented response to improving on a situation which is perceived as not being problematic. It is natural to look for problems within data, but if the notion of teacher as researcher is to become a reality, critical examination of what seems to work must become as much second nature as investigating what seems to be going wrong.
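Findings such as these rest on a simple cross-tabulation of the coded discussion items by response category and by news valence. The sketch below, in Python, illustrates that tallying. The category totals mirror the counts reported in this paper (19, 19, 29, 8 and 9 examples), but the good news/bad news splits within each category are illustrative approximations only, not the study’s actual coding.

```python
from collections import Counter

# Hypothetical coded items: (response_category, news_valence).
# Category totals follow the paper; the valence splits are illustrative only.
coded_items = (
    [("action", "good")] * 10 + [("action", "bad")] * 9 +
    [("pending", "good")] * 14 + [("pending", "bad")] * 5 +
    [("file", "good")] * 23 + [("file", "bad")] * 3 + [("file", "mixed")] * 3 +
    [("bin", "good")] * 2 + [("bin", "bad")] * 6 +
    [("return", "good")] * 2 + [("return", "bad")] * 7
)

# Tally items by (category, valence) pair.
tally = Counter(coded_items)

def share(category, valence):
    """Proportion of items in a category carrying a given news valence."""
    total = sum(n for (cat, _), n in tally.items() if cat == category)
    return tally[(category, valence)] / total if total else 0.0

# Summarise each category's association with good and bad news.
for cat in ("action", "pending", "file", "bin", "return"):
    print(f"{cat:8s} good news: {share(cat, 'good'):.0%}  "
          f"bad news: {share(cat, 'bad'):.0%}")
```

With these illustrative counts the ‘file’ category comes out at roughly 80% good news and 10% bad news, matching the proportions described above; the same tallying applied to the real coded transcripts would yield the figures summarised in Chart 1.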

Factors associated with action oriented responses.

A number of common issues emerged from the data. These were examined firstly to establish their precise nature, and secondly to find out whether they contributed to, or at least were associated with, the likelihood of items being responded to critically and positively, or being binned. These were: