Study support programmes: What is the research evidence for effective practice?

Keith Mason – National Foundation for Educational Research in England and Wales

Paper presented at the European Conference on Educational Research, Edinburgh, 20–23 September 2000

Abstract

Study support throughout the UK is currently booming. Many schools in England, Wales, Northern Ireland and Scotland are looking to set up study support programmes or extend existing ones using funding that is now available from the New Opportunities Fund (NOF) of the National Lottery. Since 1997, the National Foundation for Educational Research (NFER) has conducted several pieces of research into study support initiatives. This work has ranged from carrying out surveys to audit provision in schools, to evaluating new study support initiatives operating either during term-time or the summer holiday. So how are study support schemes being monitored and evaluated with regard to the impact made on participants? This paper highlights the quantitative and qualitative methods used to monitor and evaluate the effectiveness of schemes, ranging from those methods employed by individual schools to instruments used with national initiatives. What is being measured, what should be measured, and is study support making any difference?

Introduction

In the UK, most schools have a long tradition of providing their pupils with out-of-school-hours learning activities. The most common types of such activities are team sports, creative arts (particularly drama and music), homework clubs and examination revision sessions. In recent years, the Government has placed a new emphasis on this type of activity under the banner of ‘study support’. The concept of study support is wide-ranging and encompasses many different types of activity, but the concept differs from that of earlier years. A recent Department for Education and Employment (DfEE) publication defined study support as follows:

Study support is learning activity outside normal lessons that young people take part in voluntarily. Study support is, accordingly, an inclusive term, embracing many activities – and with many guises. Its purpose is to improve young people’s motivation, build their self-esteem and help them become more effective learners. Above all, it aims to raise achievement.

(GB. DfEE, 1998)

The main difference in the concept of study support is that in place of a somewhat ad hoc approach to provision, dependent on the goodwill of individual staff members, the emphasis is now on providing a coherent programme of activities that addresses pupils’ needs and supports classroom work. The target of the UK Government is to establish study support learning opportunities in at least half of all secondary and special schools and a quarter of all primary schools by 2001. The expansion is to be funded partly through the New Opportunities Fund (NOF) of the National Lottery. Since April 1999, NOF has awarded grants totalling over £40 million to study support projects in England, Wales, Northern Ireland and Scotland. A particular focus is on disadvantaged areas, with schools working with partners from the voluntary, public and private sectors. In addition, the Department for Education and Employment is providing £20 million this year for study support activities and a further £60 million in 2001 from the Standards Fund.

One of the largest growth areas in study support in England since 1997 has been the provision of summer schools of various types (Summer Literacy Schools, Summer Numeracy Schools, Study Support Summer Schools, Gifted & Talented Pupils Summer Schools, Higher Education Summer Schools). In fact, the Summer Literacy Schools initiative was one of the first educational initiatives of the UK Government elected in May 1997. The number of summer schools in Wales, Northern Ireland and Scotland is now beginning to increase dramatically.

Whether activities are funded through the New Opportunities Fund, the DfEE’s Standards Fund, schools’ own budgets or donations and sponsorship, a survey conducted this year indicates that study support activities are provided by 97 per cent of schools in England, and that provision has increased in over two-thirds of schools in the last two years. A typical primary-aged pupil spends nearly two hours a week on such activities, rising to three hours at secondary school, and over half of all schools plan to introduce further activities.

Study support activities include:

·  homework clubs and revision sessions

·  help with key skills

·  sports, games and outdoor activities

·  creative arts (e.g. music, dance, drama)

·  clubs enriching or extending curriculum subjects

·  opportunities to pursue particular interests (e.g. archaeology, languages)

·  opportunities for voluntary activities and community service

·  residential events, such as study weeks

·  mentoring by adults or pupils

·  learning about learning (thinking skills, accelerated learning)

Since 1997, the National Foundation for Educational Research (NFER) has conducted several major pieces of research into study support. This work has ranged from carrying out surveys to audit provision in schools, to evaluating new study support initiatives operating either during term-time or the summer holiday. The NFER is currently evaluating the New Opportunities Fund out-of-school-hours learning programme in England, Wales, Northern Ireland and Scotland; a programme that continues for another three years.

Aims of study support projects

The aims of projects, whether involving single schools or groups of schools, range from the very general to the very specific. Typically, projects aim to raise levels of achievement; extend and enrich the curriculum; provide specialist facilities and a safe learning environment; and raise pupils’ confidence and self-esteem.

Aims relating to academic achievement often refer to key skills in literacy, numeracy and Information and Communications Technology (ICT), and performance in national examinations and National Curriculum tests. Many projects focus specifically on ICT skills or use ICT within a broader range of activities.

Projects wishing to extend or enrich the school curriculum do so by introducing new topics and subjects, and by including non-academic subjects. Some projects target the ways pupils work, with aims designed to encourage collaborative learning, sometimes across year groups or between schools, independent learning and study skills.

Some projects focus on widening access to support and facilities for children and young people from deprived areas. The emphasis here is on promoting equality of opportunity by providing an appropriate place, time, specialist facilities and support to enable pupils to complete their homework and coursework. Many such projects are based at non-school sites such as youth centres and community libraries. Disaffection with school is addressed by aims which seek to improve motivation, confidence, self-esteem, attitudes to learning, aspirations and expectations. Many of these study support projects employ professionals other than teachers to provide an input into activities, such as youth workers and local business employees working as mentors.

A common type of project throughout the UK is one that involves a partnership of a secondary school with a number of its feeder primary schools. The principal aim of these projects is to help ease the transfer of pupils about to join the secondary school by providing a number of ‘taster’ activities at the secondary school.

Monitoring and evaluation methods

The monitoring and evaluation methods used by individual study support projects naturally take into account various features of the project as well as the intended aims. Some of these projects may be very small, involving perhaps ten pupils, whereas others may be large, involving several hundred young people. The range of qualitative and quantitative methods used to monitor and evaluate the effectiveness of projects includes:

·  using attendance data

·  using performance data

·  using attitudinal data

·  participant review/self-assessment

·  provider review

·  parent review

·  provider assessment of product

·  measurement against targets

·  tracking pupils throughout their schooling

The wider picture

In England, Summer Literacy Schools were introduced as a pilot scheme in the summer of 1997 in an attempt to improve pupils’ literacy skills at the age of 11, the time of transfer from primary to secondary education. The pupils involved were those who did not meet the national standard of level 4 in English, and so were in the lowest 30 per cent or so of pupils in their year group. The initiative was evaluated by collecting pupil results from a national test taken in May and comparing these with the results of a similar test administered in September. The two tests were equated so as to be on the same metric, with the total marks available for each test being 50. Results for a control group who had not attended summer schools were also analysed.

Table 1  Mean equated scores for pre- and post-test of Summer Literacy
         Schools and control group in 1997

                     Pre-test        Post-test       Difference
                     Mean    SD      Mean    SD      Mean    SD
    Summer schools   17.2    8.1     14.2    9.0     -3.0    6.9
    Control group    20.4    9.8     17.4    10.5    -2.9    7.0

The analysis revealed that the scores of both groups (the summer schools group and the control group) declined significantly between the pre-test and the post-test. There was no significant difference between the two groups in the extent of the decline. It was apparent that the Summer Literacy Schools had no short-term impact on children’s reading skills. The ‘quick-fix’ solution of providing an additional 50 hours of literacy tuition condensed into a two-week period during the summer holiday could not be considered successful in raising standards of reading for these children. Evidence is beginning to emerge from other study support initiatives that a significant impact can be made on performance through 50 or so additional hours, but it appears that this amount of time needs to be spread over an extended period.
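The group comparison reported above can be illustrated as an independent two-sample (Welch’s) t-test on the pupils’ gain scores, computed from the summary statistics in Table 1. Note that the group sizes used below are hypothetical, chosen purely for illustration, since the sample sizes are not given here.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for two independent groups, from summary statistics."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Mean gain (post-test minus pre-test) and SD of the gain for each group,
# taken from Table 1; the sample sizes (1000 per group) are hypothetical.
t = welch_t(-3.0, 6.9, 1000, -2.9, 7.0, 1000)
print(round(t, 2))  # -0.32: |t| is well below 1.96, so no significant difference
```

Even with a thousand pupils per group, a 0.1-mark difference in mean decline against gain SDs of about 7 is nowhere near significance, consistent with the finding reported above.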

However, the NFER evaluation of Summer Literacy Schools also looked at pupils’ attitudes to reading and specific reading skills, and it was found that attitudes to reading had improved significantly. An argument may be made that positive changes in pupils’ attitudes to reading, or to any curriculum subject for that matter, may lead to significant changes in performance in several years’ time. As yet there has been no longitudinal study, looking at patterns of performance in relation to marked changes in pupil attitude through study support provision, to give credence to or to refute this argument.

The impact of study support initiatives on pupils’ attitudes, within specific subjects and to learning in general, has subsequently received a greater emphasis. For instance, the Study Support Summer Schools held in 1999 as a pilot scheme, essentially to inform the New Opportunities Fund out-of-school-hours learning programme, used pre- and post-course attitude questionnaires with participants as part of the evaluation conducted by the NFER.

Each questionnaire was divided into four main sections, plus a further section designed to gather information about individual participants and open-ended responses about the summer school. The main sections were:

Section 1 About Reading 8 statements

Section 2 About Maths 8 statements

Section 3 About Your School Work 11 statements

Section 4 About the Summer School 5 statements

Responses were sought to the 32 statements in each questionnaire through the use of a five-point scale: strongly agree, agree, not sure, disagree, and strongly disagree. The same statements were asked in the pre- and post-course questionnaires (with slight amendments for the post-course questionnaire to make the statements relevant). A total of 1,409 pupils completed both pre- and post-course questionnaires in 22 summer schools.

Mean scores for each statement for the whole sample, and for groups of participants, were calculated by assigning +2 to ‘strongly agree’ responses, +1 to ‘agree’ responses, 0 to ‘not sure’ responses, -1 to ‘disagree’ responses and -2 to ‘strongly disagree’ responses. The difference between the pre-course mean and the post-course mean is regarded as the impact of the summer schools with regard to that particular statement. The level of significance between each pair of means was calculated using paired-sample t-tests. Analyses were carried out for the whole sample, and for groups of participants (e.g. males and females separately). Table 2 below shows the three statements in the About Reading section that showed a significant difference between pre- and post-course mean score for all participants.
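The scoring scheme and paired-sample t-test described above can be sketched in a few lines of Python. The pupil responses below are invented, purely to illustrate the calculation.

```python
import math

# Likert coding used in the evaluation: strongly agree = +2 ... strongly disagree = -2
SCALE = {'strongly agree': 2, 'agree': 1, 'not sure': 0,
         'disagree': -1, 'strongly disagree': -2}

def mean_score(responses):
    """Mean attitude score for one statement across a group of participants."""
    coded = [SCALE[r] for r in responses]
    return sum(coded) / len(coded)

def paired_t(pre, post):
    """Paired-sample t statistic for pre- and post-course scores of the same pupils."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Invented coded responses for six pupils on one statement
pre  = [1, 0, -1, 1, 0, 0]
post = [2, 1,  0, 1, 1, 0]
print(mean_score(['agree', 'not sure']))   # 0.5
print(round(paired_t(pre, post), 2))       # 3.16
```

With the real data the same statistic would be computed per statement over the 1,409 matched questionnaires, and compared against the t distribution to obtain the significance levels reported in Table 2.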

Table 2  All participants – ‘About Reading’ statements that showed a
         significant difference between pre- and post-course mean score

    STATEMENT                                        PRE-COURSE     POST-COURSE
                                                     MEAN SCORE     MEAN SCORE
    I like reading stories. (N = 1408)                  0.83           0.90
    I think I am a good reader. (N = 1398)              0.85           0.94
    I like watching TV better than reading
    books. (N = 1398)                                   0.71           0.49

In order to obtain a clearer picture of the questionnaire outcomes, it was necessary to gather together statements addressing related attitudes. This was done by means of a factor analysis. Factor analysis is a statistical technique that seeks out groups of related statements by identifying patterns of similar response, and replaces them with a smaller number of new variables or factors. A factor analysis was performed on each of the four main sections of the questionnaires, each analysis using the combined pre-course and post-course results. This enabled factors to be extracted, and then factor scores to be computed. The factor scores for each analysis together have a mean of 0 and a standard deviation (SD) of 1. The differences between factor scores show the relative changes in attitudes between specific groups pre-course to post-course. Negative factor scores do not indicate negative attitudes; rather, they indicate that the overall attitude of the group of students was lower than that of all participants. Tables 3 and 4 below show, respectively, the statements that group into the Usefulness of School Work factor, and pre- and post-course means and standard deviations on this factor by gender.
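As a simplified sketch of the extraction and scoring steps described above, the following extracts a single factor by power iteration on the correlation matrix of coded responses (a PCA-style extraction, not necessarily the exact method used in the evaluation) and standardises the resulting factor scores to mean 0 and SD 1. The response data are randomly generated, for illustration only.

```python
import math
import random

def first_factor_scores(data):
    """Extract the first factor from a respondents-by-statements matrix of
    coded Likert scores; return factor scores standardised to mean 0, SD 1."""
    n, k = len(data), len(data[0])
    # Standardise each statement (column) to mean 0, SD 1
    z = []
    for col in zip(*data):
        m = sum(col) / n
        sd = math.sqrt(sum((x - m) ** 2 for x in col) / n)
        z.append([(x - m) / sd for x in col])
    zrows = list(zip(*z))  # back to respondents-by-statements
    # Correlation matrix of the k statements
    corr = [[sum(z[i][r] * z[j][r] for r in range(n)) / n for j in range(k)]
            for i in range(k)]
    # Power iteration: converge on the loadings of the dominant factor
    v = [1.0] * k
    for _ in range(100):
        w = [sum(corr[i][j] * v[j] for j in range(k)) for i in range(k)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Project each respondent onto the factor, then standardise the scores
    raw = [sum(v[j] * row[j] for j in range(k)) for row in zrows]
    m = sum(raw) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in raw) / n)
    return [(x - m) / sd for x in raw]

# Invented coded responses: 40 pupils answering five related statements
random.seed(1)
data = [[random.randint(-2, 2) for _ in range(5)] for _ in range(40)]
scores = first_factor_scores(data)
print(round(sum(scores) / len(scores), 6))  # mean 0 by construction
```

Group comparisons such as those in Table 4 then amount to averaging these standardised scores separately (e.g. for males and females, pre- and post-course) and testing the differences for significance.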

Table 3 Statements that group into Usefulness of School Work factor

Factor: USEFULNESS OF SCHOOL WORK

On the whole school work is worth doing.
Last term I was excited about learning lots of new things.
I made a big effort to do well in lessons last term.
Last term’s school work was really interesting.
Last term I did really well with my school work.
(NB Only pre-course statements given here.)

Table 4  Gender – Pre- and post-course means and standard deviations
         for factor Usefulness of School Work

    FACTOR: Usefulness of School Work

                Pre-course          Post-course        Significant
                Mean      SD        Mean      SD       at 5% level
    Males       -0.16     0.99      0.25      1.02     Yes
    Females     -0.21     0.98      0.13      0.95     Yes

Local projects