4. Analysis of quantitative data

4.1 Project classes and ‘controls’

A summary of the modes of working of the project classes and ‘controls’ is shown in Table 4.1; the four project classes are A, B, C and D.

School    | Class   | Y10 Exams/Work | Y11 Exams/Work
----------|---------|----------------|---------------------
A         | A       | GCSE           | AS
B         | B, Bx   | -              | GCSE + FSMQ
C         | C, Cx   | GCSE           | FSMQ and some resit
D         | D       | -              | GCSE + Extra topics
‘Control’ | a, b, c | -              | GCSE
‘Control’ | d       | GCSE           | FSMQ or AS or resit

Table 4.1: Modes of examination and work (‘pathways’)

In the four project schools there were in all six classes from which data were collected – the four project teachers’ classes (A, B, C, D) plus two parallel classes (Bx, Cx) whose teachers were not part of the project and did not attend any project meetings, but were following the same pathway. However, there was – quite naturally – interaction between those two teachers and their corresponding project colleagues. To a very limited extent these classes might be seen as ‘controls’.

An additional school was contacted before the project began and agreed to administer the Student Questionnaires to their four Y11 ‘top sets’ (a, b, c, d), one of which (d) had entered GCSE the previous year and were in Y11 following a mixture of AS, FSMQ and resits. Once again, to a very limited extent these classes might be seen as ‘controls’.

Where possible, data were collected from the schools about the GCSE performance of any parallel classes, and about the previous year’s results of comparable classes. This information proved surprisingly difficult to obtain from two schools, partly because the project teachers did not have it to hand and did not find the time to seek it out. Eventually it was obtained, in one case through the head of department.

4.2 Note on drawing inferences from the data analysis

It would be unwise to attempt to make definitive comparisons or draw inferences for the wider population from the data from this project. Each school, each teacher, each class and each pathway was different in many ways. Although in two of the schools there was a parallel class, they could hardly be seen as independent ‘controls’. Likewise, although a large amount of data was gathered from four classes in a school outside the project, it would be wrong to claim this to be a real ‘control’ school.

Results which may appear significant, and indeed which would be calculated to be statistically significant (e.g. using a multi-dimensional chi-square test on cross-tabulated data), might well not generalise to a wider population or to other circumstances. There are just too many uncontrollable, unquantifiable and maybe unknown variables. For this reason we have avoided reporting statistical significance. Our best hope is that teachers in other schools where circumstances seem similar to those in our project schools will feel able to draw their own tentative conclusions and then conduct their own ‘experiments’ to verify them.
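To illustrate the kind of calculation we chose not to rely on, the Pearson chi-square statistic for a cross-tabulated (contingency) table can be sketched as below. This is a minimal sketch only; the example table is invented and does not come from our data.

```python
import numpy as np

def chi_square_statistic(table):
    """Pearson chi-square statistic for a 2-D contingency table.

    Expected counts assume independence of rows and columns:
    expected[i, j] = row_total[i] * col_total[j] / grand_total.
    """
    obs = np.asarray(table, dtype=float)
    row_totals = obs.sum(axis=1, keepdims=True)
    col_totals = obs.sum(axis=0, keepdims=True)
    expected = row_totals * col_totals / obs.sum()
    return float(((obs - expected) ** 2 / expected).sum())

# Hypothetical cross-tabulation: two classes (rows) by
# progressed / did not progress to Y12 (columns).
example = [[14, 8], [7, 22]]
stat = chi_square_statistic(example)
```

A large statistic would conventionally be read as evidence of association, but, as argued above, with so many uncontrolled variables such a reading would not be safe here.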

In summary, we present the statistical data we have collected and the simple analyses we have performed. The implications must remain speculative (and we do make some speculations later).

4.3 Data on Sets, Gender and SATS levels

Class A was mathematics set 1 out of 10 linear sets taught simultaneously. This unusual arrangement was made possible by the school having several senior staff who were mathematicians.

Classes B and Bx were each the top set in one of two genuinely equal parallel streams, each having four classes. They were taught by two different teachers.

Classes C and Cx were each the top set in one of two genuinely equal parallel streams, each having four classes. They were taught by two different teachers.

Class D was the top set in one of three parallel streams, each having four classes. However, these three streams were not entirely comparable. In terms of overall academic ability (i.e. not specifically mathematical), class D’s stream was a little below one of the other streams and a little above the other.

Classes a, b, c and d were the top sets in each of four parallel streams, each having four classes. However, these four streams were by no means comparable. In terms of mathematical ability as judged by SATS level, class d was set 1 of 16 (see Table 4.2), class b can be ranked 2 of 16, and classes a and c were broadly equal at (say) 4 of 16. Of course, lower sets usually have fewer students in them than higher sets, so a top set rated 1 of 4 will contain rather more than ¼ of all students.

School    | Teacher (Class) | Set  | No | Male % | Female % | SATS 8 % | SATS 7 % | SATS 6 % | SATS n/a %
----------|-----------------|------|----|--------|----------|----------|----------|----------|-----------
A         | A               | 1/10 | 29 | 59     | 41       | 76       | 24       | -        | -
B         | B               | 1/4  | 28 | 50     | 50       | 25       | 64       | -        | 11
B         | Bx              | 1/4  | 30 | 50     | 50       | 30       | 53       | 3        | 13
C         | C               | 1/4  | 29 | 55     | 45       | 38       | 59       | 3        | -
C         | Cx              | 1/4  | 28 | 46     | 54       | 29       | 61       | 11       | -
D         | D               | 1/4  | 29 | 66     | 34       | 28       | 72       | -        | -
‘Control’ | a               | 4/16 | 28 | 64     | 36       | -        | 82       | 18       | -
‘Control’ | b               | 2/16 | 28 | 50     | 50       | -        | 100      | -        | -
‘Control’ | c               | 4/16 | 25 | 52     | 48       | 0        | 64       | 36       | -
‘Control’ | d               | 1/16 | 28 | 54     | 46       | 68       | 32       | -        | -

Table 4.2: Maths setting, gender (%) and SATS level (%)

Table 4.2 shows that class A had much higher SATS levels overall than all the other project classes, as would be expected of a set 1 out of 10 (who had taken GCSE in Y10 and were studying AS Mathematics in Y11). ‘Control’ school class d was broadly comparable to class A, having similar SATS level and Y10 GCSE grade profiles (see Tables 4.2 and 4.3). The other classes in the project schools had quite similar SATS profiles to each other.

Judging by SATS level profiles, the ‘control’ school classes a, b and c do not seem, individually, to be directly comparable to any of the project school classes.

The gender balance was quite even in all classes except A (59% male) and D (66% male).

The raw data underlying Table 4.2 are presented in Appendix G, Table G1.

4.4 Examination results

Table 4.3 shows the GCSE mathematics results, where known, for individual students in the database. Those for classes A, C, Cx and d were obtained in Y10.

School    | Teacher (Class) | Set  | No | GCSE A* | GCSE A | GCSE B | GCSE C | GCSE D
----------|-----------------|------|----|---------|--------|--------|--------|-------
A         | A               | 1/10 | 29 | 45      | 52     | 3      | -      | -
B         | B               | 1/4  | 22 | 23      | 45     | 23     | 9      | -
B         | Bx              | 1/4  | 28 | 18      | 54     | 25     | 4      | -
C         | C               | 1/4  | 29 | 7       | 24     | 45     | 24     | -
C         | Cx              | 1/4  | 28 | 7       | 32     | 50     | 11     | -
D         | D               | 1/4  | 29 | -       | 21     | 59     | 21     | -
‘Control’ | a               | 4/16 | 27 | -       | -      | 33     | 63     | 4
‘Control’ | b               | 2/16 | 27 | -       | 15     | 37     | 41     | 7
‘Control’ | c               | 4/16 | 25 | -       | 4      | 8      | 80     | 8
‘Control’ | d               | 1/16 | 28 | 47      | 32     | 21     | -      | -

Table 4.3a: GCSE grades of students for whom individual results are known (%)

The raw data underlying Table 4.3 are presented in Appendix G, Tables G2a and G2b.

For schools A, C and D we have GCSE data for every individual student in the database. However, for school B some data are missing (due to coding problems at the school), preventing complete matching of students in the database with their results (numbers matched were B: 22 of 28; Bx: 28 of 30). This explains the lower figures in the ‘No’ column in Table 4.3a compared to Table 4.2. Table 4.3b below gives the fuller overall results for class B.

School | Teacher (Class) | Set | No | GCSE A* | GCSE A | GCSE B | GCSE C | GCSE D
-------|-----------------|-----|----|---------|--------|--------|--------|-------
B      | B               | 1/4 | 28 | 18      | 43     | 25     | 14     | -

Table 4.3b: GCSE grades of all students in class B (%)

Table 4.4 shows the very poor results for the FSMQ Additional Mathematics examination.

School    | Teacher (Class) | Entry | FSMQ A | FSMQ B | FSMQ C | FSMQ D | FSMQ E | FSMQ U/X
----------|-----------------|-------|--------|--------|--------|--------|--------|---------
B         | B               | 22    | 1      | -      | 1      | -      | 1      | 19
B         | Bx              | 28    | -      | -      | 1      | 3      | 1      | 23
C         | C               | 13    | 1      | -      | 1      | -      | 2      | 9
C         | Cx              | 24    | 1      | -      | 1      | -      | 3      | 19
‘Control’ | d               | 10    | -      | -      | -      | 1      | -      | 9

Table 4.4: FSMQ grades (frequencies)

Some of the reasons for this are detailed in Chapter 9 (findings and conclusions).

4.5 Student progression from Y11 to Y12

One key purpose of this project was to identify ways to encourage students to continue to study mathematics beyond Y11. Whilst progression into Y12 is relatively easy to measure (at least approximately), it is harder to establish whether there has been any real improvement in the progression rate, and harder still (if not impossible) to determine whether any improvement is due to activities relating specifically to the project work. One difficulty is that students transfer between schools; another is that sometimes a student will leave and later return; a further complication is that some begin AS and then drop out. All this means that numbers fluctuate and can be hard to pin down.

First, in Table 4.5, we look at students’ intentions and the actual outcome, and then we turn to investigate whether progression improved for any of the project classes.

School    | Teacher (Class) | Set  | No | Initial Prediction % | Final Prediction % | Actual Uptake %
----------|-----------------|------|----|----------------------|--------------------|----------------
A         | A               | 1/10 | 29 | 83                   | 77                 | 62
B         | B               | 1/4  | 22 | 64                   | 75                 | 64
B         | Bx              | 1/4  | 25 | 82                   | 88                 | 71
C         | C               | 1/4  | 28 | 43                   | 28                 | 24
C         | Cx              | 1/4  | 28 | 43                   | 37                 | 36
D         | D               | 1/4  | 29 | 69                   | 62                 | 45
‘Control’ | a               | 4/16 | 27 | 37                   | 21                 | 11
‘Control’ | b               | 2/16 | 27 | 48                   | 33                 | 18
‘Control’ | c               | 4/16 | 25 | 39                   | 30                 | 4
‘Control’ | d               | 1/16 | 28 | 79                   | 82                 | 68

Table 4.5: Students’ predicted and actual progression rates (%) from Y11 to Y12

Table 4.5 shows students’ responses to the question “Do you intend to continue studying Maths in Y12?”, which was posed in the Student Initial Questionnaire administered in October 2008 and in the Student Final Questionnaire administered in April 2009 (see Appendix B for more details). It also shows the actual uptake rate measured in October 2009. For most students this means studying AS Mathematics; for some, A2 Mathematics. In almost all classes there is a decline in uptake expectations between October 2008 and April 2009, and a further decline in actual uptake (after drop-outs). Having the additional ‘control’ school has proved useful here, as it suggests that the decline is a common feature. The project classes did somewhat better than the ‘control’ school, but it must be noted that the SATS profile of those four classes (a, b, c, d) is somewhat lower than that of the four project classes (A, B, C, D).

In order to assess progression rates fairly, it is necessary to take into account the classes’ profiles of mathematical ability. How to do this is of course a matter of much debate. One method might be simply to take the teachers’ judgements, but that may be considered far too subjective. Another is to look at targets such as Fischer Family Trust or Yellis, or some index derived from these, perhaps attenuated by teachers’ judgements or local anecdotal data. However, these measures were seen by the project teachers as problematic. In their eyes the different FFT bands, and changes to schools’ bands over time, made comparisons difficult, as did different schools using different derivative indices of their own. Furthermore, the teachers viewed these targets as unreliable and unfair, and open to use by senior managers as political weapons (“sticks to beat us with”). Other possible measures, arguably objective and acceptable, are SATS mathematics levels and GCSE mathematics grades, and these two will be used here.

Table 4.6 indicates that three of the four project classes (and the ‘control’ class d) had broadly similar progression rates for SATS 8 students. The exception was class C, who had taken GCSE a year early with limited success and had found FSMQ Additional Mathematics too demanding. A similar picture appears for SATS 7 students. It should be noted that in some cells the percentages were derived from only very small actual numbers. (Note: in Table 4.6 a zero indicates that there were students with this level in the class but none progressed to Y12 mathematics, whereas a dash indicates that there were no students with that level.)

% of students with this SATS level progressing to study Mathematics in Y12

School    | Teacher (Class) | No | SATS 8 | SATS 7 | SATS 6 | Overall
----------|-----------------|----|--------|--------|--------|--------
A         | A               | 29 | 64     | 57     | -      | 62
B         | B               | 22 | 86     | 53     | -      | 64
B         | Bx              | 25 | 78     | 73     | 0 §    | 72
C         | C               | 29 | 18     | 24     | 0 §    | 21
C         | Cx              | 28 | 63     | 24     | 33 §   | 36
D         | D               | 29 | 63     | 38     | -      | 45
‘Control’ | a               | 28 | -      | 13     | 0      | 11
‘Control’ | b               | 28 | -      | 18     | -      | 18
‘Control’ | c               | 25 | -      | 6      | 0      | 4
‘Control’ | d               | 28 | 79     | 44     | -      | 68

§ = percentage in this cell calculated from N < 4

Table 4.6: Progression rate (%) from Y11 to Y12 by SATS level

The raw data underlying Table 4.6 are presented in Appendix G, Table G3.

Discrepancies between tables
Some discrepancies occur from table to table. These are due to incomplete data, whose causes include: a student missing a questionnaire through absence; a student failing to answer a particular question; a student having no SATS level score (e.g. having arrived from abroad); a student leaving, arriving or being frequently absent; and a questionnaire being unidentifiable due to a coding problem.
Such incomplete data can lead to different populations on which calculations are based. These problems, though usually minor, affect several tables. For example, the numbers of students recorded in classes B and Bx in Table 4.3 are lower than the corresponding numbers in Table 4.2. Similarly, progression rates for classes Bx and C differ slightly between Table 4.5 and Table 4.6.
In individual tables and graphs which compare results from the Initial Questionnaire and the Final Questionnaire, the only students included are those who answered the relevant question(s) both times.

Table 4.7 indicates that three of the four project schools (and the ‘control’ school) had broadly similar progression rates for GCSE grade A* students. The exception was school C, but the numbers achieving A* there were small. Progression rates for GCSE grade A students were similar across all schools, with school A just a little lower. (Teacher A pointed out that many in his set were extremely capable in other subject areas and chose to pursue humanities subjects in Y11, although this is an argument which might be made at any level.)

School    | Teacher (Class) | No | Set  | GCSE A* | GCSE A | GCSE B | GCSE C | GCSE D | Overall
----------|-----------------|----|------|---------|--------|--------|--------|--------|--------
A         | A               | 29 | 1/10 | 77      | 47     | 100 §  | -      | -      | 62
B         | B               | 22 | 1/4  | 100     | 70     | 40     | 0 §    | -      | 64
B         | Bx              | 28 | 1/4  | 80      | 80     | 57     | 0 §    | -      | 71
C         | C               | 29 | 1/4  | 0 §     | 57     | 8      | 14     | -      | 21
C         | Cx              | 28 | 1/4  | 100 §   | 56     | 14     | 33 §   | -      | 36
D         | D               | 29 | 1/4  | -       | 67     | 47     | 17     | -      | 45
‘Control’ | a               | 27 | 4/16 | -       | -      | 33     | 0      | 0      | 11
‘Control’ | b               | 27 | 2/16 | -       | 75     | 20     | 0      | 0      | 19
‘Control’ | c               | 25 | 4/16 | -       | 100    | 0      | 0      | 0      | 4
‘Control’ | d               | 28 | 1/16 | 85      | 78     | 17     | -      | -      | 68

§ = percentage in this cell calculated from N < 4

Table 4.7: Progression rate (%) by GCSE grade

The raw data underlying Table 4.7 are presented in Appendix G, Table G4.

In Tables 4.8 a–d we compare, as fairly as possible, the project classes’ GCSE results with those of comparable previous-year or parallel classes.

School | Teacher (Class) | Set  | Year of Exam | School year | No | GCSE A*  | GCSE A   | GCSE B | GCSE C
-------|-----------------|------|--------------|-------------|----|----------|----------|--------|-------
A      | A (project)     | 1/10 | 2008         | Y10         | 30 | 13 (43%) | 16 (53%) | 1 (3%) | -
A      | Ax (year after) | 1/10 | 2009         | Y10         | 30 | 24 (80%) | 5 (17%)  | 1 (3%) | -

Table 4.8a: Comparative GCSE data for School A

School | Teacher (Class)  | Set | Year of Exam | School year | No | GCSE A*  | GCSE A   | GCSE B  | GCSE C
-------|------------------|-----|--------------|-------------|----|----------|----------|---------|--------
B      | B (project)      | 1/4 | 2009         | Y11         | 28 | 5 (18%)  | 12 (43%) | 7 (25%) | 4 (14%)
B      | Bx (parallel)    | 1/4 | 2009         | Y11         | 27 | 6 (22%)  | 14 (52%) | 6 (22%) | 1 (4%)
B      | By (year before) | 1/4 | 2008         | Y11         | 35 | 12 (34%) | 22 (63%) | 1 (3%)  | -
B      | Bz (year before) | 1/4 | 2008         | Y11         | 33 | 13 (39%) | 15 (45%) | 5 (15%) | -

Table 4.8b: Comparative GCSE data for School B

School | Teacher (Class)  | Set | Year of Exam | School year | No | GCSE A* | GCSE A   | GCSE B   | GCSE C
-------|------------------|-----|--------------|-------------|----|---------|----------|----------|--------
C      | C (project)      | 1/4 | 2009         | Y10         | 29 | 2 (7%)  | 9 (31%)  | 14 (48%) | 4 (14%)
C      | Cx (parallel)    | 1/4 | 2009         | Y10         | 29 | 2 (7%)  | 7 (24%)  | 14 (48%) | 6 (21%)
C      | Cy (year before) | 1/4 | 2008         | Y10         | 30 | 4 (13%) | 11 (37%) | 14 (47%) | 1 (3%)
C      | Cz (year before) | 1/4 | 2008         | Y10         | 32 | 2 (6%)  | 11 (34%) | 14 (44%) | 5 (16%)

Table 4.8c: Comparative GCSE data for School C

School | Teacher (Class) | Set | Year of Exam | School year | No | GCSE A* | GCSE A  | GCSE B   | GCSE C+D
-------|-----------------|-----|--------------|-------------|----|---------|---------|----------|-----------
D      | D (project)     | 1/4 | 2009         | Y11         | 29 | 0 (0%)  | 6 (21%) | 17 (59%) | 6+0 (21%)
D      | Dx (other)      | 1/4 | 2009         | Y11         | 33 | 7 (21%) | 5 (15%) | 15 (45%) | 6+0 (18%)
D      | Dy (other)      | 1/4 | 2009         | Y11         | 30 | 1 (3%)  | 0 (0%)  | 8 (27%)  | 20+1 (70%)

Table 4.8d: Comparative GCSE data for School D

In Tables 4.9 a–d we attempt to compare, as fairly as possible, the four project classes’ progression rates for 2008–9 with those for previous years or comparable classes.

School | Teacher (Class)      | Year           | Progression          | No | Progression rate
-------|----------------------|----------------|----------------------|----|------------------
A      | A (accelerated)      | 2009 (project) | AS (Y11) to A2 (Y12) | 29 | 13 (45%)
A      | Ay (not accelerated) | 2009 (other)   | AS (Y12) to A2 (Y13) | 29 | 14 (48%)

Table 4.9a: Comparative progression data from Y11 to Y12 for School A

School | Teacher (Class) | Year           | No | Progressing to study Maths in Y12
-------|-----------------|----------------|----|----------------------------------
B      | B               | 2009 (project) | 22 | 14 (64%)
B      | Bx              | 2009 (other)   | 25 | 18 (72%)
B      | Ba              | 2008 (other)   | 34 | 20 (59%)
B      | Bb              | 2008 (other)   | 34 | 25 (74%)
B      | Bc              | 2007 (other)   | 29 | 19 (66%)
B      | Bd              | 2007 (other)   | 29 | 17 (59%)

Table 4.9b: Comparative progression data from Y11 to Y12 for School B

School | Teacher (Class) | Year           | No | Progressing to study Maths in Y12
-------|-----------------|----------------|----|----------------------------------
C      | C               | 2009 (project) | 29 | 7 (24%)
C      | Cx              | 2009 (other)   | 29 | 12 (41%)
C      | Cy              | 2009 (other)   | 30 | 15 (50%)
C      | Cz              | 2009 (other)   | 32 | 16 (50%)

Table 4.9c: Comparative progression data from Y11 to Y12 for School C

School | Teacher (Class) | Year | No | Progressing to study Maths in Y12
-------|-----------------|------|----|----------------------------------
D      | D (project)     | 2009 | 29 | 13 (45%)
D      | Dx (other)      | 2009 | 33 | 12 (36%)
D      | Dy (other)      | 2009 | 30 | 6 (20%)

Table 4.9d: Comparative progression data from Y11 to Y12 for School D

In Table 4.10 we compare class A’s AS results for 2008–9 with those of the non-accelerated class (students one year older).

School | Teacher (Class) | Year of Exam                | School year | No | AS A    | AS B     | AS C    | AS D    | AS E   | AS U
-------|-----------------|-----------------------------|-------------|----|---------|----------|---------|---------|--------|--------
A      | A (Set 1/10)    | 2009 (project, accelerated) | Y11         | 28 | 8 (29%) | 8 (29%)  | 3 (11%) | 4 (14%) | 2 (7%) | 3 (11%)
A      | Ax (Set 1/10)   | 2009 (not accelerated)      | Y12         | 33 | 2 (6%)  | 10 (30%) | 9 (27%) | 6 (18%) | 0 (0%) | 6 (18%)

Table 4.10: Comparative AS data for School A

4.6 Students’ Initial and Final Questionnaires – five indices

4.6.1 Introduction

Full details of these two questionnaires are given in Appendix A. The Initial Student Questionnaire (September 2008) and the Final Student Questionnaire (April 2009) were identical, each having 12 questions on a five-point scale to measure students’ perceptions of each of:

(a) Confidence

(b) Teacher supportiveness

(c) Usefulness of mathematics to themselves

The responses were converted into three indices (score 0 to 48).

There were also 10 questions on a five-point scale to measure each of:

(d) Enjoyment

(e) Value (Usefulness) of mathematics to society

The responses were converted into two indices (score 0 to 40).
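The arithmetic of the conversion is straightforward: assuming each five-point response is coded 0–4 (our reading of the scoring, given the stated maxima), a 12-item index ranges over 0–48 and a 10-item index over 0–40. A minimal sketch:

```python
def likert_index(responses, points_per_item=4):
    """Index for one student: the sum of item scores.

    Assumes each five-point response is coded 0-4, so 12 items give an
    index in the range 0-48 and 10 items give an index in 0-40.
    """
    if not all(0 <= r <= points_per_item for r in responses):
        raise ValueError("response outside the assumed 0-4 coding")
    return sum(responses)

max_confidence = likert_index([4] * 12)  # 12-item index, maximum 48
max_enjoyment = likert_index([4] * 10)   # 10-item index, maximum 40
```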

A reliability analysis was conducted on each of the indices for both the Initial and Final Student Questionnaires. Such analysis tests whether a set of questions forms a coherent whole and so provides a reliable instrument for measuring a concept. The Cronbach alpha values obtained were all high (nearly all exceeding 0.9), indicating a high level of reliability. Further item analysis indicated that none of the indices would be improved by omitting any items (questions).
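Cronbach’s alpha for such an index can be computed directly from the student-by-item score matrix; the following is a sketch under the 0–4 coding assumption, with made-up scores rather than our data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of totals
    return float(k / (k - 1) * (1 - item_variances.sum() / total_variance))

# Four hypothetical students answering three items fairly consistently
demo = [[4, 4, 3], [3, 3, 3], [1, 2, 1], [0, 1, 0]]
alpha = cronbach_alpha(demo)
```

Values above 0.9, as reported for our indices, are conventionally taken to indicate high internal consistency.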

All the graphs in section 4.6 include only students who completed both questionnaires.

Most graphs presented are boxplots. Appendix F explains the features of boxplots.

It is assumed that the Initial Student Questionnaire (September 2008) measured views the students had acquired in the previous academic year (or earlier), whereas the Final Student Questionnaire (April 2009) reflects the effect of their experience in the project year 2008–9.

4.6.2 Students’ ‘Confidence’ levels (in their ability in mathematics)

Graph 4.1 indicates that there was no appreciable change in Confidence levels (except perhaps a decline for class C). There is a pronounced increase in outliers (shown as filled circles) and extreme outliers (shown as asterisks) at the lower end for the Final Questionnaire (April 2009). The teachers’ explanation was that some students were either feeling the examination pressure at that time or had come to realise that mathematics was not for them after all. (Results for the four ‘control’ classes in the non-project school were broadly similar overall, with very slightly lower levels recorded.)

Graph 4.1: Confidence levels for the four project classes

4.6.3 Students’ ‘Enjoyment’ levels

Graph 4.2a indicates that Enjoyment levels held up well. School D is notable in eliminating the lower end outliers and markedly increasing the range.

Graph 4.2a: Enjoyment levels for the four project classes

Graph 4.2b shows that the project classes overall had a higher median Enjoyment level than the ‘control’, but with more spread.

Graph 4.2b: Enjoyment levels for the ‘control’ school and the four project classes

4.6.4 Students’ perceived ‘Teacher Support’ level

Graph 4.3a indicates that students’ views on Teacher Support increased for two classes (A and D). Graph 4.3b shows that the perceptions of the ‘control’ classes were lower.

Graph 4.3a: Teacher Support levels for the four project classes