Evidence-based teaching: advancing capability and capacity for enquiry in schools

Case study

April 2017

Alison Wilkinson, The South Lakes Teaching School Alliance

Table of Contents

Context

Literature review

A blueprint for an ecosystem?

The “What works?” approach

A values-led approach

Rationale for the project

Key Questions

The research

Hot-housing: effective engagement and embedded practice

The process

Key learning from the study

What is evidence and how do we access it?

Sources and types of evidence

What ways of working in school encourage engagement with evidence and lead to adoption of evidence-based practice?

How can school leaders create the right climate and culture?

Conclusions

Appendices

Appendix 1 – Sample range of evidence-related activity over a 2-year period

Context

The South Lakes Teaching School Alliance is based on an existing federation of schools in the South Lakes area with well-established relationships, founded on the principle that we are responsible for the provision of education to all young people in the area. We have always supported one another and worked very collaboratively. When we brought together a group of staff from across the schools in the alliance who were interested in research and development, we were able to have a frank discussion about the leadership challenge involved in engaging colleagues with evidence and the factors that contribute to that challenge. An early focus group suggested that one of the key challenges would be “growing” research rather than imposing projects from leadership level: teachers needed to feel able to put their energies into something in which they felt personally invested. At this early stage, therefore, we made the decision that this study would not involve the setting up of another project by school leadership. We needed to find a different way to learn more about engaging with evidence and about leading a school where that engagement could flourish.

Literature review

It is rather difficult to find much writing about the leadership of evidence-engaged schools, although the literature review revealed a very illuminating debate in the educational research community about evidence-based practice and education.

A blueprint for an ecosystem?

There is a substantial amount of literature about evidence-based practice in health and social work. These fields have given rise to the kind of template or “blueprint” for evidence-based teaching expounded for the DfE by Ben Goldacre, who advocates randomised controlled trials (RCTs) whilst calling for an “information architecture” in education so that the results of research reach the profession (Goldacre, 2013).

Goldacre’s paper for the DfE is an engaging rallying call and it has been taken up by the government, which has established the Education Endowment Foundation and initiated a large-scale RCT, Closing the Gap, which aims to find out which literacy and numeracy interventions are most effective with disadvantaged pupils. This echoes similar initiatives in the US, such as the What Works Clearinghouse. Goldacre does, however, hint that the issue is complex when he writes,

“different methods are useful for answering different questions. RCTs are very good at showing that something works; they’re not always so helpful for understanding why it worked.” (Goldacre, 2013)

He sketches out the “basics” of what the education sector would need in order to embrace evidence-based practice. These are, in brief:

·  Better systems for disseminating the findings of research to teachers

·  Teachers learning about different types of research during training

·  Teachers refining understanding of research as continuing professional development (CPD)

·  Research networks where teachers consider new pieces of research in context

·  Teachers generating new research questions

·  Multi-disciplinary teams contributing to large scale studies

·  Two-way exchanges between researchers and teachers

·  Academic funders who listen to teachers

This describes an ideal that is very far from our current reality in the UK, in spite of great enthusiasm in some parts of the sector. As Goldacre himself says,

“We are describing the creation of a whole ecosystem from nothing.”

The “What works?” approach

Goldacre’s whole philosophy is predicated on the use of quantitative evidence of “what works best”. Even his final caveat that being a good teacher is not about,

“robotically following the numerical output of randomised control trials” is tempered with a warning about ignoring the qualitative evidence. He does acknowledge, though, that educators need the “right combination of skills” to get the job done.

Goldacre’s commitment to quantitative evidence, based on RCTs, echoes Slavin’s identification of a key requirement for evidence-based policy as,

“the existence of scientifically valid and readily interpretable syntheses of research on practical, replicable education programs”. (Slavin, 2008)

Slavin looks at America, where the Department of Education has set up various initiatives to provide research information to teachers, for example the What Works Clearinghouse (WWC) and the Best Evidence Encyclopaedia (BEE). He discusses the methods used in major syntheses of program evaluations in an attempt to support the robust, validated evidence base required in US law as the basis for decision making about educational programs. Like Goldacre, he stresses the importance of a profession able to understand research methods:

“It is … important that researchers and educators understand the critical issues behind the various program effectiveness reviews so that they can intelligently interpret their conclusions.”

Both Goldacre and Slavin talk about measuring the effectiveness of highly prescribed educational programs or interventions, which seems rather at odds with the English system, whereby teachers devise schemes of work to deliver a National Curriculum or exam specification rather than deliver “replicable programs”.

Closing the Gap is currently testing a few commercial intervention programmes in a very expensive study. The likelihood, however, of an initiative being funded that can provide synthesised program evaluations of sufficient scale and scope, and that is then understood and contextualised by practitioners with sufficient knowledge of the issues involved in quantitative studies, is rather remote.

A values-led approach

Other researchers take a more pragmatic approach. Groccia urges his colleagues in higher education to embrace the principles of evidence-based teaching. He argues that

“One does not need to use a meta-analysis of all relevant randomized controlled trials to establish a definition and basis of evidence for good practice”

suggesting instead that any decision to adopt a teaching strategy that considers the outcomes of research,

“would be a clear improvement.” (Groccia & Buskist, 2011)

Biesta goes further than this. His paper, “Why ‘what works’ won’t work” (Biesta, 2007), criticises the “technological model of professional action” derived from medicine, in which interventions are introduced to achieve given outcomes:

“being a student is not an illness, just as teaching is not a cure.”

Students are not passive recipients of intervention but interpret and respond to what they are being taught. He stresses the importance of clarity about educational goals, explaining that education is a “moral practice” rather than a “technological enterprise”. Finally, he asserts that research can only indicate what worked in the past, not what will work, implying that professionals need to use research findings to make their own problem-solving more informed.

In his later contribution to the debate, “Why ‘what works’ still won’t work” (Biesta, 2010), Biesta outlines the case for values-based education as an alternative to evidence-based education, explaining that evidence-based practice represents deficits in knowledge, effectiveness and application. He argues that our starting point needs to be a definition of the educational aims or values we want to achieve, and only then should we consider the nature of the evidence we need to develop those values. He pleads for evidence-based practice to be,

“urgently rethought, in ways that take into consideration the limits of knowledge, the nature of social interaction, the ways in which things can work, the processes of power that are involved in this and, most importantly, the values and normative orientations that constitute social processes such as education.”

Whilst debate continues in the research community, school leaders have very little writing to draw on about how to create the “ecosystem” we need for evidence-based practice to flourish. For this issue we turn to writing about learning organisations. Marsick writes about learning organisations needing the right climate and culture:

“Climate and culture are built by leaders and other key people who learn from experience, influence the learning of others, and create an environment of expectations that shapes and supports desired results that in turn get measured and rewarded.” (Marsick & Watkins, 2003)

Rationale for the project

The literature review reflected the debates we were having in our focus groups. We were asking questions about the nature and source of evidence; the difference between prescription of “what works” and the professionalism of the action researcher in the classroom; and the changes needed to the culture in schools in order to foster the development of evidence-based practice.

Our focus groups gave us a clear principle as a starting point:

“Much educational research seems to come down at us from on high; either from the educationally important such as Dweck or Hattie, or even from SLTs…the harder to reach group are teachers who have been in the profession 10-15 years, who have become guarded about change, and the fact that it is seemingly imposed on them every year. However, it doesn’t take much to re-energise people in that category, if they feel it’s their own agenda driving things.” (Member of Research and Development task group)

We decided therefore that, rather than direct or impose a specific activity, we would approach the development of evidence-based practice rather like gardening: we would sow seeds and then encourage growth by trying to create favourable conditions and climates, while in the main observing and learning. Essentially, we wanted to explore the issues involved in the leadership of evidence-engaged schools to see what we could learn about creating cultures where evidence-based practice develops naturally. The following section gives an overview of all the evidence-related activity we were able to observe in our alliance over a 24-month period. The hope was to track the sources of evidence and identify the processes and structures in schools that allowed colleagues to develop practice as a result. Maybe the ways to develop an ecosystem would start to emerge.

In parallel to the observational study we held a series of discussions with different focus groups and collated the emerging issues, particularly in relation to the leadership of evidence-engaged schools. The results of these discussions are set out in the final section of this report.

Key Questions

Our ethnographic study looked at the following key questions:

·  What IS evidence and how can we access it?

·  What ways of working in school encourage engagement with evidence and lead to adoption of evidence-based practice?

·  How can school leaders create the right climate and culture?

The research

Having made the decision not to create a “new” project but rather to encourage colleagues to engage with evidence in ways that felt natural to them, we needed to become ethnographers: noticing where engagement was, or wasn’t, happening, and collecting data about the processes, the outcomes and the responses of colleagues in what became a wide-ranging study. The data are summarised and collated in the table in Appendix 1.

We collected data on every activity or development that we could identify that either responded to secondary evidence from research or created primary evidence through some type of research activity.

The study tracked the progress of each activity in school, identifying the workflow that either resulted from secondary evidence or actually generated the primary evidence.

Finally, the study assessed the impact of the evidence-based development. We developed a rating system for indicating the impact of the development on practice.

Table 1: Impact of the development on practice

Impact rating / Extent of impact / Description of impact
1 / Minimal / Unrealised intentions to share findings with others; no real evidence of change in practice
2 / Limited / A small number of individuals directly involved in the study make adjustments to practice
3 / Moderate / Teams take on the development and collaborate on implementation in their department, year team or cross-curricular team
4 / Widespread / The whole-school development plan features action plans to implement change as a result of work on evidence-based practice
5 / Embedded / Evidence of implementation can be seen in most classrooms, lesson observations, work scrutinies and departmental meeting agendas and minutes. Pupils, parents and governors are aware of the implementation.

It is interesting to note the range of impacts across the different project groupings. Externally funded and motivated projects appear to have the most limited impact, based on our rating system. These projects do seem to have created significant new learning about issues like Cultural Education or Succeeding with More Able Pupils, but there is little evidence that they have brought about organisational change beyond the practice of the individuals involved.

The category with the highest set of impact ratings is the one we have called “Organic Whole School Development Processes”. These are responses to priorities identified by colleagues working in school and are characterised by their “organic” methods: they use teams, processes and workflows that already exist in the school community, and use action research methods to adapt those structures and guide the development of practice in response to evidence. All of the evidence-based practice developed as a result of this approach has achieved impacts rated at 4 or 5 on our scale.

As the study progressed we developed a “hot-house” approach to these organic processes, whereby we initiated an action research project by taking a team off-site for a day to review and contextualise the available evidence. This strategy has proved so effective that we are determined to repeat and develop it.

Hot-housing: effective engagement and embedded practice

During the study we used this hot-house approach to develop evidence-based practice around three main areas: