Research and evidence-informed practice: focusing on practice and practitioners

Philippa Cordingley

Centre for the Use of Research and Evidence in Education

Abstract

This article explores issues raised in knowledge transformation processes through three separate but related lenses. It starts with a discussion of the relationship between knowledge transformation and research outputs. In so doing, it encompasses both direct research outputs, that is research reports and papers in journals, and indirect outputs, such as practitioner research summaries and web- and paper-based resources specifically designed to engage teachers with research. The paper then moves on to explore the implications of transforming research knowledge in relation to the environment in which such knowledge is to be used. In particular, it focuses on the learning processes involved in the progression from reading research texts to putting them to work in classrooms. It also considers the contribution that evidence about Continuing Professional Development (CPD) can make to our understanding of the process of transforming knowledge into practice. Finally, the paper turns to the specialist skills involved in brokering and mediating this process, illustrating them through a case study.

The article proposes that the transformation of knowledge from research into classroom practice involves a mix of complex processes, some of which need specialist mediation and dedicated resource. It proposes, too, that reflecting on knowledge transformation as Continuing Professional Development (CPD) and learning helps to elucidate some of the steps on the way.

The contribution of research text outputs

Conventional research outputs are, perhaps inevitably, significantly shaped by the requirements of research funders and research assessment systems. These imperatives focus on the validity and reliability of the knowledge that has been generated, and facilitate peer scrutiny via a cumulative process of testing and refinement of the knowledge base. However, the individual teacher engaging with a research report so that it can be applied ‘in practice’ must often wrestle alone with challenging levels of abstraction.

Teaching is, by contrast, a practical and interpersonal enterprise. For this reason, practitioners need to connect intellectually, practically and emotionally with the knowledge they are offered in research accounts if they are to take it on board and use it to inform their practice. They are a particular audience and, as with all other audiences, their specific needs and contexts need to be taken into account if the texts are to “speak to” them. It follows that academic writers need to have a practitioner learning perspective in view as they write. Unfortunately, however, practitioners are an audience whose needs are rarely structured into research funding and accountability systems.

Perhaps it is not surprising, therefore, that very little research writing is directly targeted at a practitioner audience. The UK Research Assessment Exercises and custom and practice in the editing of research journals rarely lean in this direction (Edwards et al., 2005; OECD, 2002). The journals that are focused on teacher audiences tend to emphasise the practical over the theoretical, speed over depth and descriptions of practice over analysis and evidence. In the spaces left by published journals, a range of national agencies have attempted to create research summaries, tools and resources for practitioners, including the DfES Research Informed Practice Site (www.standards.dfes.gov.uk/research/); GTCE Research of the Month (www.gtce.org.uk/research/romtopics/); TLRP Practitioner Applications (www.tlrp.org/pa); the What Works Clearinghouse (ies.ed.gov/ncee/wwc/) and the Learning Research and Development Center (www.lrdc.pitt.edu/) in the US; SCRE Spotlights (www.scre.ac.uk/spotlight/index.html); and Education Counts (www.educationcounts.edcentre.govt.nz/index.html) in New Zealand.

The form of research outputs

What do we know about the form that research outputs need to take in order to make an effective contribution to knowledge transformation? In England, the National Teacher Research Panel (NTRP, 2000) analysed a range of research texts to identify writing practices likely to be accessible to practitioner readers and to draw them into engagement with research findings. A study by the NFER (Kerr et al., 1998) echoed many of their findings. Both studies highlighted in particular the need for research writing to:

1. provide sufficiently detailed and vivid analysis and description of the teaching intervention or knowledge in action to enable teachers to connect with it and to test it out for themselves;

2. provide detail about the starting points of pupils and the communities, phases or subjects involved in research outputs in order to enable teachers to interpret and adapt for their own context;

3. include a short summary of the methods;

4. layer evidence and writing so that practitioners can make informed choices about reading strategies;

5. provide clear pathways for finding out more about the research;

6. provide evidence about the connections between an intervention or approach and pupil learning; and

7. ensure clear, simple, short and jargon free writing.

Such admonitions are perhaps easier to describe than to achieve, especially in the context of the pressure upon researchers for recognition in publishing regimes driven by knowledge production.

Items one and two are similar in kind to the parallel requirements for detail regarding methods and samples, but sufficiently distinctive to demand additional work and words, thus creating additional pressure on resources and space. In item three, the request for short summaries of methods sits in direct contradiction to the requirements of peer testing. Items four and five are similarly inappropriate for a researcher audience. In looking for attention to be given to connections with pupil outcomes (broadly conceptualised), item six connects with important debates within the research community (Hargreaves, 1996; Hammersley, 1997; Gorard, 2001). But the two debates should not be confused. The teachers' request did not amount to a call for over-simplified approaches to cause and effect; rather, it reflects what interested these teachers. Evidence about pupil learning captures their attention, but teachers understand, and indeed experience daily, the complexity of intervening variables. The need identified is for research papers that explore this complexity in attempting to track connections.

At one level, the concluding request for simple, clear, short and jargon free writing simply relates to tone and register. Who would not prefer new knowledge to be expressed thus? Clearly, technical terms properly explained and defined have an important role in explicating concepts, increasing understanding and enhancing control over detail, even though to practitioners they may read as jargon. But it is also the case that shorthand terms accumulate in the education community, as in all other communities; few such terms are essential and many create barriers to wider communication. There are, however, more complex factors at work here than command of prose. Clarity of writing also depends on clarity of conceptualisation and of analysis. The process of research writing is in itself often more of an analytic than a communication process, and exposing ideas to the discipline and meaning of prose is an important test of the coherence and stability of thinking. Perhaps the clarity and brevity practitioners are seeking is sometimes only possible after a period of iterative testing and refinement of ideas in the technical journals?

Practitioners’ needs in relation to research outputs

The National Teacher Research Panel (2000) and Kerr et al. (1998) thus seem to call for additional and new, or at least differently presented, research accounts. Some of the requests, such as the desire for detailed descriptions of context and interventions, make demands that are relatively infrequently given priority in research writing for peer review. Others would constrain research writing in ways that could undermine the accumulation of knowledge through peer review. So it is unsurprising and helpful that a series of complementary, research-based resources for practitioners has been developed on the basis of the principles and needs identified above. Such texts try to create a series of stepping stones between research reports and articles and day-to-day classroom practice. They are described in detail, and analysed against their various purposes, in a review of ten years of active support for research and evidence-informed practice published by the Innovation Unit and funded by the English Department for Education and Skills (Centre for the Use of Research and Evidence in Education, 2007). Those purposes include:

· encouraging and/or supporting practitioners in interpreting, testing and refining strategies from research in their own context;

· providing access to theory and the underpinning rationale to enable transfer;

· enabling practitioners to relate products to their own experiences;

· securing understanding of core facts and issues;

· raising awareness of the range of useful research; and

· investigating the issues of interest to practitioners.

Perhaps one central strategy can helpfully illustrate the change in writing that such a change in audience and perspective engenders: the use of questions specifically designed to connect practitioners’ management experiences with research findings. Questions are self-evidently central to learning, and much scholarship and research has concerned itself with the role of questions in student learning, starting with Bloom (1956) and continuing, for example, through the application of original ideas to specific teaching and learning interventions, as in Adey and Shayer’s (1994) work on the development of thinking skills interventions. Questions, of course, play a central role in shaping research and the collection, analysis and interpretation of evidence too. The research summaries described above explore the proposition that questions in texts can also make a pedagogic contribution. Examples of experiments in making use of questions can be found in a range of work focused on supporting the transformation of knowledge from research, including the General Teaching Council for England’s (GTCE) Research of the Month (www.gtce.org.uk/research/romtopics/) summaries of large-scale research linked to teachers’ own research case studies, the English Department for Children, Schools and Families (DCSF) Research Informed Practice website (www.standards.dfes.gov.uk/research/), and the Practitioner Applications (www.tlrp.org/pa/) developed to draw teachers into research outputs from the Economic and Social Research Council’s UK-wide Teaching and Learning Research Programme (TLRP).

The strategies for making such texts attractive, informative, engaging and useful are, of course, as various as the research they capture and the audiences they seek to engage. They depend on the combined skills of information scientists, communication specialists, teachers and researchers. They depend, too, on relevance: research capable of transforming practitioner knowledge needs to be relevant to it. As long ago as 1993, Huberman argued that the academic research agenda should be informed, and partly supported, by collective analysis of practitioners’ own research questions (Huberman, 1993).

The Teaching and Learning Research Programme has taken a different angle, seeking evidence that researchers have involved practitioners in identifying research questions, designing research processes and interpreting results. These efforts seem to have met with increasing success over time. In 2001, the NTRP (2001) found little experience of practitioner involvement at the start of the projects; by the end of the school sector phase of the work, in 2007, many TLRP reports reflected a process of co-development or co-construction pursued in partnership by researchers and practitioners. The result was an increasingly close connection between research design and practice, with teachers taking a direct role in the creation of the knowledge in the first place. Some significant progress seems to have been made. However, the systematic analysis of teachers’ own research questions, called for by Huberman, still seems, to this author at least, to be an urgent priority.

The organisation of the knowledge base

However well written the texts may be, and however carefully structured to support learning and save teachers’ time, there remain important challenges in getting the right texts to the right people at the right time. How are teachers usually connected with research texts? The situation varies significantly between contexts, according to the degree of central control operated by governments and their education departments and the extent of their interest in providing and supporting the use of research, as a recent working paper for the OECD (2007) shows.

In England, research summaries, tasters or reports may be introduced and/or mediated by people such as HEI colleagues with CPD roles, local authority colleagues and/or teacher colleagues (Rickinson, 2005). Alternatively, teachers may access such knowledge through trusted professional journals (such as the NUT’s Education Review), or through websites that facilitate explicit quality assurance as well as access. For example, the GTC’s Research of the Month (www.gtce.org.uk/research/romtopics/) publishes appraisals of the featured research that work to a systematic framework and are written for a practitioner audience, in an attempt to support the development of practitioner evaluation skills. Subject associations and specialist centres also provide quality assurance. The Association for Science Education (www.ase.org.uk/) and the (English) National Centre for Excellence in the Teaching of Mathematics (www.ncetm.org.uk/) websites are two widely used examples.

There is also evidence of increasing pressure on practitioners to take account of research findings. In England the situation is changing. In addition to providing support resources that summarise and/or point to research, national policy frameworks are starting explicitly to promote, and sometimes even to require, that teachers keep up to date with research evidence. The new National Standards for teaching require teachers both to support each other’s learning through coaching and, by implication, to be coached, within a framework derived directly from three systematic reviews of the evidence about the impact of CPD. This framework explicitly highlights the importance of engaging with evidence from research and from classrooms. The (English) national framework for mentoring and coaching highlights, as a core skill, the process of identifying specialist evidence from research that is capable of informing and enhancing practice. Similar encouragement and support through use of the framework is embedded in the work of the General Teaching Council for England’s Teacher Learning Academy, in the National College for School Leadership’s work on leading coaching, and in the provision of the National Strategies.

Another significant development for transforming knowledge from research in England is the extensive delegation of resources and responsibilities for CPD from central and local government to schools. In the context of increasing requirements that schools attend to evidence about effective practice, Continuing Professional Development leaders in schools are beginning to act as knowledge brokers. Research outputs are being introduced and/or drawn down by teacher colleagues with responsibilities for leading learning in subject areas or phases, within and between schools. For them, knowledge management issues, such as key wording and the organisation of texts in ways that make it quicker to identify relevance for particular people, are very important. Web and taxonomy technology is helping here, but there is some way to go. Experiments in developing web portals abound but are often uncoordinated. Searching the internet produces many more hits than school-based knowledge brokers can deal with. Few websites make serious efforts to connect research materials with the needs of identifiable groups of teachers at particular stages of development. The English Teacher Training Resource Bank (www.ttrb.ac.uk/) for initial teacher training educators represents an interesting and increasingly popular exception. It may be that gearing this portal to the needs of a specific audience, to meet a specific need, has been helpful both in securing and in organising the resources needed to undertake the formidable and extensive work of filtering, quality assuring and consistently meta-tagging research resources. Perhaps, too, the fact that this particular audience has its home within the higher education system facilitates dialogue about their needs.
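By way of illustration of the knowledge management point above, the sketch below shows, in Python, one minimal way in which research summaries might be meta-tagged and then filtered for an identifiable audience. It is a hypothetical illustration only: the field names (audience, phase, keywords), the helper function and the sample records are invented for this sketch and do not describe the actual architecture of the Teacher Training Resource Bank or any other portal.

```python
# Hypothetical sketch: meta-tagging research summaries and filtering them
# for a specific audience. All field names and records are invented for
# illustration; they do not describe any real portal's schema.
from dataclasses import dataclass, field


@dataclass
class ResearchSummary:
    title: str
    audience: set = field(default_factory=set)   # e.g. {"CPD leaders"}
    phase: set = field(default_factory=set)      # e.g. {"secondary"}
    keywords: set = field(default_factory=set)   # e.g. {"coaching"}


def filter_for(summaries, audience, wanted_keywords):
    """Return summaries tagged for the given audience that share any keyword."""
    return [
        s for s in summaries
        if audience in s.audience and s.keywords & set(wanted_keywords)
    ]


if __name__ == "__main__":
    catalogue = [
        ResearchSummary("Coaching and the impact of CPD",
                        {"CPD leaders", "ITT educators"}, {"secondary"},
                        {"coaching", "CPD"}),
        ResearchSummary("Thinking skills interventions",
                        {"classroom teachers"}, {"secondary"},
                        {"thinking skills"}),
    ]
    for summary in filter_for(catalogue, "CPD leaders", ["coaching"]):
        print(summary.title)
```

Even so small a sketch makes visible why consistent tagging vocabularies matter: filtering only works if the same audience and keyword labels are applied consistently across a whole collection, which is precisely the formidable editorial work referred to above.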