Evaluation of the Higher Education Academy Subject Centre for Geography, Earth and Environmental Sciences Workshop Programme

Final Report

Iain Nixon

Robert Stafford

Steve Camm

The KSA Partnership

www.theksapartnership.co.uk

February 2007

Using this report

This report is about an evaluation of the Workshop Programme commissioned by the Higher Education Academy’s Subject Centre for Geography, Earth and Environmental Sciences and carried out by The KSA Partnership in late 2006 and early 2007.

Details of our process and methodologies are given at the back, but the report focuses on what we did and what we found, and suggests some things the GEES Subject Centre might want to think about.

We would like to thank those who helped us by giving up their valuable time to answer our questions and by talking openly and honestly about the scheme and their experience of using it.

Contents

1.  What we did Page 3

2.  What we learned overall Page 4

3.  What we learned about the workshop programme Page 6

4.  What we learned about effective workshops Page 8

5.  What we learned about the impact of workshop programmes Page 9

6.  Things to think about Page 11

Appendix One: Methodology Page 14

Appendix Two: Impact Model Page 17

Appendix Three: Record of Interviews Page 18

Appendix Four: Overview of the Scheme Page 46


1 What we did

1.  The Higher Education Academy’s GEES Subject Centre wanted to evaluate its workshop programme over the last five years. It wanted to build a better understanding of the impact the programme was having and ensure that it was contributing to the improvement of teaching and learning within the institutions engaged. The intelligence captured would be used to inform future activity and potential programme realignment.

2.  We gathered data in four ways:

a.  Semi-structured telephone interviews with a small sample of project leads. We talked to eight institutions that had made use of the workshop programme (out of the 14 we approached in total). They were selected at random, but we took care to ensure a mix of old universities (three of the six approached), new universities (four of six) and colleges (one of two), and a spread across the country.

b.  Telephone interviews with the workshop facilitators. We spoke to three workshop facilitators to get their perspective on the programme and asked for information about what works and what could be changed to improve it.

c.  Desk review of scheme information. We looked at the range of themes covered by the workshops and the breadth of the institutions involved over the last five years to ascertain the reach of the scheme.

d.  Desk review of an impact study. We reviewed a study conducted in 2002-03 to identify any evidence of impact originating from the departmental workshops and the learning that has been gained.

3.  Evidence was analysed by consultants from KSA and collated into this final report.


2 What we learned overall

What is our overall impression?

4.  The people we spoke to and the information we reviewed lead us to believe that the workshop programme as it currently stands is highly regarded and is having an impact on teaching and learning practice in the departments involved.

5.  All of the people we spoke to had positive things to say; there were some negative comments and suggestions for improvement, but they were hard to come by. This report presents a composite picture of how the workshop programme was described to us.

6.  Workshops often form part of a wider range of activities to support change in departments, but they make a valuable contribution, most noticeably in encouraging people to take time out to think through issues and their solutions. There is evidence of solutions being carried through into subsequent enhancements to teaching and learning practice.

How confident are we about the emerging picture?

7.  Even though we spoke to only a small, focused sample of institutional coordinators and workshop facilitators, there was a remarkable degree of consistency in what they said. In most cases the people we interviewed talked about more than one workshop they had participated in. The composite picture we have put together in this report is based on their experience of more than 30 workshops over a four-year period.

8.  This gives us confidence that the picture described to us is accurate enough to give good information about what is working and what could work better in the future. This report presents a composite picture from the research.


What did we learn about the evaluation process?

9.  In general we found it difficult to make contact with people to engage them in the process. Nearly all interviews took several phone calls and emails to set up. But, once engaged all participants gave their time willingly and spoke openly, honestly and at length about their experiences. We think the semi-structured telephone interviews worked and yielded good information.

10.  There would have been some value in inviting other participants to complete a web-based version of the survey. We doubt this would have significantly changed our findings, but it would have given everyone the opportunity to comment and may have increased the sample size. However, our experience in other similar contexts suggests the response rate would not have improved dramatically.


3 What we learned about the workshop programme

11.  This is a composite picture drawn from 11 interviews with a representative sample of institutions that have had a collective experience of more than 30 GEES workshops.

People know about the workshop programme

12.  The programme is well publicised on the GEES website and through Planet, and the GEES network was identified as a powerful tool in its promotion. Many workshops are repeat requests (in different areas) from satisfied users – in other words the quality of the scheme and positive interaction also helps to promote it.

Workshops are easy to commission

13.  People commented about the ease of access to the programme. The ‘application’ process is straightforward, simple and appropriate.

Workshop topics reflect ‘live’ change programmes in institutions and departments

14.  Many of the people we talked to were requesting workshops that were in tune with their particular areas of interest at a given point in time. In most cases workshops were being commissioned proactively to initiate or form part of an ongoing programme of change or development within their departments.

Workshops are catalysts for action

15.  A key emerging theme was that workshops were more often than not the starting point in a departmental development process, acting as a catalyst for change and innovation. People said it helped them to initiate processes of change informed by external ‘expert’ perspectives. The external independent facilitation was valued – it helped to draw people in to attend (sometimes in greater numbers than internal events) and was useful in gaining perspectives of others. Engaging with the workshop programme has helped departments to move initiatives on by enthusing and informing members of staff.

16.  Workshops typically had high levels of representation from within the department (around 80% coverage), which helped to support department-wide thinking and action. In many cases workshops included a small number of spaces for research students, non-geographers and representatives of other departments/schools.

Interest levels are high

17.  There is a high level of interest and many institutions are serial users. All of the people we talked to said that, based on their experience, they would use the scheme again, especially where the focus of a workshop matches internal departmental developments. Many said it would be one of the first interventions in a process of teaching and learning innovation.

18.  The low direct cost of engaging with the programme was identified as a benefit and gives a high rate of return on investment, bringing an external perspective to an area of interest at very low or no cost.

19.  Institutions seem to like the content, delivery style and contribution made by most workshops.


4 What we learned about effective workshops

20.  We were able to extract from our interviews a shared picture about what makes for an effective workshop:

A good pre-workshop briefing is fundamental to success

21.  People described a process of contact between departments and facilitators before each workshop. This pre-workshop briefing and communication was identified as a vital component in the success of the workshop. Content must be local, relevant and reflect where the department is now and where it wants to be after the workshop.

Practical workshops that accelerate progress work best

22.  Workshop formats with some input and time for discussion are valued more than ‘show and tell’ sessions. Participants value being able to see and draw from case studies of good practice and apply them in their own context. People seem to want practical, interactive sessions that share and develop new ideas and facilitate thinking or forward movement.

Creating time to think is the most welcomed factor

23.  One of the most repeated messages we heard was that taking people out of their day job and creating time to think and consider future actions was the most valuable part of workshops. The added value of GEES workshops is that they have been effective in bringing together high proportions of staff from across a department.

The GEES workshop formula is effective

24.  Free, externally facilitated, practical workshops that reflect departments’ current areas of interest and can bring most people in a department together to talk about issues: this is a highly successful formula for a departmental workshop programme.


5 What we learned about the impact of the workshop programme

25.  We looked for evidence of depth of impact in six areas: the departmental lead (the interviewee); other academic staff in the department; the department or school; the institution; the students; and the wider GEES community.

Impact is mostly localised on the individuals who attend workshops

26.  The level of impact varies from department to department, but we consistently found that workshops had a significant impact on those who attended, with some evidence of impact on departments; less on institutions (in terms of changes to teaching and learning practice) and very little on the wider GEES community (limited to sharing some information through articles and conference papers). While wider impact may have taken place, those interviewed had little or no tangible evidence to support their assessment of it.

Workshops support changes to teaching and learning practice

27.  We found many examples of changes to teaching and learning practice subsequently being implemented by those who attended the workshops. Examples include:


Workshops are catalysts for further work

28.  We found some examples of additional pedagogic research being undertaken by participants stimulated by what they had spent time considering at GEES workshops.

Some evidence of improvement to the student learning experience

29.  We found some evidence, mostly anecdotal, that changes to teaching and learning practice implemented after GEES workshops were improving the quality of the student learning experience. Participants pointed to positive feedback from students but we did not find any evidence to support improvements in progression and achievement. The transient nature of the student body makes assessments of this form problematic.

Workshops support benchmarking and self evaluation

30.  Because GEES workshops draw a high proportion of departmental staff together and use external facilitators and case study examples, an important by-product is the opportunity that the workshop creates for benchmarking and evaluating practice. This process was seen to be an integral part of the improvement process and led to or informed many of the changes in practice.


6 Things to think about

31.  Our overall impression of the departmental workshop scheme is that it is highly regarded, works well and is having a positive impact on those engaged. The scheme is without doubt adding value to the Subject Centre’s offer and is contributing to Strategic Aims 1 and 2, not least by stimulating an active dialogue amongst academic colleagues and sharing what the GEES community knows about the nature of effective practice.

32.  Instead of presenting formal recommendations we have suggested a number of things the GEES Subject Centre may wish to think about based on the evidence we have collected from people who have used its departmental workshop programme.

33.  In deciding how best to move forward, the GEES Subject Centre will need to consider each area in the light of its experience in running the departmental workshop scheme and the process it has established to facilitate the scheme. An overview of the scheme and reflections on it have been provided by the GEES Subject Centre in Appendix Four.

Refreshing the subject focus of the workshops

1.  Are the subject areas of the workshops appropriately aligned with future GEES developments and departmental areas of interest?

2.  Does the programme still offer sufficient choice to ‘high level’ (serial) users?

34.  We think the workshop programme is doing what it set out to but we don’t know how well aligned it is with future developments in the area of GEES teaching initiatives. Its value (and usage) is dependent on it staying in tune with the needs and areas of interest of departments and their wider institutional context. Engaging with departments on a regular basis may help this process and may also help to identify if some departments are being ‘excluded’ because areas they are interested in don’t feature in the current programme.

Embedding the learning and deepening the impact

3.  Could a new level of good practice sharing add value?

4.  Are the findings captured by facilitators shared more widely?

5.  Would a ‘community of practice’ support innovation and sharing of practice?

6.  Is the level of pre-workshop communication appropriate?

35.  A reported inherent quality of the workshop programme has been the sharing of good practice between departments focusing on a common area of interest. The momentum created by the workshops has been reported in many cases to be invaluable. However, engagement by the GEES Subject Centre post-workshop to encourage development work and implementation activity could add value by maintaining this momentum, whilst generating a new level of intelligence to inform future workshop content. Is there an opportunity to bring facilitators together at CPD events to capture learning and examples of effective practice or share common findings or issues across a number of workshops? It was suggested that the annual conference could be used to facilitate this process.