Assessment Careers

Pilot Report

Masters in Educational and Social Research (MRes)

Author(s): Dr Will Gibson

October 2013

Institute of Education, London

JISC Assessment and Feedback Strand A Project

Pilot Project Aims and Objectives

The five pilot projects each aim to explore the potential and practicality of a longitudinal (assessment careers) approach to assessment, with the following objectives:

a) promote a longitudinal approach to writing feedback (tutors)

b) promote a longitudinal approach to acting on feedback (students)

c) encourage reflection and dialogue over student progress, both within and between modules

d) develop a set of Assessment Career principles to be used to scale up and embed the longitudinal approach to feedback

e) assess the role of technology in meeting the above objectives.

The intervention made to achieve these aims differs in each pilot. However, all pilots use the same action research methodology and evaluation questions, and undertake the same evaluation process with the same tools, to ensure that findings are generalisable at least across the IOE.

  1. Summary

The pilot involved planning, implementing and testing new modes of feedback for students’ formative and summative work. The aims were to create forms of engagement between students and tutors that facilitated a closer dialogue in the feedback process. In particular, the project sought (a) to enable students to articulate any difficulties that they had in producing their work (e.g. confusion over concepts, problems of interpreting published work, difficulties in producing an essay structure) and for tutors to address these in their feedback; (b) to encourage tutors to produce forward-feeding feedback that helped students to identify areas they could work on in future modules of study; and (c) to give students the opportunity to comment on the tutors’ feedback in order to develop the tutors’ reflective practice. The research team designed cover sheets for students’ formative and summative submissions. Tutors were encouraged to produce comments that engaged closely with these student reflections. The pilot was assessed through interviews with the tutors, focus groups with participating students, and interviews with students who could not attend the focus groups. In total 12 students participated in the focus groups/interviews, and 2 tutors were interviewed.

  2. What are the headline achievements of your pilot?
  • The creation of a close dialogue between tutors and students over the production of assessed coursework
  • Organising feedback so that students had the opportunity to articulate their experiences both of producing the work and of working with the tutor feedback
  • Creating developmental feedback processes that helped students to see their work as part of a ‘learning career’
  3. What were the key drivers for undertaking the pilot?

The context for this pilot included a number of important factors. Firstly, the course being examined here is an entirely online provision, with students participating from all over the world and no face-to-face contact. One of the dominant modes of assessment for this course is examination in regional assessment centres. The course uses a number of innovative approaches to create a close relationship with students, including regular interaction through real-time conferencing facilities and working together through online collaborative activities. However, there remains a concern that the distance nature of the programme creates more of a social gap than can exist on face-to-face programmes. One of the key drivers in organising this pilot, then, was to generate a closer dialogue over the students’ learning experiences at the point of assessment.

One of the specific issues that students have commented on in course feedback is their uncertainty and nervousness about assessment. Most of the students are professionals working in fields related to research and education, and many of them are undertaking the MRes as a move to return to studies, frequently after a long break. As such, there is often significant anxiety around assessment and the expectations of the tutors and the institution more generally. A very particular motivation for this study was to try to make the assessment criteria as clear as possible by actively engaging students in dialogue about feedback and relating the criteria to the students’ own work.

The MRes uses Moodle as its main learning context, with regular classes held through Blackboard Collaborate. The general model of the programme is to have a relatively simple technological architecture; to date, the programme has not made use of social media, e-portfolios or other add-ons, largely because these create much more complex technological issues that can alienate learners and create problems of accessibility. Our aim was to achieve the closeness of dialogue without adding technological requirements; we wanted to avoid using systems such as external e-portfolios or blogs, even though these would clearly enable the kinds of discursive interaction we were looking for.

  4. What was the educational/organisational context in which you undertook your pilot?

The Approaches to Educational Research module assessed in the pilot is part of the Online MRes. The module is one of six core modules that students typically take towards the beginning of the programme. On average, it has between 20 and 25 students and two tutors. The module is assessed via a coursework essay (20% of the mark) and an end-of-year exam (80% of the mark). The MRes programme is a cross-institutional provision delivered through the IOE’s Doctoral School in collaboration with the London International Programmes (LIP). LIP provides distance learning to thousands of students across the world, collaborating with colleges across the University of London. The MRes is a relatively small provision, with 80 active students and typical annual cohorts of around 23 students; it is a part-time M-level programme that students take a minimum of two and a maximum of five years to complete. The tutors for the programme are all academics within the IOE.

The innovations produced in this pilot have relevance to the International Programmes, to the Doctoral School, and to other M-level distance learning provisions within the Institute. Clearly the alteration of feedback processes has the closest impact on the tutors themselves, as their working processes are affected. However, as this programme is part of the Doctoral School and LIP provisions, any pedagogic adjustments have implications for the consistency of student experience within these broader institutional contexts. Our intention is that the experiences of the tutors in this project will feed into these wider working environments and inform debates about feedback processes within them.

  5. What was the technology context?

Given our aim of creating technologically simple solutions, the project team decided to create dialogue that could easily be integrated into the existing feedback processes. Students upload their work to Turnitin dropboxes for both their formative and summative assignments. The pilot involved the design of cover sheets that students would add to the front of their submission. These were provided as Word documents within the VLE that students could download and copy onto their documents. The tutors’ own comments were added into the comments box within Turnitin, as per the existing feedback processes. It was felt that these solutions would enable us to keep all assessment discussions contained within the VLE and, more specifically, in the assessment dropboxes, rather than having them distributed across different parts of the VLE or across different media or technologies (such as email or e-portfolio). This was important because (a) it kept the dialogue within a secure, password-protected environment, (b) it reduced the burden of work on tutors, as all of the information they needed regarding students’ work was in the same place, meaning that they did not have to do anything different when marking essays, and (c) it meant that students would not have to radically change their processes in order to engage in the project.

  6. How did you approach the pilot?

The main participants in the pilot were the two tutors in the module being evaluated, with input from the broader MRes teaching team, four of whom contributed to the design of this project. We also consulted with a learning technologist in the IOE. Following initial consultations, the tutors quickly settled on the use of student feedback response forms, ruling out more complex technologies for the reasons discussed above. The student feedback response forms aimed to record the students’ experiences with feedback in other contexts, to help them reflect on whether further work was required for their essays, and to give them the opportunity to ask direct questions of the tutors. An initial plan for the cover sheets was produced and presented at one of the termly programme team meetings. The feedback given there was used to create a second iteration of the form, which formed the basis of the pilot. Through this consultation process the form became much simpler than envisioned in the original design. The initial form included four sections for the students to add information to, but the team felt that this was far too complex and might put students off participating. The final forms included only two sections each, comprising open text boxes for the students to type into, with very brief instructions on the types of things that they might wish to include within the different sections. The final version of the forms can be found in Appendix 1.

The feedback analysis tool was a useful guide in the design process, as it helped to foreground the types of engagement that tutors could make with the students’ work. We discussed the possibility of including structured guidelines for tutors in their feedback, but following consultation with the wider MRes team it was felt that overly didactic or complicated instructions could make feedback too cumbersome for the tutors. As such, we opted to use the existing feedback mechanisms (i.e. posting comments into the relevant boxes in Turnitin) but simply to ask the tutors to refer to the students’ cover sheet reflections in their feedback.

The students were informed of the project and the changes to the feedback mechanisms via announcements on the VLE. They were encouraged to contact the course/pilot leader if they wished to discuss any component of the intervention and changes. The intervention occurred in the spring term of 2013, with 20 students uploading their formative reports and cover sheets in early January and their final assessments in early February. All of the feedback and marking for the module was completed by the end of February. The timing of the focus groups took into account other assessment deadlines that the students faced, and they were arranged for April and May of the same academic year. Because some students were unable to attend the focus groups, four individual interviews were also arranged. A total of 12 of the 24 module participants were either interviewed or took part in focus groups.

  7. What benefits has your pilot delivered and who are the beneficiaries?

The students who were involved in the focus groups reported that, prior to the MRes – which, for nearly all of the participants, was their first experience of postgraduate study – they had received only very minimal summative feedback on their work. As one of the participants put it “The feedback is what you [got] when you have done the work. It tells us what is good and bad and what you still have to do.” (S1). For some of our students, there were quite clear culturally specific experiences of feedback.

for us, and by us I mean people from Arabic countries, feedback is always about the professor giving you their voice. It is a very traditional form of education – the professor gives you the correct interpretation of how something should be done. That is what it is there for and that is how we use it. Students treat the feedback as the truth. The truth they need to learn. (S4)

What we see then is that, as with so many other areas of learning, students’ prior experiences of feedback impact on their expectations of what feedback is for and how it should be undertaken. In itself, this is a useful reminder of the variation in students and the cultural distinctiveness of some of the pedagogic assumptions that underpin this project.

The central question that this project aimed to address was how we can improve feedback processes and the nexus of tutor-student discussion within them. There were some students who saw feedback as a very distinctive part of learning, which involves a different kind of dialogue to that in the rest of the course: “There is a difference between learning and feedback. The feedback is on the essay. It is about what you write and how you are tested. How they know your level of intelligence.” (S10) A more common view among our students, though, was to see a much more blurred line between feedback and other aspects of the learning process: “I think it is all part of the same process. That you learn through doing the essay just as you learn through doing the course” (S9). In this latter view, some students actually questioned the notion that feedback might be a different kind of process: “Feedback can be more than just a kind of power thing [between student and tutor] […] I wonder even if the word feedback is the best. Maybe something like reflective moment is better. […] The feedback is like kind of constant. You are always getting input and comments” (S2). Again, an important point to take from this is that there is not necessarily uniformity in the sense of what feedback is for, how it can be used, and how it relates to the learning process. An implication, then, might be that there is some work to do in creating reflexivity among the students about these issues.

In terms of the cover sheets that we employed in the study, students did comment that they saw a value in them as part of the feedback processes: “It is good to have the chance to ask these questions [on the cover sheets] and to find out what we can do to improve our work” (S9); “I like being able to ask the tutor quite directly the area that need work and [what] to focus on” (S3); “I think now, I don’t know how I ever did without them really. How could you do without that” (S7). Other students, however, felt that while the ideal behind the project was useful, the cover sheets may not be the best way to achieve the goals: “I am not so sure that having these cover sheets is the best way of having this dialogue […] It is a bit limited. I think Skype sessions would be better. A tutorial. That way, you can really have dialogue, ask questions, do it interactively and stuff” (S2).

In terms of the cover sheet used for submitting formative work, some students reported that they didn’t really know how to use it: “I can remember really not being sure what to write because, you know, it is like a bit of a guessing game when you are like, I just want to know everything about what the tutor thinks and I don’t know what to ask because I just want feedback” (S4). Another student commented that the feedback form was produced as an ‘afterthought’, and that this was perhaps not the best way to capture the worries and issues they had while writing the essay:

I didn’t look at it until I had written the draft and was just about to submit, so I didn’t spend time like thinking in detail when writing about what to ask the tutor. If I had, maybe I would have written more, so it was more like, at that moment of submitting, what am I thinking, and I am not sure I could remember all the issues, so I just wrote what came to mind. (S6)

There is some suggestion, then, that the cover sheets may be somewhat imperfect mechanisms for reflection when they are used at the point of submission. This raises the question of whether an alternative dialogue mechanism that invites students to raise such questions while they are writing might be a better tool for generating reflection. From a tutor perspective, this would be quite problematic, as holding such a discussion would involve an added workload. In a context where workload management systems are used to monitor working practices, and where the hours allocated for teaching are increasingly squeezed, stretching the time allocated for feedback is potentially very problematic. That said, there is clearly potential for re-thinking the ways that feedback is provided, and it may be that tutorial systems are a more effective way to manage that if they act as a replacement for, rather than an addition to, written feedback. However, the frameworks of learning and teaching quality specified in Higher Education Institutions’ regulations would normally prohibit the removal of written feedback.

More generally, though, there seemed to be a division between students who found the engagement offered by the forms useful and those who were unsure. Some students did say that they found the forms important for gaining a different perspective on their work: “It is good to have that sort of reflection on your work and to think about it from the tutors’ perspective” (S10). In this respect, the process of asking questions about how the work will develop and about the problems faced may have helped the students to gain a new angle on their work. Student 3 also said that he liked this form of engagement: