Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan

JISC Assessment and Feedback Programme Strand B

Project Information
Project Title (and acronym) / The evaluation of Assessment Diaries and GradeMark at the University of Glamorgan
Start Date / 1 September 2011 / End Date / 30 August 2012
Lead Institution / University of Glamorgan
Partner Institutions / N/A
Project Director / Haydn Blackey
Project Manager & contact details / Dr. Karen Fitzgibbon

Project website /
Project blog/Twitter ID /
Design Studio home page /
Programme Name / Assessment and Feedback Programme – Strand B
Programme Manager / Lisa Gray

Acknowledgements

The project team wish to acknowledge the contributions of students and staff at the University of Glamorgan. Grateful thanks are also given to the JISC programme team for their guidance and support during the project lifecycle.

Table of Contents

1. Executive summary

1.1 Brief description of project

1.2 Evaluation questions and purpose of evaluation

1.3 Brief summary of evaluation methodology

1.4 Summary of main findings, and implications of these

1.5 Summary of recommendations

2. Background and context

2.1 Purpose of the evaluation and core evaluation questions

2.2 Description of the project and its context

2.3 Target population and relevant stakeholders involved in the project

2.4 Related work/studies

3. Evaluation approach

3.1 Evaluation framework or overview

3.2 Data collection and analysis

3.3 Evaluation limitations

4. Evaluation findings

4.1 Student and Staff Expectations of Assessment Diaries

4.2 Student and Staff Experiences of Assessment Diaries

4.3 Student and Staff Expectations of GradeMark

4.4 Student and Staff Experiences of GradeMark

4.5 Staff expectations and experiences of the tools in relation to workload

4.6 Benefits for students with specific learning requirements

4.7 Project Dissemination

5. Good Practice

5.1 Good Practice in Assessment Diaries

5.2 Good Practice in GradeMark

6. Conclusions

7. Recommendations

7.1 Assessment Diaries

7.2 GradeMark

8. References

9. Appendices

Appendix 9.1 Workflow diagram for administrative staff Assessment Diaries experience

Appendix 9.2 Workflow diagram for GradeMark (BEFORE introduction of GradeMark)

Appendix 9.3 Workflow diagram for GradeMark (AFTER introduction of GradeMark)

Appendix 9.4 Screen shot of Assessment Diaries

Appendix 9.5 Screen shot of GradeMark

1. Executive summary

1.1 Brief description of project

This project evaluated two assessment- and feedback-related innovations currently being used in two faculties at the University of Glamorgan (Advanced Technology and Business & Society), aimed at addressing assessment bunching and poor feedback to students. The two innovations are 1) Assessment Diaries and 2) GradeMark.

1.2 Evaluation questions and purpose of evaluation

Assessment and feedback practice in Higher Education in the UK has long been a major source of student dissatisfaction (National Student Surveys). While technologies are increasingly being used as tools to improve the assessment experience for students and staff, their use remains patchy.

Evaluation questions:

  • How the tools impact on the student experience, from the student perspective
  • How the tools impact on the academic staff experience, including pedagogy and workload

1.3 Brief summary of evaluation methodology

The evaluation approach was informed by an action research methodology using a ‘look, think, act cycle’ (Stringer, 2007). The look stage of the project is equivalent to the baseline activity associated with identifying current practice. The think stage of the project is related to the data gathering, interviewing and analysis. The act stage of the cycle is related to the recommendations emerging from the earlier part of the action research cycle. In a true action research approach, the current project will continue to inform further ‘look, think, act’ cycles. This means that the recommendations being put into practice as a result of the current project will lead to further action research enquiry.

Two online questionnaires, focus groups and interviews were used to explore student and staff perspectives and experience of using the two tools.

1.4 Summary of main findings, and implications of these

The key finding from the project is that both staff and students agreed that the tools, when used consistently, can improve the assessment and feedback experience.

The Assessment Diaries were found to have met their aims of reducing bunching for both staff and students and of providing feedback deadlines in the majority of cases. For the diaries to continue to meet their purpose, they need to contain accurate information and be used consistently for all assessment deadlines. In particular, the Assessment Diaries improved students' access to assessment deadlines and dates. Students participating in the project reported that this helped with their time and task management. Similarly, the Assessment Diaries also helped academic staff to manage the workload associated with setting assessments and coordinating marking deadlines. Administrative staff commented on how the Assessment Diaries streamlined the entire assessment and feedback handling process.

The use of GradeMark is seen by most students and staff as a useful and stable vehicle through which to provide learner feedback. Students clearly see the benefits online feedback can provide, in particular easy access to feedback. However, this change in accessibility does not instantly change the established poor practice and attitudes some students show in engaging with feedback for their own learning. It should also be acknowledged that staff wished to explore a range of online feedback tools, of which GradeMark is only one option. Students described the benefits as easy access to securely stored assessment and feedback comments that are readily available for use in their next assignments. Perhaps the most important aspect of the survey data was the realisation that students were fairly open in their expectations of online feedback in comparison with other forms of feedback; it was their actual experience of online feedback which shaped their preference for how they would like to receive feedback in the future. It was interesting to see that the improvements students suggested were largely pedagogical, concerning the quality of feedback and their engagement with that feedback.

1.5 Summary of recommendations

Recommendations for further development of Assessment Diaries

  • Improve the flexibility of the current system to enable multiple assessment deadlines for different tutorial groups, and single assessments comprising incremental tasks
  • Provide staff with the opportunity to amend existing assessment information
  • Establish links between the Assessment Diaries, Turnitin and Blackboard to facilitate single entry of information
  • Ensure that assessment deadlines are created through collaborative dialogue and not individually determined, to maintain a programme-level overview
  • Develop an overview of all assessment deadlines for HoLTs, Heads of Division/Department and administrative staff
  • Raise awareness of the need for complete and accurate information for every assessment, i.e. date of submission, title of assessment and type of assessment
  • Raise student awareness of the purpose of the Assessment Diaries
  • Enhance the automated reminders to students to include module code, module title and assessment title; review the frequency of reminders; personalise reminders so that the system stops sending them to students who have already submitted; and develop automated alerts to inform students when feedback is made available
  • Establish a grace period of one month from the release of the Assessment Diary for a course, to enable staff to input assessment and feedback deadlines; after one month, activate staff email reminders where assessment and feedback deadlines are missing

Recommendations for further development in the application of GradeMark

  • Establish links through Blackboard between GradeMark and the student record system (Quercus Plus) to avoid duplication of mark entry
  • Raise awareness of the student view of marked work so that students understand the functionality associated with GradeMark
  • If tutors are unable to give feedback on the date specified, inform students about the delay
  • Raise staff awareness of the time saved across the entire feedback process, i.e. not having to physically collect and hand back work, and of the use of Firefox as the primary browser to maximise system speed

2. Background and context

2.1 Purpose of the evaluation and core evaluation questions

Assessment and feedback practice in Higher Education in the UK has long been a major source of student dissatisfaction (National Student Surveys). While technologies are increasingly being used as tools to improve the assessment experience for students and staff, their use remains patchy. In response to student satisfaction survey results and findings from a Change Academy project, the University designed and developed an Assessment Diary to capture submission and feedback dates. Further, the University encouraged the use of online marking through GradeMark. (These technologies are explained further in section 2.2.)

The purpose of the JISC assessment and feedback project was to identify the extent to which the tools had an impact on the assessment and feedback experience for students and staff with a focus on two faculties within the University.

The core evaluation questions were as follows:

  • What are staff experiences and perceptions of the use of Assessment Diaries and GradeMark?
  • What are student experiences and perceptions of the use of Assessment Diaries and GradeMark?

Having established the core evaluation questions, the project team determined the following topics to be explored further:

  • whether Assessment Diaries and GradeMark open up a dialogical space for staff and students (Wegerif, 2007)
  • whether the tools meet student and staff expectations in relation to pedagogy and workload
  • whether the tools address the issues highlighted in the baseline data, including bunching and poor feedback

2.2 Description of the project and its context

The University of Glamorgan has long identified, as part of its strategic vision, the key role assessment plays in improving the student learning experience. Technology has played a significant part in developing and implementing a revised approach to curriculum design at the University by placing assessment at the heart of learning. Two major assessment and feedback innovations, Assessment Diaries and GradeMark, were introduced as part of the institution-wide Change Academy Project, "Putting assessment at the heart of learning", supported by the Higher Education Academy and the Leadership Foundation in 2009-2010. The tools have since been gradually introduced and implemented across the institution. As is the case with many innovations in Technology Enhanced Learning (TEL), these tools were interpreted and adopted at different levels across the University.

The tools were aimed at addressing assessment bunching, the communication of assessment and feedback dates, and the timeliness and quality of feedback to students. Assessment bunching is a term used to describe what happens when several assessment deadlines fall on the same date. For example, a student studying six modules with varying assessment tasks finds that four of the modules have submission deadlines on the same date. The issues associated with assessment bunching are commonly identified as less time being spent on each of the assessments, thus affecting the quality of the submissions, and lower attendance in lectures and seminars while students concentrate on the multiple assessments to be submitted. These issues have been frequently highlighted in the literature as major challenges to improving student and staff assessment and learning experiences: the NUS (National Union of Students), Bloxham and Boyd (2007) and the REAP (Re-engineering Assessment Practices) project have all stressed the need for time on task and for speedy, detailed feedback. In addition, assessment bunching was identified as a key issue within the University by the Change Academy project. The University of Glamorgan project team believes that the technology makes a significant contribution to enabling and enhancing dialogue among and between students and staff.

The Assessment Diary

The Assessment Diary is essentially a simple list of module codes and titles, with dates for assessment submission and return of feedback. This information is posted on the institution's VLE, Blackboard. The diary uses an in-house web-based front end provided within Blackboard and is personalised, giving students and tutors clear, easily accessible information about when assignments are to be submitted and returned. Automated reminders are sent to students two weeks before submission, one week before, and on the day of the submission deadline. Automated reminders are also sent to staff.
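The reminder schedule described above can be sketched in code as follows. This is an illustrative sketch only: the function names, the module codes and the data structures are hypothetical and are not taken from the Glamorgan system.

```python
from datetime import date, timedelta

# The schedule described in the text: reminders go out two weeks before,
# one week before, and on the day of each submission deadline.
REMINDER_OFFSETS = [timedelta(days=14), timedelta(days=7), timedelta(days=0)]

def reminder_dates(deadline: date) -> list[date]:
    """Return the dates on which a student reminder would be sent."""
    return [deadline - offset for offset in REMINDER_OFFSETS]

def reminders_due_today(deadlines: dict[str, date], today: date) -> list[str]:
    """Return the modules whose deadlines trigger a reminder today."""
    return [module for module, deadline in deadlines.items()
            if today in reminder_dates(deadline)]
```

For example, a deadline of 14 May would generate reminders on 30 April, 7 May and 14 May; a nightly job could call `reminders_due_today` to decide which emails to send.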

GradeMark

GradeMark is an online marking tool that is part of the Turnitin plagiarism software. The project team view GradeMark as more than a tool to improve the quality of feedback: used effectively, it can significantly improve student and staff assessment experiences by better managing the feedback process and offering students and staff a shared vehicle through which to discuss assessment and feedback.

For details of how the Assessment Diaries and GradeMark led to examples of good practice in technology-enhanced assessment, see Section 5.

2.3 Target population and relevant stakeholders involved in the project

The table below illustrates the involvement that key stakeholder groups have had in the project.

Table 1: Stakeholder involvement in the project

Stakeholder / Involvement
Students / Active participants sharing their expectations and experiences of the tools in relation to assessment and feedback.
Academic staff / Active participants sharing their expectations and experiences of the tools in relation to workload and pedagogy.
Technical staff and staff developer / Active participants sharing their expectations and experiences of the tools.
Administrative staff / Active participants sharing their expectations and experiences of the tools in relation to workload.
Project team and Steering group / Planning, organising, analysing, evaluating and disseminating the project.

2.4 Related work/studies

The project team identified several related projects which have helped inform the background and rationale for this project.

  • Gwella Programme - Enhancing Learning and Teaching through Technology in Wales

The Gwella Programme report provides background information about technology enhanced learning in Wales and the role Glamorgan played in the project.

  • Change Academy Project – Putting assessment at the heart of learning – University of Glamorgan (supported by the Higher Education Academy and the Leadership Foundation)

The Change Academy project aimed to encourage academic staff to pilot innovative assessment in line with assessment for learning principles. The use of technology was not a predetermined outcome for the Change Academy project, but the project found that a number of technologies have played a key role in the drive towards greater innovation, effectiveness and efficiency in assessment practice. In particular, the assessment diary was first piloted in one department as part of the Change Academy project and due to its success, it was rolled out across the University.

The evaluation of the Assessment Diaries in the JISC Assessment and Feedback Project provided much insight into the benefits Assessment Diaries have for staff and students' assessment experiences. It enabled the University to improve the current Assessment Diaries through specific developments identified by staff and students.

  • Turnitin or Turnitoff Project– University of Glamorgan

The Turnitin or Turnitoff Project was a small-scale project that looked at student and staff experiences of Turnitin and GradeMark. The overall finding was that students and staff are generally positive about the use of Turnitin and GradeMark. The project also identified some technical and pedagogical difficulties, such as the initial learning curve experienced by staff, the poor stability of the platform, and confusion over terminology.

The evaluation of GradeMark in the JISC Assessment and Feedback Project enabled us to investigate further whether some of the issues identified in the Turnitin or Turnitoff project persisted. It is clear from the JISC Assessment and Feedback Project findings that the platform has developed considerably since the Turnitin or Turnitoff project and that staff have become more familiar with the tool. The evaluation of GradeMark identified new issues, which are detailed in section 4. However, it is heartening to find that staff and students remain positive overall about the use of GradeMark. In addition, the evaluation of GradeMark also looks at the experiences of non-users; this too is detailed in section 4.

  • Ebeam Project - University of Huddersfield

The Ebeam project is also funded under the JISC Assessment and Feedback programme. It likewise looks at the use of GradeMark, and there are similarities in findings that the two project teams look forward to discussing further in a collaborative exploration of assessment and feedback technologies.

The following projects helped to inform the project methodologies and evaluation approach:

  • JISC Learner Experiences of e-Learning programme

The programme, with its focus on the learner voice, informed our methodologies and evaluation approach, in particular our student interviews and focus groups — for example, the elicitation techniques developed specifically for talking to learners in interviews.

  • Oxford University Cascade project

The Cascade project provided background insight into the way workload issues can be examined and presented in the project. It was useful to see how the Cascade project broke down the different processes in its online assessment handling system and estimated the time for each one.

3. Evaluation approach

The evaluation approach was informed by an action research methodology using a ‘look, think, act cycle’ (Stringer, 2007). The look stage of the project is equivalent to the baseline activity associated with identifying current practice. The think stage of the project is related to the data gathering, interviewing and analysis. The act stage of the cycle is related to the recommendations emerging from the earlier part of the action research cycle. In a true action research approach, the current project will continue to inform further ‘look, think, act’ cycles. This means that the recommendations being put into practice as a result of the current project will lead to further action research enquiry.