MMU TRAFFIC project Institutional Story
Project Information

Project Title (and acronym): TRansforming Assessment + Feedback For Institutional Change (TRAFFIC)
Start Date: 1 Sept 2011
End Date: 31 Aug 2014
Lead Institution: Manchester Metropolitan University (MMU)
Partner Institutions: N/a
Project Director: Prof Mark Stubbs
Project Manager & contact details: Bruce Levitan
Project website:
Project blog/Twitter ID: @mmutraffic
Design Studio home page:
Programme Name: Assessment & Feedback Programme – Strand A (5/11)
Programme Manager: Lisa Gray

1 Summary

The TRAFFIC project aimed to carry out a thorough review of policies, procedures and practice relating to assessment at Manchester Metropolitan University.

The project team used a mixed-methods approach to identify current practice in a baseline report, which was widely circulated and accepted across the institution. This led to a series of mini-projects, each tackling an assessment development priority.

The key technical achievement of the project has been the provision of personalised information about submissions and marks, extracted from the student record system and delivered through the institutional VLE, which has greatly improved the information available to students about assessment requirements.

At the same time, key policy documents have been rewritten for clarity and to remove any confusion between two distinct concerns: the requirements of the institutional framework, which must support effective processes and the maintenance of academic standards; and decision-making about academic issues, such as choices of assignment type and size and feedback strategy, which needs to remain with programme teams.

2 What are the headline achievements of your project?

  1. Embedded consideration of assessment processes in key institutional documents and projects, such as the revised MMU strategy for learning, teaching and assessment and the Enhancing Quality of Assessment for Learning (EQAL) project.
  2. Published a variety of assessment information, such as personalised deadlines, feedback return dates and provisional marks, to all students via the portal, VLE and mobile app.
  3. Supported an institution-wide dialogue on improving assessment and feedback that coincided with improved NSS scores.
  4. Made ‘top-down’ decisions on assessment patterns relevant to different stakeholders, by listening to their concerns and by providing targeted guidance.
  5. Collected and used qualitative and quantitative data to influence decision-making.
  6. Reviewed the institutional code of practice on assessment and the associated procedures for assignment briefs, marking, moderation and feedback to students.

3 What were the key drivers and assessment and feedback context for your project?

3.1 Background

Manchester Metropolitan University is part-way through an ambitious programme of change in curriculum structures and support affecting its entire undergraduate provision, called EQAL (Enhancing Quality of Assessment for Learning). The TRAFFIC project was driven by a need to review assessment and feedback policies, processes and support to ensure they were aligned with institutional goals of enhancing student satisfaction and success.

As part of the undergraduate curriculum change programme, Academic Board decided to standardise the credit size of modules (increasing it from 10 or 20 to 30 credits) and to limit the number of summative assignment tasks per module to two. This decision was taken in response to continued student feedback that there were too many assessment points: up to 20 in a year for some students. The changes capped the number of summative assignments, including examinations, at eight per student per year: with a standard full-time load of 120 credits, a student takes four 30-credit modules, each with at most two tasks. Such constraints had never before been applied to curriculum design, and implementation of the policy caused a great deal of anxiety among academic staff. Some of the background to this is explored in the report ‘In the Throes of Change’ (Smith and Lessner, 2011); see the Design Studio website for the full project report.

The TRAFFIC project began in 2011/12 with a baseline report, which combined qualitative and quantitative data to identify the project needs and to set out priorities. The report was based on a review of regulations, assessment statistics and existing information such as outcomes from student surveys, as well as interviews with a cross-section of staff across the institution. It found the following four areas for development:

  1. There is inconsistency in the ways in which information about assignment tasks, submission, feedback and moderation is recorded and communicated to students. A consistent template for assignment briefs is needed, supported by clear guidance on assignment task design and size, on developing appropriate assessment criteria, and on best practice in feedback and moderation for different types of task. The recording of information about assignment briefs needs to be incorporated into unit specifications.
  2. There is inconsistent practice with regard to the use and recording of assessment criteria. Clearer guidance on the use of grade descriptors and assessment criteria is needed.
  3. There is some concern over the possible clustering of assignment deadlines; there is a need to determine the impact of this on individual students, to consider ways to integrate this into Continuous Monitoring Plans and to provide programme teams with more guidance on effective planning of assignment deadlines within the new undergraduate curriculum framework.
  4. The administration of assignment submission and return to students has been considerably improved in the last twelve months but the system is limited to paper submissions. The system needs to be further developed to include electronic submission and extended to record moderation activity.

These issues focused mainly on processes and policies. Alongside them sat the EQAL goal of timely, personalised information, which underpinned promises made to all students in the MMU Commitment and set specific expectations that:

  • Feedback on submitted work for assessment will be provided within four weeks
  • Timely information will be provided on assessment criteria and examination arrangements

Institutional resources were allocated to enhance the institutional VLE, Moodle, in support of this promise, and to integrate online assessment management so as to make progress on development areas 1, 2 and 4. This work was originally planned to follow on from the evaluation of the Exeter OCM system developed as part of the JISC Course Design and Delivery programme.

A list of project priorities was identified in the baseline report and is included as Appendix 1.

The baseline report was crucial in establishing the importance of the TRAFFIC project. It was widely disseminated and referred to across the institution and led to the project being embedded within the wider EQAL initiative.

3.2 Institutional Context

MMU is a large, diverse university with over 37,000 students. It is organised into eight mainly discipline-based faculties. As of summer 2013, these faculties were spread across five sites, but a major construction and campus consolidation programme is under way to reduce the number of campuses to two by summer 2014. In conjunction with this significant investment in its estate, the university has rewritten its undergraduate curriculum and introduced new business processes and technology designed to enhance the student experience.

Assessment has an impact on almost everyone who works or studies in the institution, so the stakeholder group for this project is very large: all students and their representatives, administrative staff who support students and those who manage submissions, examination boards or quality, staff who provide study skills or disability support, and academic staff (>1,500).

Before the curriculum review mentioned in section 3.1, the university was handling around 620,000 pieces of coursework annually. In 2012/13, this was reduced to around 400,000.

3.2.1 Specification of Assignments

As part of the EQAL project, an online proforma was designed to capture new unit specifications and store the standardised descriptions in a database. All undergraduate units are now included in this database, and the records include information about the type and weighting of the assessment and the skills students are likely to demonstrate in completing the work. This information is linked directly to the student records system (SRS) in order to give a basic picture of the requirements for each unit. For other qualifications, assessment type and weighting information is added manually to the SRS. At MMU, pieces of summative assessment are known as Assessment Elements. Each is identified by a number indicating its order within the assessment diet, an abbreviated description of the type of assessment, and a figure denoting its weighted contribution to the overall mark for the unit. For instance, the Assessment Element identifier 1CWK40 denotes the first piece of coursework, worth 40% of the overall marks for the unit.
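To make the convention concrete, here is a minimal sketch of a parser for identifiers in this format; the regular expression and class names are our assumptions, and only 1CWK40 appears in the text above.

```python
import re
from dataclasses import dataclass

# Hypothetical parser for Assessment Element identifiers such as "1CWK40":
# a sequence number, an abbreviated assessment type, and a percentage weighting.
ELEMENT_PATTERN = re.compile(r"^(?P<seq>\d+)(?P<kind>[A-Z]+)(?P<weight>\d{1,3})$")

@dataclass
class AssessmentElement:
    sequence: int  # order within the unit's assessment diet
    kind: str      # abbreviated type, e.g. "CWK" for coursework
    weight: int    # weighted contribution (%) to the overall unit mark

def parse_element(identifier: str) -> AssessmentElement:
    match = ELEMENT_PATTERN.match(identifier)
    if match is None:
        raise ValueError(f"Not a valid Assessment Element identifier: {identifier!r}")
    return AssessmentElement(
        sequence=int(match.group("seq")),
        kind=match.group("kind"),
        weight=int(match.group("weight")),
    )

# The example from the text: first piece of coursework, worth 40% of the unit mark.
print(parse_element("1CWK40"))  # AssessmentElement(sequence=1, kind='CWK', weight=40)
```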

3.2.2 Assignment Briefs

At the beginning of this project, there were no particular institutional expectations about the content and format of assignment briefs. Most programme teams had a standardised approach across the programme, and some faculties may have used a standard system, but there were no specific requirements.

3.2.3 Coursework Receipting System

Assessment submission has historically been managed using systems specific to each faculty. In September 2011, a system that had been developed in the Humanities, Languages and Social Sciences faculty to log and track submissions was rolled out across the whole university. Known as the Coursework Receipting System (CRS), it took data on coursework assessment elements from the Student Records System and allowed students to download a coversheet containing a unique barcode for each submission. Students could then submit assignments to a central collection point, with a guarantee that submissions would be recovered and scanned within two working days; once a coversheet had been scanned, an email receipt was automatically despatched to confirm safe receipt of the work. In 2011/12, about 58% of submittable (i.e. physical, mainly paper) assignments were handled in this way.
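As an illustration of this flow, the sketch below models issuing a barcoded coversheet and logging a scan. All names are hypothetical, and the real CRS drew its data from the Student Records System rather than holding records in memory.

```python
import uuid
from datetime import datetime

class CoversheetService:
    """Toy model of the CRS flow: issue barcoded coversheets, log scans,
    and produce the receipt text that would be emailed to the student."""

    def __init__(self):
        self.submissions = {}  # barcode -> submission record

    def issue_coversheet(self, student_id: str, element_id: str) -> str:
        """Generate a unique barcode for one student/assessment-element pair."""
        barcode = uuid.uuid4().hex
        self.submissions[barcode] = {
            "student": student_id,
            "element": element_id,
            "scanned_at": None,
        }
        return barcode

    def scan(self, barcode: str) -> str:
        """Record a scanned coversheet and return the receipt confirmation."""
        record = self.submissions[barcode]
        record["scanned_at"] = datetime.now()
        return (f"Receipt: work for {record['element']} from student "
                f"{record['student']} logged at {record['scanned_at']:%Y-%m-%d %H:%M}")

service = CoversheetService()
barcode = service.issue_coversheet("12345678", "1CWK40")
print(service.scan(barcode))
```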

3.2.4 Marking and Production of Feedback

At the beginning of this project, there were no particular institutional expectations about marking or feedback, nor were there any guidelines on the time taken to mark and return student work. The MMU Commitment to return feedback within four weeks was introduced towards the end of the first year of the project.

3.2.5 Marks Entry and Return of Work

For some years, individual marks have been entered into the Student Record System by module leaders. In some faculties, academic staff return marked work and feedback to students, whilst in others this is done by administrative staff. Some staff used VLE tools, such as Moodle assignments, GradeMark or Moodle assessments, to return marked work to students.

Assessment and student evaluation data were reviewed by programme teams on a module-by-module basis as part of annual monitoring and review.

Student satisfaction with assessment practice prior to the project, as measured by the assessment and feedback questions in the 2011 National Student Survey, averaged 3.6 on the five-point scale.

3.3 Project structure

The TRAFFIC project progressed in four sections:

  1. a thorough review of existing policy and processes, with a view to ensuring that they were all student-focused and to identifying which aspects could or should be standardised and enhanced, comprising:
     • review and mapping of existing coursework submission systems
     • review of existing academic principles and policies
     • review of the frequency of use of different summative assignment tasks
     • review of submission dates and their impact on individual students
     • review of academic appeals
  2. staff development and production of supporting resources
  3. specification of technical requirements for institutional assessment management systems, followed by their implementation
  4. evaluation of work done

Sections 3 and 4 are still in progress, with a planned end-date of September 2014. The following sections summarise the work done so far in sections 1 and 2.

3.3.1 Review and mapping of existing systems

As a result of the baseline report, the team began to map existing assessment management processes. Although the institution has a wide diversity of assignment task types, each with corresponding arrangements for assessment management, it quickly became clear that the same eight stages applied to all assignment tasks. This structure provided us with a simple, high-level image to help organise planning, mapping and communications about assessment management:

[Figure: the eight-stage assessment lifecycle]

Identifying these eight stages has proved very useful in structuring discussions with different stakeholders and in ensuring that the design of systems takes into account all aspects of the lifecycle. The final report indicates what actions have been taken for each of the stages.

Our evaluation is structured around this lifecycle.
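The stage names are not listed in this section of the report. Purely as an illustration of how the lifecycle can structure planning and evaluation, the sketch below encodes it as an enumeration, borrowing the eight stage labels from the published Jisc assessment and feedback lifecycle that grew out of this mapping work; treat those labels as an assumption here.

```python
from enum import Enum

# Illustrative encoding of the eight-stage lifecycle. The stage labels follow
# the published Jisc assessment and feedback lifecycle and are an assumption;
# this section of the report does not name them.
class LifecycleStage(Enum):
    SPECIFYING = 1
    SETTING = 2
    SUPPORTING = 3
    SUBMITTING = 4
    MARKING_AND_PRODUCING_FEEDBACK = 5
    RECORDING_GRADES = 6
    RETURNING_MARKS_AND_FEEDBACK = 7
    REFLECTING = 8

# Planning, mapping and evaluation can then be organised stage by stage.
for stage in LifecycleStage:
    print(f"Stage {stage.value}: {stage.name.replace('_', ' ').title()}")
```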

3.3.2 Review and mapping of existing policies

Academic policies at MMU are contained in the academic framework, which is mapped to the QAA Quality Code.

The core part of this framework relating to assessment is the Institutional Code of Practice (ICP). This document was thoroughly reviewed and rewritten in the light of the baseline report and other institutional data. The ICP has four supporting guidance documents, covering assignment briefs, marking, moderation and feedback. These have also been completely rewritten to address the inconsistencies identified in the baseline report.

3.3.3 Review of the frequency of use of different summative assignment tasks

This review was carried out to ensure that we understood what institutional systems for the management of assessment needed to cover, and that we were providing appropriately targeted guidance on the specification and support of different tasks. We also wanted to highlight the possibility of using a range of different types of assignment task, and the role these choices play in supporting core strategic aims such as enhancing employability and supporting progression. In 2012/13 we reviewed only levels 4 and 5 (on the FHEQ, i.e. years 1 and 2 of a conventional undergraduate course), because our new curriculum system was not yet in operation for level 6 and the level 6 data was unreliable (around 30% of those assignments were simply described as ‘coursework’).

The remaining one-third of the assignments at levels 4 and 5 shared around 70 different descriptions, but in terms of function most could be classified alongside the main types: a performance, for example, needs the same kind of assessment management as a presentation, and many types are written and submitted in the same way as an essay. These seven functional types therefore provide a reasonable range for the design of assessment management systems.
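One simple way to operationalise this classification is a lookup from raw descriptions to functional types. In the sketch below, only the performance-to-presentation and written-work-to-essay equivalences come from the text; the other entries and all names are invented for illustration.

```python
# Hypothetical mapping from free-text assignment descriptions to the functional
# type that determines how submission, marking and return are managed. Only the
# performance->presentation and written-work->essay equivalences are from the
# report; the remaining entries are illustrative.
FUNCTIONAL_TYPE = {
    "performance": "presentation",
    "report": "essay",
    "literature review": "essay",
    "case study": "essay",
}

def functional_type(description: str) -> str:
    """Return the functional type for a raw assignment description,
    falling back to the normalised description itself if unmapped."""
    key = description.strip().lower()
    return FUNCTIONAL_TYPE.get(key, key)

print(functional_type("Performance"))  # -> presentation
```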

In 2013/14, we were able to add level 6 data as well:

[Figure: frequency of summative assignment types, including level 6, 2013/14]

3.3.4 Review of submission dates and impact on individual students

Data collected for the university-wide Coursework Receipting System provided students with personalised assessment schedules and the TRAFFIC team with student-level data on submission dates that could be joined with other data to understand factors affecting success.

Demographic, satisfaction, marks, submission and assessment strategy data were joined for 2011-12 and 2012-13 into a single dataset, which was analysed using a range of statistical techniques including Breiman and Cutler’s Random Forest technique for identifying the relative strength of predictive factors.

From 17,354 cases, the top six predictors of students’ overall satisfaction were (in rank order):

  • Confidence - The course has helped me to develop confidence and skills to succeed
  • Organisation - The course is well organised and running smoothly
  • Explanation - Staff on my course are good at explaining things
  • Advice - I have received sufficient advice and support with my studies
  • Resources - University resources are appropriate to my learning needs
  • Feedback - Feedback on my work helped to clarify things I did not understand

The model explained 56% of the variance in overall satisfaction, and showed that, on the 1-to-5 scale, confidence and course organisation can make a difference of over 0.6 to the overall satisfaction score.
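For readers wanting to reproduce this kind of analysis, the sketch below ranks predictors with scikit-learn’s implementation of Breiman and Cutler’s Random Forest; the file name and column names are assumptions, not the project’s actual dataset.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Minimal sketch of the analysis described above. The joined dataset combined
# demographic, satisfaction, marks, submission and assessment strategy data;
# the file and column names here are placeholders.
df = pd.read_csv("joined_assessment_data.csv")

predictors = ["confidence", "organisation", "explanation",
              "advice", "resources", "feedback"]
X = df[predictors]
y = df["overall_satisfaction"]

# Random Forests estimate the relative strength of predictive factors via
# feature importances; the out-of-bag R^2 approximates variance explained.
model = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
model.fit(X, y)

importances = pd.Series(model.feature_importances_, index=predictors)
print(importances.sort_values(ascending=False))
print(f"Variance explained (out-of-bag R^2): {model.oob_score_:.2f}")
```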

From an analysis of 16,557 cases, students’ average mark was most influenced (in rank order) by:

  • JACS subject
  • summative assessment factors (total hand ins, timing and bunching)
  • entry tariff
  • origin group

Only 12% of the variance in average mark could be explained by the best model that could be built from available data.

The available evidence suggests that, in targeting assessment burden and the provision of practical, personalised assessment information, EQAL and TRAFFIC have been pursuing areas likely to realise the intended benefits of improved student satisfaction and success.

3.3.5 Review of academic appeals

A review of the appeals process at Manchester Metropolitan University was carried out to understand the high number of appeals being made compared with other institutions[1]. Appeals are a last resort for students and are consequently very stressful. They are also conducted in a quasi-legal way, which requires a great deal of administrative activity, even for the most straightforward cases. The review found that the appeals process itself was robust and that students were generally well supported. However, it indicated that some changes to assessment management processes might help to reduce the number of appeals, reducing the corresponding stress for both staff and students and freeing up resources to support the students with the most difficult mitigating circumstances.

A number of fairly small changes to assessment processes have been made in response to the report. As well as improving the student experience of appeals, these are intended to provide more consistency across the institution in administration and support for students. This will be important when building electronic tracking of appeals into any electronic assessment management system: appeals management is currently handled separately from the main Student Record System, which creates additional work in locating and using essential information from that system during the appeals process.