Assessment and Feedback Programme, Strand C
Briefing Paper

OMtetra: Open Mentor Technology Transfer


Project Information
Project Title (and acronym) / Open Mentor technology transfer (OMtetra)
Start Date / September 2011 / End Date / January 2013
Lead Institution / University of Southampton
Partner Institutions / Open University; King’s College London
Project Director / Lester Gilbert
Project Manager & contact details / Alejandra Recio-Saucedo,
Project website /
Project blog/Twitter ID / @omtetra
Design Studio home page /
Programme Name / Assessment and Feedback Strand C
Programme Manager / Paul Bailey and Heather Price

Contributors

Alejandra Recio-Saucedo

Denise Whitelock

Lester Gilbert

Stylianos Hatzipanagos

Pei Zhang

Paul Gillary

  1. Summary
  • The OMtetra project has successfully taken up Open Mentor and completed its transfer into two HEIs: the University of Southampton and King’s College London. Interest shown by tutors from the institutions involved has translated into ideas to facilitate assignment analysis through Open Mentor and to encourage adoption of the system across institutional departments.
  • OM analyses, filters, and classifies tutor comments through an algorithm based on Bales’ Interaction Process. As a result, tutors’ feedback comments are classified into four categories: Positive reactions, Teaching points, Questions and Negative reactions. The feedback provided is analysed against an ideal number of feedback comments for a graded assignment (a minimal sketch of this comparison follows this list). Reports are provided in OM to support tutors in the task of reflecting on the structure, content and style of their feedback.
  • The main outcome of the OMtetra project was the effect on tutors of having access to a report analysing their feedback. The opportunity to reflect on feedback practice has the potential to affect students’ learning and performance positively. By supporting tutors’ feedback practices through a strong formative function, in which the tutor can use the output of the system (reports and classifications) to reflect on the quality and appropriateness of his or her feedback, students are more likely to receive feedback that is useful and to develop skills that will allow them to be active participants in their learning.
  • Tutors’ experiences with Open Mentor provided a clear picture of the question: how does pedagogical advice derived from a computer system promote tutor reflection on their feedback? Open Mentor is an enabler – a tool for reflection – and it is essential that it does not promote a superficial (‘surface’) approach to feedback. It is the pedagogically informed design of appropriate and constructive feedback that underlies the success of technology-enhanced assessment, helping tutors to reflect on how they are using feedback to guide the learner and contextualising that reflection in a pedagogically sound framework. Open Mentor was evaluated as a valid and useful tool to improve practice. The reflective activities and the pedagogical gain from using an assessment technology such as Open Mentor are what tutors need to help learners engage in self-reflection and obtain maximum benefit from assessment.
  • Open Mentor was originally designed to help tutors develop and enhance good feedback practices (Whitelock & Watt, 2007b). It was based on research which established a close link between an assessment’s grade and a classification of comment types based on Bales (1950).
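To make the summary above concrete, the following is a minimal sketch in Python of the kind of comparison on which Open Mentor’s reports rest: tutor comments already classified into the four Bales-derived categories are set against an “ideal” profile. The category names come from the text above; the ideal proportions, the single-grade-band framing and the function names are illustrative assumptions, not Open Mentor’s actual values or code.

    from collections import Counter

    CATEGORIES = ["Positive reactions", "Teaching points", "Questions", "Negative reactions"]

    # Hypothetical ideal share of comments per category for one grade band;
    # these proportions are placeholders, not Open Mentor's actual figures.
    IDEAL_PROFILE = {
        "Positive reactions": 0.40,
        "Teaching points": 0.35,
        "Questions": 0.20,
        "Negative reactions": 0.05,
    }

    def compare_to_ideal(classified_comments):
        """classified_comments: one category label per tutor comment on an assignment."""
        counts = Counter(classified_comments)
        total = sum(counts.values()) or 1
        report = {}
        for category in CATEGORIES:
            actual = counts.get(category, 0) / total
            ideal = IDEAL_PROFILE[category]
            report[category] = {"actual": round(actual, 2),
                                "ideal": ideal,
                                "gap": round(actual - ideal, 2)}
        return report

    # Example: six classified comments on one graded assignment.
    example = ["Positive reactions", "Teaching points", "Teaching points",
               "Questions", "Negative reactions", "Positive reactions"]
    for category, row in compare_to_ideal(example).items():
        print(category, row)

In Open Mentor itself the report is presented graphically to the tutor rather than as raw numbers, but the underlying idea is the same: the gap between actual and expected comment profiles is what prompts reflection.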
  2. How can the innovation be used to support assessment and feedback practice?

A user scenario

The School of Education is changing the syllabus for the taught module of Learning Technologies. As part of these changes, there is now an online system for evaluating students’ submitted work. Within this evaluation system, tutors must provide qualitative feedback on students’ performance on assignments against the learning objectives of the module. Both experienced faculty members and teaching assistants would like to receive training on providing the type of feedback required of them.

The original concept of Open Mentor – as a tool for quality enhancement, supporting tutors and institutions as they work to improve the quality of student feedback, where quality is understood as ‘fitness for purpose’ in enabling an effective learning process – has proven deployable and testable within higher education institutions.

OMtetra evaluated the transfer and implementation of OM via two questions:

1. How generalizable is the pedagogical model of Open Mentor to institutions beyond its original development community?

2. How does pedagogical advice derived from a computer system promote tutor reflection on their feedback?

With regard to the first research question, on how acceptable the pedagogical model used in Open Mentor is for institutions other than the Open University, results were mixed. On the one hand, tutors felt the Bales coding system was acceptable; on the other, they suggested that implementing other models would allow subjects which use different assessment strategies to take advantage of the technology. These results are consistent with the findings of Whitelock et al. (2003), who recommended studying other subject domains to determine the suitability of the Bales coding system across higher education curricula. Alignment with the Bales category system is important as it offers an objective evaluation of the quantity and quality of comments, in particular the socio-emotive and motivational aspects of feedback.

Feedback from the lecturers revealed that expanding the category system and integrating other capabilities into the analysis framework (e.g. text analysis) would enhance the potential of Open Mentor to improve tutors’ feedback practice in a wider range of subject domains. An interesting approach would be to incorporate a dynamic classification algorithm that uses machine learning and natural language processing techniques (Sebastiani, 2001; Watt, 2009); a sketch of this kind of classifier is given below. This addition to the analysis capability of Open Mentor would help address the needs of individual institutions where feedback practice is aligned with the culture of the organisation.
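As an illustration only, the following Python sketch shows the kind of supervised text classifier surveyed by Sebastiani (2001), trained on tutor comments labelled with Bales-style categories. The training examples are invented, the scikit-learn pipeline is one of many possible choices, and nothing here represents an existing Open Mentor component.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Tiny, made-up training set of (comment, category) pairs; a real deployment
    # would use comments labelled by the institution's own tutors.
    training_comments = [
        ("Well argued and clearly structured.", "Positive reactions"),
        ("Good use of the core readings here.", "Positive reactions"),
        ("You need to define this term before using it.", "Teaching points"),
        ("Link this claim back to the learning objectives.", "Teaching points"),
        ("What evidence supports this conclusion?", "Questions"),
        ("Have you considered an alternative interpretation?", "Questions"),
        ("This section does not answer the question set.", "Negative reactions"),
    ]
    texts, labels = zip(*training_comments)

    # TF-IDF features feeding a naive Bayes classifier; any text classifier
    # could be swapped in.
    classifier = make_pipeline(TfidfVectorizer(), MultinomialNB())
    classifier.fit(texts, labels)

    print(classifier.predict(["Could you expand on why this method was chosen?"]))

Retraining such a classifier on locally labelled comments is what would let the analysis reflect an individual institution’s feedback culture, as suggested above.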

One consistent message learned from this deployment of Open Mentor in a range of higher education institutions (Whitelock et al., 2012a, 2012b) is that the more configurable Open Mentor is, the more attractive it will be for adoption across institutions.

Turning to the question of whether pedagogical advice from a computer system could improve tutor practice, such advice was generally welcomed by tutors when it was seen as truly aimed at enhancing feedback by providing an analysis explicitly based on a strong theoretical foundation. Open Mentor has this foundation, and is now at a stage where, technically speaking, it can be transferred and implemented easily and efficiently, offering an almost immediately useful tool for training and for teaching situations.

Tutors were also conscious that the reports should be disclosed to the tutor alone, and not made public. While reports that compare tutors side by side are ideal tools for programme leaders, the cultural change required to discuss openly the style and type of feedback provided by tutors could raise significant challenges to acceptance by staff.

Tutors’ experiences with Open Mentor provided a clear picture of the second research question: how does pedagogical advice derived from a computer system promote tutor reflection on their feedback? Open Mentor is an enabler – a tool for reflection – and it is essential that it does not promote a superficial (‘surface’) approach to feedback. Successful e-feedback lies with the pedagogy rather than the technology itself (Gilbert, Whitelock, & Gale, 2011). It is the pedagogically informed design of appropriate and constructive feedback that underlies the success of technology-enhanced assessment, helping tutors reflect on how they are using feedback to guide the learner and contextualising that reflection in a pedagogically sound framework. Open Mentor was evaluated as a valid and useful tool to improve practice. The reflective activities and the pedagogical gain from using an assessment technology such as Open Mentor are what tutors need to help learners engage in self-reflection and obtain maximum benefit from assessment.

  3. What are your findings from the implementation of the innovation in partner institutions?

The OMtetra project has been successful in taking up Open Mentor and completing its transfer into two higher education institutions. Interest shown by tutors from the institutions involved has already translated into ideas to facilitate assignment analysis through Open Mentor, and to encourage adoption of the system across institutional departments. Development of the recommended Open Mentor features, and promotion of its adoption on a larger scale, are ongoing efforts that will help HEIs use Open Mentor’s techniques to deliver quality feedback support in the teaching and learning process and, through that, improve the development of learners’ self-reflection skills.

  4. Where can I find out more?

Interested in installing Open Mentor?

Open Mentor is open source software. Source code and relevant documentation can be downloaded from the public GitHub repository available at:

Further information and useful documentation can be found at the project website. If you are a systems administrator or a member of staff interested in implementing or using OM, go to:

References:

Gilbert, L., Whitelock, D. M., & Gale, V. (2011). Synthesis Report on Assessment and Feedback with Technology Enhancement. Retrieved from

Sebastiani, F. (2001). Machine Learning in Automated Text Categorization. ACM Computing Surveys, 34(1), 1–47. Retrieved from

Watt, S. (2009). Text Categorization and Genre in Information Retrieval. In A. Göker & J. Davies (Eds.), Information Retrieval: Searching in the 21st Century (pp. 159–178). John Wiley & Sons.

Whitelock, D. M., Watt, S., Raw, Y., & Moreale, E. (2003). Analysing tutor feedback to students: First steps towards constructing an electronic monitoring system. ALT-J, 11(3), 31–42. doi:10.1080/0968776030110304

Whitelock, D. M., Gilbert, L. H., Hatzipanagos, S., Watt, S., Zhang, P., Gillary, P., & Recio Saucedo, A. (2012a). Addressing the challenges of assessment and feedback in higher education: a collaborative effort across three UK universities. INTED 2012. Valencia, Spain. Retrieved from

Whitelock, D. M., Gilbert, L. H., Hatzipanagos, S., Watt, S., Zhang, P., Gillary, P., & Recio Saucedo, A. (2012b). Supporting tutors with their feedback using OpenMentor in three different UK universities. CBLIS 2012 (pp. 105–116). Barcelona, Spain.