Second Report DMU

The Managed Learning Environment

at

De Montfort University

Report 6.2

“Managed Learning Environment – Usability Issues”

Anne Jelfs and Mary Thorpe

Institute of Educational Technology

Open University

Walton Hall

Milton Keynes

MK7 6AA

July 2002

Contents

1. Introduction

2. Evaluation Strategy

2.1 Determine evaluation measures

2.2 Pilot Groups

2.3 Main Evaluation Study

2.4 Review Student Performance Records

3. Findings

3.1 Interviews with Academic, Library and Disability Unit staff

3.2 The Student focus group interview

3.3 A grouped list of findings from the interviews & focus group

4. Evaluation of MLE beyond academic year 2001/2

5. Rationale for questions in the student and academic survey

6. Conclusions and recommendations

References

Appendices

THE MANAGED LEARNING ENVIRONMENT AT DE MONTFORT UNIVERSITY

(JISC 7/99)

Evaluation Report 2

1. Introduction

This is the second formal report in completion of Work Package 6: User evaluation, undertaken by the Open University in collaboration with the project team at De Montfort University (DMU):

John Eyre, Project Leader

Pat Jefferies

Mark Simpson

Open University Evaluation Team:

Professor Mary Thorpe, Director, Institute of Educational Technology (IET)

Anne Jelfs, Evaluation Project Officer, IET

The development of the Managed Learning Environment (MLE) and the strategy for its evaluation, both formative and summative, were discussed at a series of project meetings in 2001/2 involving Mary Thorpe as director of the Open University (OU) evaluation project and, since December 2001, Anne Jelfs as the project evaluator. At the meeting on 5th December 2001, a draft evaluation outline, developed by the OU in response to the JISC project proposal and specifically WP6, was presented and agreed with DMU colleagues. Details of this agreed approach and its subsequent adjustments are incorporated in the evaluation strategy below.

2. Evaluation Strategy

Work Package 6 specifies four key tasks for the evaluation:

  1. Determine evaluation measures with the OU and other interested parties
  2. Determine pilot groups
  3. Conduct surveys/interviews/focus groups
  4. Review student performance records

Two deliverables are specified from these studies: 6.1 Interim user evaluation report, and 6.2 ‘Managed Learning Environment – Usability issues’. Deliverable 6.1 was completed in February 2002 and contains the results of interviews with pilot groups, an initial evaluation of the CampusNet web pages, and a report on accessibility issues with the MLE demo pages for users with a disability. The remainder of the planned evaluation forms the second report, presented here.

It should be emphasised that the evaluation plan in the project proposal was based on the assumption that the MLE would be in general release for student usage during 2001/2, thus making user interviews and performance results feasible sources of data for the evaluation. In the event the evaluation strategy has had to be adjusted to take into account the changes in the timeframe for availability of the MLE to staff and students. The MLE had not been generally released before the start of examinations in 2002 (and has not been widely promoted across the University at the time of writing), and therefore it was not feasible to undertake extensive surveys on take-up and use. Students’ study of their courses has not been influenced by the MLE since it has not been made available to them, and performance data on 2001/2 is therefore irrelevant to the evaluation.

The agreed evaluation goal was to explore the usability issues for students, teaching staff and support staff. Interviews were confined to staff in departments that already use technology in their teaching and, in some cases, had been involved in discussions around the MLE; one member of the Library staff and a representative of the Disability Unit were also interviewed. Interviewing outside this group would have served little purpose, since no other staff had had access to the system and so were not in a position to comment on it. The aim has been to find out as much as possible, in the circumstances, about the perceptions of some of those in the vanguard of IT usage. Interviews focused on the provision currently available; interviewees' preferences about what they would like provided, via the MLE and the University infrastructure, to support their teaching role; and their current uses of IT on their own courses. The evaluation team were directed to focus specifically on the MLE rather than on details of VLE usage. However, the two cannot easily be separated, since teaching staff's primary interest is likely to be the course-specific added value of using a VLE alongside existing teaching methods and facilities. Further comments are made about this below.

The illuminative evaluation approach (see Parlett and Hamilton 1972) emphasises the importance of the context within which innovations are introduced in education, and this remains an appropriate approach in the early stages of introducing the MLE. The use of semi-structured interviews conducted by the OU evaluator has enabled us both to uncover unexpected features of the University's infrastructure as it affects IT use for teaching, and to capture at least some of the preferences for what should be available and how it should be supported.

Although it was agreed in meetings during March and April that a questionnaire would not be a feasible strategy at this stage, the OU team agreed to prepare draft questionnaires that could be used in 2002/3 to evaluate the MLE at an appropriate point after its general roll-out. The draft questionnaires were discussed jointly with the DMU team and the final amended versions are set out in Appendices B and C, with a commentary in section 5 below.

The heuristic evaluation (Nielsen 1994) was conducted by Mark Simpson and the findings have been fed back to the project team as the basis for further developments of the MLE.
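To indicate what such a heuristic evaluation involves, the minimal sketch below records inspection findings against Nielsen's ten heuristics using his 0–4 severity scale; the example finding is entirely hypothetical and is not drawn from Mark Simpson's results.

```python
from dataclasses import dataclass

# Nielsen's (1994) ten usability heuristics.
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognise, diagnose and recover from errors",
    "Help and documentation",
]

@dataclass
class Finding:
    """One problem noted during a heuristic inspection."""
    heuristic: str
    location: str  # where in the interface the problem was observed
    severity: int  # Nielsen's scale: 0 = not a problem .. 4 = catastrophe

# A hypothetical finding, purely for illustration.
example = Finding(HEURISTICS[0], "MLE login page", 2)
print(f"[severity {example.severity}] {example.heuristic}: {example.location}")
```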

Usability testing with participants from the target population of users is recommended by a number of researchers (Newman & Lamming, 1995; Monk, Wright, Haber & Davenport, 1993). Its value is that users can be observed as they use the system, and problems and queries identified. Computer users interact with a computer system through a user interface, and for users that interface is the system. Knowing the user's experience of the system is what helps the designer understand the difficulties users have and ways of improving the interface. Our evaluation of the interface was conducted with the aim of identifying how effective it was in allowing users to achieve their goals and how satisfied they were with the system. At the same time, any problems affecting usability are identified, which in turn leads to improvements to the interface.
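As an illustration of how such observational data can be summarised, the sketch below computes the two measures named above: effectiveness (task completion) and satisfaction. The session records and field names are assumptions for illustration only, not part of the evaluation instruments.

```python
from dataclasses import dataclass

@dataclass
class Session:
    """One observed usability-testing session for a single task."""
    user: str
    task: str
    completed: bool    # did the user achieve their goal?
    satisfaction: int  # post-task rating, 1 (low) to 5 (high)

# Hypothetical observation records, for illustration only.
sessions = [
    Session("student_a", "find course timetable", True, 4),
    Session("student_b", "find course timetable", False, 2),
    Session("staff_a", "post an announcement", True, 5),
]

# Effectiveness: the share of observed tasks completed successfully.
success_rate = sum(s.completed for s in sessions) / len(sessions)

# Satisfaction: the mean post-task rating across sessions.
mean_satisfaction = sum(s.satisfaction for s in sessions) / len(sessions)

print(f"Task success rate: {success_rate:.0%}")
print(f"Mean satisfaction: {mean_satisfaction:.1f} out of 5")
```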

The Managed Learning Environment incorporates a user-centred interface with a single sign-on that provides a single site for students, teaching staff and support staff. It has administrative, course-specific and social elements, a calendar, email and a personal page. Staff who wish to use a VLE in their teaching can do so, and in most cases they use WebCT. The MLE is not currently designed to include direct links to WebCT; staff and students access it via the University's intranet site.
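A minimal sketch of the single sign-on arrangement just described, assuming hypothetical names throughout (this is not the MLE's actual implementation): one authentication yields one session, from which every portal element is reachable, rather than a separate login per service.

```python
from dataclasses import dataclass

# Portal elements named in the description above.
ELEMENTS = ("administration", "course pages", "social area",
            "calendar", "email", "personal page")

@dataclass
class MLESession:
    """A single signed-on session, shared by every portal element."""
    username: str
    role: str  # "student", "teaching staff" or "support staff"

def single_sign_on(username: str, role: str) -> MLESession:
    """One credential check (elided here) grants one session for the
    whole site, instead of a separate login per element."""
    return MLESession(username, role)

session = single_sign_on("jbloggs", "student")
for element in ELEMENTS:
    # Every element trusts the same session object.
    print(f"{session.username} may open: {element}")
```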

Additional information about building MLEs, and a framework for the pedagogical evaluation of Virtual Learning Environments, can be found on the JISC website.

The specific data collection strategy followed the four headings outlined in Work Package 6.

2.1 Determine evaluation measures

It was agreed that the basis for evaluation measures would be derived from semi-structured interviews undertaken with both students and staff. A sample of staff and students was to be provided by the DMU project team, from the areas with which participation in the development of the MLE in 2002 had already been negotiated, including Business, Law, Marketing, Pharmacy and Computing. In the event this proved difficult to arrange, and the staff sample of seven academics was taken from Marketing, Pharmacy, the Built Environment and Social Sciences. The five students were recruited from one post-graduate course, the M.A. in Design and Manufacture. One interview was conducted with a member of the Library staff and a further interview with a member of the Disability Unit (please see Appendix A for all interview comments).

The content of these interviews included the following topics:

  • What has been your involvement in the MLE thus far?
  • What contribution to X (their course/s) do you expect the MLE to make this year?
  • What expectations do you have about how students will use the course related elements of MLE for your course?
  • How often do you expect students to use these elements? How much time do you think students might spend with these online elements?
  • In what ways, if any, do you think student learning will be helped through your involvement in the project?
  • Do you have any goals for the development of the use of the MLE on your course/area in the future?

These issues, and the reasons for particular responses, were explored through semi-structured interviews; the questions varied for Library and Disability Unit staff, in line with their roles, and for students.

2.2 Pilot groups

The first stage of work with pilot groups included both interview questions and observation of tasks, and the findings from the pilot studies were presented to the project team in Report 6.1, Interim user evaluation report.

2.3 Main Evaluation Study

Conduct surveys/interviews/focus groups

Interviews

The expectation was that the MLE would be available at some point in March 2002 and that data collection would take place after that, but before preparation for the examinations became a major preoccupation for students. With the change to the timeframe of the MLE delivery, the evaluation study could not take place until late May and early June. The interviews were based on demo versions of the MLE made available to specific groups of staff and students for usability testing. The delay in the MLE delivery did impact on the recruitment of both staff and students for interview: many students had returned home after their examinations, and staff were occupied with grading, exam boards and the like.

The format of the student and staff interviews combined observations of usage together with semi-structured interviews, similar to the approach reported on the CampusNet interviews (Report 6.1 Interim user evaluation report). The names of all those taking part in the interviews have been changed to preserve anonymity.

Focus Groups

Because of the later than anticipated completion of a fully developed MLE, it was only possible to hold one student focus group, made up of post-graduate computing students. Their reported comments can be found in Appendix A.

Surveys

As outlined above, the later than anticipated availability of a fully operational MLE affected the proposed surveying of students. The work of the Evaluation Team has therefore focused on creating questionnaires that can be used to evaluate the MLE and its take-up over time (and beyond the end of the funded project at the end of August 2002). We were dependent on delivery of the MLE within a relatively tight window of opportunity, since until it was available, data collection could not begin. The evaluation team has, however, developed two questionnaires for the future evaluation of the MLE: one for student data collection and one for staff data collection. These questionnaires have been provided to the DMU team and copies are presented in Appendices B and C.

The future evaluation and data collection from students will have to be sensitive to the requirements of their timetables and availability before the examinations, particularly since many are not available once these are completed.

2.4 Review Student Performance Records

Not relevant to the evaluation at this stage (see comments above).

3. Findings

3.1 Interviews with Academic, Library and Disability Unit staff

Seven academics were interviewed at the De Montfort campus in Leicester, using the outline suggested above. These were semi-structured interviews, so that pertinent topics would be covered, although not always in the same order and with greater emphasis on areas relating specifically to individuals and their concerns. A detailed account of each interview is provided in Appendix A, and general themes are summarised in the commentary below.

The interface between MLE and VLEs in use

WebCT is the learning environment currently in use at De Montfort, and most of the academics interviewed used WebCT to differing degrees in their teaching. It proved impossible to separate out MLE-specific issues alone, which is not surprising given that most course-specific developments must currently occur in the context of the VLE. Staff think about the MLE in terms of whether it enhances what they can currently offer through the VLE, and whether the two environments will be mutually supportive for students. The likelihood of students making frequent and effective use of the MLE is seen in part as a question of whether VLEs become much more generally used by teaching staff across the institution, so that students come to expect to use computers as a regular and essential part of study, not just for occasional social information or networking.

Some staff raised the issue of whether there should be a clearer University policy on VLEs and whether a different platform, such as Blackboard, was going to be introduced. The policy decision had not been made at the time of the interviews. One academic said, “We’re waiting for the university to make clear what is happening about learning environments. The faculty has been told not to make further development until a decision is made.” When he asked when this decision might be made, he was told September, which he says is “actually when we will be short of time.”

MLE and its added value

One of the most active academic users of IT wanted to know what the MLE would contribute and where it might duplicate other university provision. He would like a greater understanding of what students would view and how they would use the MLE; he understands the MLE from his own point of view, but not from the students'. One recommendation by the evaluation team is that there should be widespread workshops where academics, administrators and support staff are shown the MLE, how it works and its benefits to all concerned, including examples of the viewing rights and specific pages of students and other users.

The need for educational and technical support for academic users

One of the academics suggested for interview by the MLE development team was not a user of any current learning environment system. However, he did conduct course introductory sessions through distance education and was therefore extremely interested in the opportunity to use the MLE in his work. He is also the lead on a Leonardo project (European funding). He wanted to use a virtual learning environment such as WebCT in his teaching, but feels he needs support, not only technical but also pedagogical, to do so. He commented that he is appalled by the demise of the Centre for Education Technology Development (CETD), as he needs that type of support, particularly for the Leonardo project and to enable him to put materials on WebCT.

All the academics commented on the loss of the CETD support team at DMU. They all thought that, without that support, they as academics were compromised and could not dedicate time to developing online materials, conferences and other content for the MLE.

Reliability of the server and infrastructure essential

Academics also commented that the growth in student numbers, particularly in overseas provision and part-time students, means the MLE has to support these students adequately, and to do so it has to be available at all times. They were concerned that, without technical support and infrastructure, students would suffer because of computer down-time.

Support infrastructure required in introducing the MLE effectively

Academics currently using WebCT as the learning environment were doing so to differing levels: there were examples of academics fully integrating the learning environment into their teaching, as in the Marketing department, and academics with no experience at all. As a sample of the teaching staff, this gives some idea of the scale of the difficulties facing the MLE team. The anticipated problems include the need for training, staff support in developing materials for presentation within the MLE, and the apathy of some staff about developing a strategy to include the MLE in their teaching.