AN INSTITUTION-WIDE PROJECT USING AN ELECTRONIC VOTING SYSTEM FOR ASSESSMENT: THE STORY SO FAR....

K. Robins, E. Gormley-Fleming

University of Hertfordshire (UNITED KINGDOM)


This paper will consider an institutional perspective on the roll-out of an Electronic Voting System (EVS) across a number of schools, with the overarching view that it will enhance the assessment experience for both students and academics. Our initial findings, that the use of handsets benefitted both students and academics, are similar to those of Draper & Brown (2004). The paper will also demonstrate the successes and challenges of such a large-scale project.
Three drivers define the need for this project.
• The critical role assessment and feedback plays in supporting learning, developing students’ self-regulation and ultimately enhancing student progression and success.
• Students nationally and locally identifying assessment and feedback as the least satisfactory aspect of their university experience.
• Assessment and feedback are often subject to time pressures, and technology-enhanced solutions can be both educationally effective and resource efficient.
Across the institution, eleven schools have volunteered to be involved in the project and over 7000 EVS handsets have been issued to students. This institution-wide approach to using technology to increase and enhance assessment and feedback practice has been carried out at three levels: individual student, module and programme.
The presentation aims to share:
• How the schools were supported across the institution
• How EVS links to good principles of assessment
• Some smart things to do with EVS
• Some of the challenges of technology
The next phase will be to introduce a ‘Student Dashboard’ that will collect performance data from a variety of sources and create automated, regular reports of student engagement with our Managed Learning Environment (MLE).
Ultimately, the project is focused on improving student support, student learning and student engagement in their learning experience.
References:
• Draper, S. & Brown, M. (2004) Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20, 81-94.
Please note, this project has recently received funding from the JISC.

Keywords: EVS, PRS, assessment, technology.

1 INTRODUCTION

This paper will give an overview of the roll-out of Electronic Voting System (EVS) handsets in teaching across the University. It is hoped that the handsets will provide benefits in learning, teaching and assessment for students. This work is part of a larger, JISC-funded project, Integrating Technology Enhanced Assessment Methods (ITEAM). The project team all work in the Learning and Teaching Institute (LTI), and the team comprises a Project Director, a Project Manager and several LTI teachers.

EVS has been used by enthusiasts in their individual modules for a number of years to good effect. The project aims to pull this work together by offering a central approach. The project started in 2010/11 by developing important relationships with Heads of Academic Schools. The team also engaged a number of stakeholders, such as technical support, the Students’ Union, disability services and the supplier, Reivo, to ensure the smooth roll-out of EVS across the University.

Our initial findings, that the use of handsets benefitted both students and academics, are similar to those of Draper & Brown (2004) [1]. These benefits include encouraging students to engage with the module, encouraging learning, supporting personalised learning and providing timely feedback. From an academic’s perspective, EVS can be used to provide formative feedback, correct misunderstandings, provide just-in-time teaching, add variety, and improve attendance and performance.

The team decided to use the TurningPoint Electronic Voting System, on the basis of its ease of use, its ready integration with PowerPoint and the fact that two schools had already developed experience in using this product. It had also been tested with, and works with, our in-house managed learning environment (MLE), StudyNet.

Initially, schools across the University were offered free handsets if they became part of this project. The team worked with eight partner schools, and approximately 3785 EVS handsets were purchased and deployed to schools. Each school was free to use EVS in whatever way it thought appropriate within its subject discipline. Two types of handset were purchased: the LCD RF handset, which supports standard multiple-choice questions, and the XR handset, which also accepts alphanumeric answers and supports self-paced assessments (homework mode). A breakdown of the distribution is shown below.

Table 1: Distribution of handsets across the University (2010/11)

Academic School / Handsets (2010/11) / Handset Type
Psychology / 500 / LCD RF
Computer Science / 320 / XR
Humanities / 575 / XR
Business / 1000 / LCD RF
Education / 260 / LCD RF
Law / 450 / LCD RF
Life Science / 420 / LCD RF
Physics, Astronomy and Mathematics / 320 / LCD RF

In 2011/12, two further schools joined the project, ‘Engineering and Technology’ and ‘Nursing and Midwifery and Social Work’. All schools that needed additional handsets from September 2011 were required to buy their own, albeit through an agreed supplier. There are now over 7500 handsets available across the University.

Throughout this project, we have been careful to ensure that the use of EVS fits with our Assessment for Learning Principles. These principles were developed by a team led by Dr Mark Russell (Russell et al, 2010) [2] and are synthesised (mainly) from the work of Gibbs and Simpson (2004) [3], Nicol (2007) [4], the NUS and the Weston Manor Group (2007) [5].

2 How the schools were supported across the institution

Each school was required to nominate a school lead. The project director communicated with school leads on a regular basis to provide status updates, advice, training and information for staff and students. At the start of the project, each school lead was required to develop a project plan outlining the following:

  • The scope of their project
  • Aims, objectives and targets
  • Key milestones and dates
  • Risks to success (preferably with plans to mitigate against each risk)
  • Any training / staff development needs
  • Requirements from the project team for their project to succeed

2.1 Technical Support

2.1.1 Information Hertfordshire (IH)
IH provides technical support across the University and has fully engaged with this project. Within this unit, the Learning and Teaching Development Unit (LTDU) has developed a system which tags handsets to individual students and stores the results in StudyNet, the managed learning environment (MLE) for the University. The system is extremely efficient, as it scans the student ID card and the EVS barcode in only a few seconds per student. Academics can access this data by module and produce a participation list for TurningPoint, allowing them to track individual student performance data.
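The essential idea of tagging a handset to a student and then filtering by module enrolment can be sketched in outline. The fragment below is purely illustrative: the function names, ID formats and data structures are hypothetical and do not describe the LTDU implementation or the StudyNet data model.

```python
# Illustrative sketch: tag handsets to students, then produce a
# participation list for a module (hypothetical names and IDs).

def tag_handset(registry, student_id, handset_barcode):
    """Record which handset a student has been issued."""
    registry[handset_barcode] = student_id

def participation_list(registry, enrolled_ids):
    """Return (handset, student) pairs for students enrolled on a module."""
    return sorted(
        (handset, student)
        for handset, student in registry.items()
        if student in enrolled_ids
    )

registry = {}
tag_handset(registry, "st20471", "EVS-000123")
tag_handset(registry, "st20472", "EVS-000124")

print(participation_list(registry, {"st20471", "st20472"}))
```

In practice the registry would live in the MLE rather than in memory; the sketch only shows why a barcode-to-student mapping is sufficient to regenerate a per-module participation list on demand.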

The technical support team have installed the TurningPoint software on, and attached receivers to, all classroom computers across the University. The helpdesk has recently taken on the role of helping academics and students with technical issues related to the use of EVS.

2.1.2 Technology Mentors
Within the University, we also have a number of Student Technology Mentors. These technology mentors have been trained on how to use EVS and are available to support academics as and when required.

2.1.3 Supplier support
Reivo, the supplier, has been very helpful, providing training to support academics in using TurningPoint. They have also supported the technical team with hardware issues.

2.2 Pedagogic Support

Pedagogic support has been provided by the project team through online resources and links to useful articles on the LTI Knowledge Exchange. These online resources include documents on:

  • EVS and the pedagogy e.g. Smart things to do with EVS, Inclusive Practice
  • Useful tips for teachers e.g. FAQs, downloading a participant list, inclusivity for students
  • How to use TurningPoint e.g. downloading the TurningPoint software, changing channels on the EVS handset.

2.2.1 A one-day school symposium event
This event ran for schools participating in the EVS project. The symposium included introductory and more specialised “hands-on” EVS training sessions run by UH and Reivo staff. Schools were also encouraged to share their experiences and achievements in using EVS, and to raise staff awareness of its potential and thus extend its use.

2.2.2 Introduction to TurningPoint workshop
This workshop aims to introduce staff to TurningPoint, explain how and why EVS is used in teaching, discuss the hardware and software, show how EVS works with StudyNet (MLE) and give academics experience of developing some simple questions.

2.2.3 Effective and Efficient Assessment workshop

This workshop aimed to help academics reflect on their current assessment and feedback practices, consider best practice in this area and identify how technology may enhance the assessment process.

2.2.4 Using EVS for Assessment and Feedback workshop

This workshop aimed to provide an overview of how EVS can be used for assessment and feedback, and to explore the different ways in which EVS can be used in teaching. It also included examples of how EVS links with the Assessment for Learning Principles.

3 How EVS links to Assessment for Learning principles

The team is keen to ensure that EVS is used from a sound pedagogic perspective, as it is important that academics think carefully about how and why they are using the system. To this end, the team have identified, and are promoting, how the use of EVS links to our Assessment for Learning Principles, together with the benefits of and considerations in using EVS in this way. The principles developed by Russell et al (2010) [2] are listed below:

  1. Engages students with the assessment criteria
  2. Supports personalised learning
  3. Ensures feedback leads to improvement
  4. Focuses on student development
  5. Stimulates dialogue
  6. Considers student and staff effort

3.1 Three examples of use of EVS in teaching

3.1.1 Drop quizzes during the semester – Assessment Principles 1 & 2
Students are advised that a number of drop quizzes will run in lectures over the semester. These quizzes are set as summative assessments that count towards the overall module mark. Students are not told when a quiz will take place; it could happen at any time during the lecture, but is most likely at the start or the end. The aim is to encourage students to work on the module both during and outside lecture time, for the entire semester. Even if the answers are not provided instantly, the results can be given to students in a timely manner after the lecture.
Benefits include

  • Students can have instant feedback that is individual and personal.
  • Academics can identify the level of understanding in a given topic both for the class as a whole and the individuals.
  • Extra support can be targeted to students who need it.

Considerations include

  • Allow adequate time for the students to answer the questions
  • Feedback must be provided to the students in a timely manner.

3.1.2 Contingent teaching – use student answers to questions to dictate the way the lecture goes forward – Assessment Principles 2 & 3
Conditional branching allows you to control the order of slides in your presentation based on the responses received from the audience. You might ask a question covering a specific subject area to assess whether the participants understand the subject. If most of the participants respond correctly, you can skip ahead to the next section of material.

Benefits include

  • Academics do not need to cover material that students already understand.
  • Academics can track the students who did not understand and provide support for them.

Considerations include

  • Ensure students are not voting randomly, as this will mislead the lecturer and hence the content covered.
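The branching decision behind contingent teaching can be expressed very simply: compare the share of correct responses against a threshold and choose the next slide accordingly. The sketch below is illustrative only — the 80% threshold, slide labels and function name are hypothetical and are not part of TurningPoint's own conditional-branching feature.

```python
# Illustrative contingent-teaching rule: skip ahead when most of the
# class answers correctly, otherwise revisit the material.
# Threshold and slide names are hypothetical.

def next_slide(responses, correct_answer, threshold=0.8):
    """Choose the next slide based on the share of correct responses."""
    if not responses:
        return "revisit-topic"  # no votes: assume the topic needs covering
    share_correct = sum(r == correct_answer for r in responses) / len(responses)
    return "next-section" if share_correct >= threshold else "revisit-topic"

print(next_slide(["B", "B", "B", "A", "B"], correct_answer="B"))  # next-section
print(next_slide(["A", "C", "B"], correct_answer="B"))            # revisit-topic
```

The threshold is a pedagogic choice rather than a technical one: a lower value moves the lecture on sooner, a higher value repeats material more often.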

3.1.3 To track individual or class progress – Assessment Principles 3 & 6
When students have answered either formative or summative EVS questions in a lecture or tutorial, it is possible to track the progress of either an individual student or the entire class. The handsets can be set to Anonymous (responses not linked to a handset), Automatic (provides the handset number) or associated with a class list downloaded from StudyNet, which includes the student name and handset number. When tutors are ready to generate reports, they simply launch the Reports Wizard from the TurningPoint toolbar and can generate a report in Excel or Word.

Benefits include

  • Academics can identify how much the class understands and which questions were found difficult.
  • Support can be offered to individuals who are not performing well or engaging in the EVS questions.
  • Academics can provide additional support, just in time teaching, if necessary.

Considerations include

  • Interactive teaching does not guarantee an active learner but will encourage learning.
  • Use of EVS should not distract from content.
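The kind of aggregation a progress report performs can be sketched in a few lines: collect per-student and per-question tallies from the raw responses. This is an illustrative outline only — the tuple format and function name are hypothetical and do not reflect the TurningPoint Reports Wizard's actual data format.

```python
# Illustrative progress aggregation from raw EVS responses.
# Each response is a hypothetical (handset_id, question, is_correct) tuple.
from collections import defaultdict

def progress_report(responses):
    """Tally correct/answered counts per student and per question."""
    per_student = defaultdict(lambda: [0, 0])   # handset -> [correct, answered]
    per_question = defaultdict(lambda: [0, 0])  # question -> [correct, answered]
    for handset, question, correct in responses:
        per_student[handset][0] += int(correct)
        per_student[handset][1] += 1
        per_question[question][0] += int(correct)
        per_question[question][1] += 1
    return dict(per_student), dict(per_question)

students, questions = progress_report([
    ("h1", "Q1", True), ("h1", "Q2", False), ("h2", "Q1", True),
])
print(students["h1"])   # [1, 2] -> one correct out of two answered
print(questions["Q2"])  # [0, 1] -> a question the class found difficult
```

Both views fall out of one pass over the data, which is why the same response log can support individual follow-up and whole-class diagnosis at once.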

3.1.4 Peer assessment – Assessment Principles 1 & 4
EVS handsets can be used for students to assess their peers on an assessment task, either formatively or summatively. The task is likely to be something visual, such as a presentation, a short piece of work, a debate or a poster. Students use their handsets to vote on their peers’ work. Academics must set the grading / marking scheme beforehand and clearly articulate it to students, e.g. A – Excellent, B – Good.

Benefits include

  • Students engage with the marking criteria and can have a voice.

Considerations include

  • Clear guidance is required
  • Does not capture qualitative feedback from students.
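Turning a set of peer votes into a single grade is a small aggregation step. The sketch below is illustrative only: the A–E letter scale matches the example above, but the choice of the modal grade and the tie-breaking rule (ties go to the better grade) are hypothetical design decisions, not a prescribed method.

```python
# Illustrative tally of peer-assessment votes on a letter scale.
# The modal-grade rule and tie-breaking are hypothetical choices.
from collections import Counter

def modal_grade(votes):
    """Return the most common grade; ties broken in favour of the better grade."""
    counts = Counter(votes)
    # Sort by count, then by alphabetical order (A beats B on a tie).
    best = max(counts.items(), key=lambda kv: (kv[1], -ord(kv[0])))
    return best[0]

print(modal_grade(["A", "B", "B", "C", "B"]))  # B
```

A mean of numeric marks would work equally well; the point is simply that whatever rule is used must be fixed and announced before voting, as the section above stresses.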

3.2 Lecture plans and EVS

As a means to introduce EVS and highlight the potential of technology-enhanced teaching, a series of lesson plans was critiqued to identify where and when students were being actively engaged in their learning. It became apparent that student engagement was not always obvious and that there were periods of ‘lecturing’ to the students. It is known that a variety of student activities will engage students, and that attention levels are high during the first ten minutes of a lecture but drop as it continues if students are not actively involved (Bligh, 2002) [6]. The level of engagement is critical to the overall success of student learning. Simpson and Oliver (2006) [7] identified how interactive lectures that engaged students led to a greatly improved student learning experience.

The use of EVS as a technology-enhanced teaching resource was then mapped onto lesson plans so that opportunities for student engagement could be considered alongside traditional methods of teaching (see Figure 1 below). Having regular and timely assessment demands would increase student attention levels and potentially enhance their learning experience. The use of EVS in the lectures was directly linked to the document created by the team, “101 things to do with EVS”, which also embedded the Assessment for Learning Principles.

Figure 1

4 Resource Efficiency

Assessments should be designed to support student learning, but there is little empirical evidence to date about the time it takes an academic to create an assessment task. This should be seen as intrinsic to the whole assessment process: well-designed assessment tasks should be fair, equitable and transparent, and should also measure whether learning outcomes have been achieved.

While it could be suggested that all assessment practices should be of a ‘Rolls Royce’ standard, see Figure 2 below, this may not always be realistic or feasible. It may be better to aim for assessments that are efficient and effective for the relevant discipline.

Figure 2 - (Hornby, 2003) [8]

As part of this project, resource calculators were created which capture all aspects of the assessment process (see Figure 3 below). The initial thinking behind the resource calculators was to identify whether using EVS would be a more efficient means of managing summative assessment. It is appreciated that EVS is not a panacea for the ills of assessment when student numbers are large. However, it is important to think about how to manage large numbers of students effectively and efficiently through the entire assessment process.
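The arithmetic inside such a calculator is straightforward: fixed setup and administration time plus a per-student marking cost scaled by cohort size. The figures and activity breakdown below are entirely hypothetical and are included only to show the shape of the comparison, not data from the project's own calculators.

```python
# Illustrative resource calculator: total staff time for one assessment.
# All minute values here are hypothetical, not project data.

def assessment_minutes(setup, per_student_marking, admin, n_students):
    """Fixed setup/admin time plus marking time scaled by cohort size."""
    return setup + admin + per_student_marking * n_students

# Hypothetical comparison for a cohort of 200 students: an essay with
# 20 minutes of marking each, versus an EVS quiz marked automatically.
essay = assessment_minutes(setup=120, per_student_marking=20, admin=60, n_students=200)
evs_quiz = assessment_minutes(setup=240, per_student_marking=0, admin=30, n_students=200)
print(essay, evs_quiz)  # 4180 270
```

The sketch makes the structural point of this section visible: EVS front-loads effort into question design (higher setup), but the per-student marking term, which dominates for large cohorts, drops to nearly zero.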

Figure 3 – Typical activities for an essay with resources (time) required

Using EVS for summative assessment, compared to other assessments for larger student groups, can offer an enhanced experience not only for the student, in terms of prompt feedback, but also for academics, in terms of marking and administrative processes. Figure 4 below shows an example of four types of assessment by total assessment time in minutes.

Figure 4 – Assessment process time by size of group

5 Some of the challenges

During the academic year a number of issues have been raised and, in most cases, resolved.

5.1 Technological Issues

5.1.1 Channel Conflict

The most common issue reported, although it presented itself in different ways, was the same channel being used in all classrooms, so that student responses were not received in the correct place. All receivers have 82 channels, but each receiver automatically defaults to channel 41. These receivers work up to 200 ft, with longer-range receivers (400 ft) available for larger venues. Consequently, if a student submits an answer using an EVS handset, the response could be picked up by the closest receiver, which may be in a nearby classroom. The receiving computer does not even need to be running the software; it only needs to be switched on.
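The clash condition described above can be made explicit: two rooms within radio range of each other conflict whenever their receivers are set, or left defaulted, to the same channel. The sketch below is illustrative only — the room names, adjacency list and function are hypothetical; the default channel of 41 is the one noted above.

```python
# Illustrative check for channel clashes between nearby rooms.
# Receivers default to channel 41, so adjacent rooms left on the
# default can pick up each other's votes. Room names are hypothetical.

DEFAULT_CHANNEL = 41

def clashing_rooms(room_channels, adjacency):
    """Return pairs of adjacent rooms configured on the same channel."""
    return [
        (a, b)
        for a, b in adjacency
        if room_channels.get(a, DEFAULT_CHANNEL) == room_channels.get(b, DEFAULT_CHANNEL)
    ]

rooms = {"A101": 41, "A102": 41, "B201": 7}
neighbours = [("A101", "A102"), ("A102", "B201")]
print(clashing_rooms(rooms, neighbours))  # [('A101', 'A102')]
```

The practical remedy follows directly: assign distinct channels to rooms within receiver range of one another, rather than leaving every receiver on the factory default.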