On-line quizzes providing formative feedback:

more valuable than seminar attendance and prior study?

Setting the intervention in context.

Slightly over 15 years ago Graham Gibbs and Alan Jenkins (1992), in their seminal work Teaching Large Classes in Higher Education: How to Maintain Quality with Reduced Resources, made no mention of on-line assessment to support learning. However, just five years later, the Dearing Report stated that “C&IT will have a central role in maintaining the quality of higher education in an era when there are likely to be continuing pressures on costs and a need to respond to an increasing demand for places in institutions” (NCIHE, 1997: para. 13.2). More recently Cliff Allan, in his foreword to the 2005 HEFCE strategy for e-learning, wrote of the need to “carry forward strategies based on evidence of what works” (HEFCE, 2005: 1). In the eight years between the publication of the Dearing Report and the publication of HEFCE’s strategy for e-learning there was a huge growth in the use of on-line learning in higher education (see for example Jenkins et al., 2001). Yet, despite this growth and despite the calls for evidence-based strategies, there is little detailed evidence of what works and whether it can be replicated.

The intervention and its impact.

With this in mind, this paper explores the impact of optional on-line tests in Legal Method: a first-year, first-term compulsory undergraduate law module. This intervention coincided with a dramatic change in results. The percentage of fails dropped from 26.8% to 14.4%, the percentage of marks under 50% dropped from 54.9% to 36.3%, and the percentage of marks of 70%+ leapt from 6.3% to 15.5%. The impact was even more dramatic when the quiz takers (n=183) were separated from the non-quiz takers (n=191). Of the quiz takers only 4.4% failed the module, only 18.1% achieved marks below 50% and an impressive 24.6% gained marks of 70% and above. The marks of the non-quiz takers were very much in line with those of the whole cohort prior to the introduction of the quizzes: 24.1% failing, 54.2% receiving marks below 50% and only 6.8% getting 70%+.

Analysing the impact: a) who took the quizzes?

In order to investigate possible causes of the improvement a detailed analysis was undertaken of one cohort (n=201) to find out more about those who chose to attempt one or more quizzes (n=109) and those who did not (n=92). Amongst those entering on the basis of their A-levels (n=157) the qualifications of quiz takers were virtually identical to, though fractionally worse than, those of non-quiz takers (22.1 points cf. 22.6). If one accepts the contention that A-levels are a predictor of success (HEFCE, 2003) the expectation would, other things being equal, be that quiz takers and non-quiz takers would perform similarly, and yet, as seen above, quiz takers performed much better.

Interestingly older students (n=37) were found to be more likely to engage with the quizzes than their younger counterparts - a finding in line with Hoskins & Van Hooff (2005) and Mackie (2006), but running counter to the conclusion one might draw from the idea that the younger generation embrace a new style of learning (see for example Veen (2005, 2006) and the concept of Homo Zappiens). Non-British nationals (n=45) and those who entered on the basis of qualifications other than A-levels (n=44) were also more likely to attempt the quizzes. These findings suggest that formative MCQs may be particularly attractive to non-traditional entrants. Given concerns about the performance of many non-traditional entrants on university courses the fact that they engaged with the quizzes and that quiz takers generally did better is perhaps particularly encouraging.

Analysing the impact: b) was quiz taking a good predictor of success?

An analysis of the results in Legal Method for those students accepted on the basis of their A-level grades (n=157) found that the number of quizzes attempted was a better predictor of success than either seminar attendance or entry qualifications. All three variables correlated significantly with the results in the module, and no two predictors correlated so highly with each other (r > .9) as to raise concerns about collinearity (Field, 2005). The significance level for the correlation between A-level grades and Legal Method performance was only .022, whereas for both seminar attendance and quizzes attempted it was less than .001, indicating highly significant correlations. The Pearson correlation coefficient for A-level points was .161, for seminar attendance .331 and for quizzes attempted .372, making quiz attempts the strongest predictor of success. This raises interesting questions as to why the quizzes had such an impact.
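The Pearson coefficients reported above are simple to compute from paired scores. As a minimal sketch of the calculation (the figures below are invented for illustration and are not the study's data, which were analysed in SPSS):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Covariance numerator: sum of products of deviations from the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    # Denominators: square roots of the sums of squared deviations.
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: quizzes attempted and module mark for eight students.
quizzes = [0, 1, 2, 3, 4, 5, 6, 7]
marks = [38, 45, 41, 52, 55, 49, 63, 68]
r = pearson_r(quizzes, marks)
print(f"r = {r:.3f}")
```

A coefficient near +1 would indicate that marks rise almost in lockstep with quiz attempts; the moderate values reported in the study (.161 to .372) indicate weaker, though still significant, associations.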

Analysing the impact: c) why did quiz takers achieve better results?

The role of precisely targeted, immediate feedback, available whenever the student requires it, could be the key. Research suggests that students may be benefiting from feedback which is otherwise often lacking in modern-day HE (Bone, 2006; Clegg, 2004; Orsmond et al., 2002). The result could be connected with the immediacy of the feedback (Brass & Pilven, 1999; Dalziel, 2001; Hammer & Henderson, 1972; Mazzolini, 1999). It may be, as Driscoll (2001) suggests, that actively engaging the learner increases learning. It may simply be that feedback early in the course boosts confidence and understanding (Bone, 2006). Certainly there is earlier evidence to suggest that those who score well in on-line quizzes perform better overall (Bailey et al., 2001). However, the findings suggest that there may be more at work than simply a relationship between quiz taking and engagement. The finding that quiz taking had a greater effect on student performance than seminar attendance raises the question of whether such on-line tests have a particular power to foster self-belief, linking into the idea that people’s motivation and success are influenced by their belief in their own ability.

Analysing the impact: d) in other first year modules

The impact was not confined to the module in which the on-line quizzes were offered. Those who took quizzes in Legal Method outperformed non-quiz takers in all first-year law modules despite the fact that none of these other modules included on-line quizzes. Indeed, in one module the difference exceeded the 9% difference in average mark achieved in Legal Method.

Analysing the impact: e) over the course of the degree

This impact persisted throughout the degree. Those who engaged with the quizzes in the first-year, first-term module were more than twice as likely as non-quiz takers to graduate on time with a 2:1 or first. This was despite the fact that none of the other modules included on-line MCQs, that (as mentioned earlier) quiz takers with A-levels had entry grades similar to those of non-quiz takers, and that quiz takers were slightly over-represented among non-traditional entrants (non-A-level entrants, mature students and non-British nationals).

Where next?

The question that emerges is whether these results were a one-off or whether they can be replicated in other disciplines and at other universities. The author of the paper is very interested in working with colleagues in other disciplines and at other universities to see if the impact can be replicated elsewhere and to discover more about the factors that influence engagement with on-line MCQs.

Hopefully this paper may stimulate more investigation of these issues.

If anyone is interested in seeing whether this impact can be replicated by adding on-line MCQs to a module which otherwise remains unchanged, or has analysed the impact of on-line MCQs in other contexts, the author of this paper would be very interested to hear from you and can be contacted at

About the author

Paul Catley is Head of the Department of Law at the University of the West of England and a Fellow of the HEA. Since joining UWE Paul has been involved in the creation of REBEL (Research for Evidence Based Education and Learning), a cross-university group at the University of the West of England established to foster discussion of and research into education and learning. From 2006-2008 Paul was also an Associate of the UK Centre for Legal Education. Prior to working at UWE, Paul was a Principal Lecturer in Law with responsibility for quality enhancement and a University Teaching Fellow at Oxford Brookes University. This research stems from Paul’s time at Brookes.

Acknowledgements

Paul would like to express his thanks to Dr. Lisa Claydon of the University of the West of England for her support and assistance in this research and to Dr. Paul Redford of the University of the West of England for his advice on the statistical analysis of the data, subject to the usual caveat that any mistakes remain those of the author.

Bibliography

Bailey, M.A., Hall, B. & Cifuentes, L. (2001). Web-based Instructional Modules Designed to Support Fundamental Math Concepts in Entry Level College Mathematics: Their Effects, Characteristics of Successful Learners and Effective Learning Strategies. In WebNet 2001: World Conference on the WWW and Internet Proceedings (Orlando, FL, October 23-27, 2001)

Bone, A. (2006). The impact of formative assessment on student learning. Retrieved on 1st December 2007:

Brass, K. & Pilven, P. (1999). Using timely feedback on student progress to facilitate learning. In Cornerstones: what do we value in higher education? Canberra: Higher Education Research and Development Society of Australasia.

Clegg, K. (2004). Playing safe: learning and teaching in undergraduate law. Warwick: UK Centre for Legal Education.

Coates, D., & Humphreys, B. (2001). Evaluation of Computer-Assisted Instruction in Principles of Economics. Educational Technology and Society, vol.4, no.2, pp.133-144.

Dalziel, J. (2001). Enhancing web-based learning with computer assisted assessment: pedagogical and technical considerations. Information Technology, Education and Society, vol.2, no.2, pp.67-76.

Driscoll, M. (2001). 10 things we know about teaching online. Training and Development in Australia, vol.28, no.2, pp.10-12.

Field, A. (2005). Discovering Statistics Using SPSS, 2nd edn. London: Sage.

Gibbs, G & Jenkins, A. (1992). Teaching Large Classes in Higher Education: How to Maintain Quality with Reduced Resources. London: Kogan Page.

Gibbs, G., ed. (1995). Improving Student Performance through Assessment and Evaluation. Oxford: Oxford Centre for Staff and Learning Development.

Hammer, M., & Henderson, C.O. (1972). Improving Large Enrolment Undergraduate Instruction with Computer Generated, Repeatable Tests. Pullman: Washington State University.

HEFCE (“Higher Education Funding Council for England”). (2003). Schooling effects on higher education achievement. Issues Paper, July. Bristol: HEFCE. Retrieved from the World Wide Web on 1st December 2007:

HEFCE (2005). HEFCE strategy for e-learning. Policy Development Paper, March. Bristol: HEFCE. Retrieved from the World Wide Web on 1st December 2007:

Hester, J.B. (1999). Using a Web-Based Interactive Test as a Learning Tool. Journalism and Mass Communication Educator, vol. 54, no.1, pp.35-41.

Hoskins, S., & Van Hooff, J. (2005). Motivation and ability: which students use online learning and what influence does it have on their achievement? British Journal of Educational Technology, vol.36, no.2, pp.177-192.

Jenkins, M., Brown, T. & Armitage, S. (2001). Management and implementation of virtual learning environments: a UCISA funded survey. Retrieved on 1st December 2007 from the World Wide Web:

Mackie, S. (2006) Enhancing Student Learning through Online Interactive Tests. Paper presented to the Faculty of Business Learning and Teaching Away Day, University of the West of England.

NCIHE (“National Committee of Inquiry into Higher Education”). (1997). Report of the National Committee of Inquiry into Higher Education, (Dearing Report) Retrieved on 1st December 2007 from the World Wide Web:

Orsmond, P., Merry, S. & Reiling, K. (2002). The student use of tutor formative feedback. Paper presented at the Learning Communities and Assessment Cultures Conference organised by the EARLI Special Interest Group on Assessment and Evaluation, University of Northumbria, 28-30 August. Retrieved on 1st December 2007 from the World Wide Web:

Veen, W. (2005). Net Generation Learning, Teaching Homo Zappiens. Retrieved on 1st December 2007 from the World Wide Web:

Veen, W. (2006). A New Force for Change: Homo Zappiens. Retrieved on 1st December 2007 from the World Wide Web:

Paul Catley - HEA Annual Conference 2008 1st July 14:00 - 14:30