Title: Technology Enhanced Learning and Testing of Mathematics

Practitioner Paper

Author: Frank Doheny

Athlone Institute of Technology

Keywords: Virtual learning environment (VLE), SMART Board, Camtasia, Moodle, screencast, Flash video

Abstract:

The objective of this study is to develop students’ capacity for self-directed learning by adopting and integrating three separate technologies. The study encourages students to take greater responsibility for their own learning by challenging them to take control of their own continuous assessment marks.

This study takes place in Semester 2. I produced short tutorials on a digital whiteboard covering three major sections of the mathematics course. These were recorded and later edited using Camtasia, a screen recording software package. I also developed a set of quizzes in Moodle, the VLE used in Athlone IT, one quiz for each section. The questions were arranged in ten or more categories, with a minimum of five questions per category. The quiz selects one question at random from each category to produce a personalised test. I uploaded the screencasts into Moodle and advised students to watch them before attempting the relevant quiz.

Students are allowed to take the quiz as often as they wish, although penalties accumulate with each attempt. The highest marks scored by a student in the three tests are combined to form their continuous assessment mark for the semester.

The students may attempt the tests whenever and wherever they like, as the tests are unsupervised open-book exams. If a student is unhappy with the continuous assessment mark they have achieved, they can opt to sit a formal end-of-term exam.

I chose first-year students for this study because it promotes independent learning, a skill that is lacking in many school leavers, and because first year is not an award year, so the study will not affect their final mark.

The results will be fully gathered by the last week of March and will be analysed to answer the following questions:

How often was each quiz taken?

Did results improve on repeated attempts?

Did the subjects watch the videos between successive attempts?

What, in the subjects’ view, is a satisfactory result?

Within what range of continuous assessment marks are students more likely to sit the traditional end-of-semester test?

Technology Enhanced Learning and Testing of Mathematics

Aims and Objectives of the Study

The two aims of this study were to encourage students to engage in independent learning and to develop a method of assessment that would act as an assessment for learning rather than simply an assessment of learning. One broad definition of independent learning is “…the ability to take charge of one’s learning” [1].

The objective of the study was to use and integrate three separate technologies within a virtual learning environment (VLE), namely:

1. SMART Board, a digital whiteboard,

2. Camtasia, a screen recording software package,

3. on-line quizzes.

The application of these three technologies was the method used to encourage independent learning in students.

Students were challenged to take greater responsibility for their own learning by being given greater control over their own continuous assessment marks.

Methodology Applied

In order to promote independent learning, a skill that is lacking in many school leavers, I chose to launch this study with a cohort of first-year engineering students. I wanted to aid their transition from the teacher-directed learning environment of the secondary school system to the third-level system, where students are expected to take responsibility as adults for their own learning.

The study took place in Semester 2. The three main topics on the mathematics syllabus for Year One in Semester 2 are Series, Integral Calculus and Complex Numbers. I produced eleven short tutorials using a digital whiteboard. Each tutorial was between three and ten minutes in duration and demonstrated the method required to solve a standard mathematical problem associated with one of the three aforementioned topics. The tutorials were recorded and later edited using Camtasia. Although the SMART Board has a recording capability, the files produced tend to be much larger than the screencasts produced using Camtasia. Wikipedia [2] defines a screencast as a digital recording of computer screen output, also known as a video screen capture, often containing audio narration. Camtasia allowed me to produce the screencasts as Flash videos. Flash video is a very popular file format used to deliver video over the internet and is viewable on most operating systems via the universally available Adobe Flash Player. Producing the files as Flash videos also allowed for an added interactive element in the video.

Having deconstructed the problem set associated with each topic into ten or more categories, I then created a set of five or more questions for each of these categories. The questions generally required a single numerical answer. If more than one correct solution existed, any one of the correct solutions would be accepted.

I uploaded the questions into my course pages in Moodle, the virtual learning environment adopted by Athlone Institute of Technology, and developed a set of three quizzes, one for each topic. Each time a quiz was attempted, it selected one question at random from each category to produce a constantly changing personalised test. Students were allowed one hour to complete each quiz.
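
As an illustration of this selection mechanism, the short Python sketch below assembles a personalised paper by drawing one question at random from each category. The bank contents and function names are invented for illustration and do not reflect Moodle’s internal implementation.

```python
import random

# Illustrative question bank: each category holds five or more questions.
question_bank = {
    "Arithmetic series": ["Q1", "Q2", "Q3", "Q4", "Q5"],
    "Geometric series": ["Q6", "Q7", "Q8", "Q9", "Q10"],
    "Binomial theorem": ["Q11", "Q12", "Q13", "Q14", "Q15"],
    # ... one entry per category, ten or more in total
}

def assemble_quiz(bank):
    """Draw one question at random from each category."""
    return [random.choice(questions) for questions in bank.values()]

print(assemble_quiz(question_bank))  # a different paper on (almost) every attempt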

I also uploaded the screencasts into Moodle and advised students to watch these tutorials before attempting the relevant quiz. Moodle records every file a student opens, which allowed me to monitor the number of times students viewed each of the screencasts.
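
Moodle’s activity logs can be exported for analysis, and the monitoring described above amounts to a simple tally of view events per student and resource. The sketch below shows one way such a tally could be done; the file name, column names and action label are assumptions for illustration, not Moodle’s actual export format.

```python
import csv
from collections import Counter

# Hypothetical CSV export of the Moodle activity log; the real column
# names and action labels vary with the Moodle version and export settings.
views = Counter()
with open("moodle_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["action"] == "resource view":
            views[(row["student"], row["resource"])] += 1

for (student, resource), count in sorted(views.items()):
    print(f"{student} viewed {resource} {count} time(s)")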

Once a student completed a quiz, it was immediately graded within Moodle and the result returned. The questions they had answered correctly were shown, and the correct answer was highlighted for any question that had been answered incorrectly. Moodle recorded each test taken by a student, which allowed me to review all the answers and look for patterns of incorrect answers.
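
The grading rule can be sketched as follows: a numerical answer is marked correct if it matches any of the accepted solutions (within a small tolerance), and a correct answer is returned as feedback otherwise. The tolerance and data layout here are assumptions for illustration; Moodle’s own numerical question type works along broadly similar lines.

```python
def grade_answer(submitted, accepted, tol=1e-3):
    """Mark a numeric answer correct if it matches any accepted solution."""
    for solution in accepted:
        if abs(submitted - solution) <= tol:
            return True, "Correct."
    # On a wrong answer, return one correct solution as feedback.
    return False, f"Incorrect. A correct answer is {accepted[0]}."

# e.g. a quadratic with two roots: either root is accepted
print(grade_answer(3.0, [3.0, -2.0]))  # (True, 'Correct.')
print(grade_answer(1.5, [3.0, -2.0]))  # (False, 'Incorrect. A correct answer is 3.0.')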

Students were allowed and encouraged to make multiple attempts at each quiz. I decided that small penalties for incorrect answers would accumulate with each attempt in order to differentiate between those students who did well on their first attempt and those who took several attempts to answer the same number of questions correctly. The highest marks scored by a student in each of the three tests were combined to form their continuous assessment mark for the semester. Once a student attempted a quiz, they were prevented from repeating it for three days, during which they were advised to spend more time engaging with the topic and possibly watch the video tutorials.
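
The scoring scheme can be summarised in a short sketch: each attempt’s score is reduced by a penalty that grows with the attempt number, the best penalised score per quiz is kept, and the three best scores are combined. Note the simplifications: the sketch penalises per attempt rather than per incorrect answer, the 2% rate is an invented figure, and “combined” is taken as an average, since the paper does not state the exact formula.

```python
def best_mark(attempt_scores, penalty_per_attempt=2.0):
    """Keep the best score after a cumulative per-attempt penalty.

    The 2% penalty per prior attempt is an assumed figure for
    illustration; the study does not state the exact rate.
    """
    penalised = [
        max(score - penalty_per_attempt * i, 0.0)
        for i, score in enumerate(attempt_scores)
    ]
    return max(penalised)

# Illustrative attempts by one student at the three quizzes
quizzes = {
    "Series": [45.0, 62.0],
    "Integration": [58.0],
    "Complex Numbers": [40.0, 55.0, 70.0],
}

# "Combined" is taken here as an average; the paper does not give the formula.
ca_mark = sum(best_mark(scores) for scores in quizzes.values()) / len(quizzes)
print(f"Continuous assessment mark: {ca_mark:.1f}%")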

The students could attempt the tests whenever and wherever they chose, so long as they had access to a computer with a broadband connection. The quizzes were unsupervised open-book tests. The students also had the choice to take a traditional end-of-term examination, on the understanding that if they scored a higher mark in this exam than they had gained in the quizzes, then this would form their continuous assessment mark. This exam consisted of three questions, one on each of the three topics covered in class. Full marks could be achieved by answering any two questions correctly.

The first quiz, on Series, was opened at the beginning of week four of Semester 2 and remained open for three weeks. The second quiz opened in week seven and was again available for three weeks; it was followed by the last quiz, on Complex Numbers, which remained open for two weeks. The traditional end-of-term assessment was held at the end of week twelve. The first two quizzes were re-opened that week to aid revision and to allow students a last chance to increase their marks.

Results of the Study

Twenty-five students were targeted in the study. Seven of these students did not attempt any of the quizzes or the traditional end-of-term exam and therefore have no continuous assessment marks from Semester 2. The average attendance of this group in the mathematics class during Semester 2 was 37%.

Of the remaining eighteen students, sixteen attempted at least two quizzes and sixteen attempted the Easter assessment. The average attendance of this group in the mathematics class during Semester 2 was 78%. The results of the tests are shown in Figure 1 below.

Test / Average Mark / No. of Candidates / Proportion Passed
Series / 50.6% / 16 / 62.5%
Integration / 43.3% / 12 / 50%
Complex Numbers / 51.2% / 17 / 71%
Easter Exam / 43.9% / 16 / 44%

Figure 1

Having compiled the data recorded in Moodle, I determined that no correlation existed between the number of tutorial viewings and the scores achieved. Some students watched virtually every tutorial one or more times and still did not achieve a good mark, while two others who did not watch a single video file scored very well. However, most students who watched more than half of the tutorials managed to pass their assessment.
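
For readers who wish to reproduce this kind of check, a Pearson correlation between tutorial viewings and quiz scores can be computed with Python’s standard library (statistics.correlation, available from Python 3.10). The figures below are invented for illustration; the study’s raw data is not reproduced here.

```python
from statistics import correlation  # Python 3.10+

# Invented example data: tutorial viewings and best quiz score per student.
viewings = [0, 0, 3, 5, 7, 8, 10, 11]
scores = [72, 65, 38, 55, 44, 60, 35, 58]

r = correlation(viewings, scores)
print(f"Pearson r = {r:.2f}")  # values near zero suggest no linear relationship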

Surprisingly, some students who scored less than 40% in a quiz did not attempt the quiz a second time.

In all, only twenty-four repeat attempts were made. Of these, three resulted in a reduced score, one remained unchanged and the other twenty improved their mark. Of the three repeat attempts where the mark achieved was lower: one student had passed on the first attempt and spent only twenty-five minutes on the second; a second student had failed on the first attempt and then did worse on his second; and the third student made three attempts in total, his second being his worst, but he managed to pass on his third attempt.

In all, eight students scored higher continuous assessment marks in the quizzes and clearly benefited from the study. Of the nine other students who attempted the quizzes but scored better in the traditional test, six had already passed their continuous assessment using their marks from the quizzes.

Based on these results I believe the study was worthwhile and that it will benefit from further development.

Conclusions and Recommendations

Having completed the study I am now aware of a number of weaknesses in my methodology which will direct me in future years, particularly where first year students are concerned.

I believe it was unwise to inform students at the outset that they could make several attempts; allowing at most two attempts might have concentrated their effort.

I also believe that allowing the tests to remain open for three weeks was too long and resulted in some students putting the task on the long finger: they did not attempt the quiz while the material was still fresh.

The idea of penalty points might also have had a negative impact on the students, and its inclusion might better suit a more mature student cohort.

The “all or nothing” nature of the quiz questions also resulted in poorer scores for some. While this design was intended to concentrate their minds on the accuracy of their work, a small error resulted in a zero score being awarded. I will need to develop a greater variety of question types in which marks can be given for applying the correct methodology.

References

  1. Holec, Henri (1981). Autonomy in Foreign Language Learning. Oxford: Pergamon.
  2. Wikipedia (24 April 2009). Screencast. [Online]. Available at: [Accessed 29 April 2009].