Learning about learning from an on-line course

Authors: Harlen, Wynne, University of Bristol, UK and Doubler, Susan J. TERC and Lesley University, Cambridge, MA, USA

Paper presented at the British Educational Research Association Annual Conference, Edinburgh, September 11 – 13, 2003

This paper was also presented in the Symposium on Teacher Education and Professional Development in on-line Courses: Lessons from Recent Research and Development at the AERA 2003 Annual Meeting, April 21-25, Chicago

Abstract

This research compares the processes and outcomes of learning in a course studied online with one studied face-to-face. The course was Try Science, the introductory module of an online Master's programme in teaching science for elementary and middle-school teachers. It is designed to develop understanding of science content, the nature of inquiry, and learning and teaching through inquiry. The research questions were: What is the nature of the learning experience for students studying online and face-to-face? How do the processes and outcomes of learning online and face-to-face compare? What are the implications of the findings for the development of online and face-to-face courses?

Data were collected from three groups of teachers: two studying Try Science online (spring 2001 and spring 2002) and one studying the same content in a regular on-campus course (autumn 2001). Embedded assessment was used where possible; for example, a 'thought experiment' that formed part of the course served as a measure of understanding of the science content. Findings show that online participants spent more time on the course than on-campus participants, though with considerable variation among the online participants. Using online postings and classroom observation data, differences were also found in participants' reflection on their conceptual understanding, their own learning and the inquiry process. Online participants made more explicit reference to recognising the value of collaborative learning. It was evident that, for all participants, a greater amount of time was needed for change to take place in their established classroom practices.

Introduction

This paper reports a two-year study, funded by the NSF (ESI-9911770), of the learning processes and outcomes of Try Science studied on-line, compared with the same course content studied in a regular face-to-face course (Harlen and Altobello, 2003). Try Science was developed as an on-line course through collaboration between TERC and Lesley University. It is the introductory module in a master's course in science education for elementary and middle-school teachers. The whole course has the dual aim of developing teachers' understanding in science and improving their pedagogic skills relating to teaching and learning through inquiry. In the first six sessions of the 13-week course the focus is on learning science. During this time, participants conduct investigations at home relating to the properties of ice and water, floating and sinking, and dissolving. After a period of reflection on their own learning, the focus turns to children's learning and to what is required to involve children in learning through inquiry. In the final weeks, the participants apply what they have learned in this module by designing and teaching a short section of work, which they report and evaluate on-line. After the first week, participants are divided into groups of six or seven and are expected to respond each week to the posts of others in their group. The science and pedagogy sections of the course are facilitated on-line by two different people.

Research questions and research design

The purpose of the research was to answer the following questions:

  • What is the nature of the learning experience on-line and on-campus?
  • How do the processes and outcomes of on-line and on-campus study compare?
  • Are there learning outcomes that are more readily achieved through one form of study than the other?
  • What features of on-line or on-campus courses might be incorporated into the other to optimise learning in both situations?

Data were collected about the on-line course in spring 2001 (15 participants), and some similar data were also collected about a second on-line course in spring 2002 (13 participants). For the purposes of the research, a 13-week on-campus course was mounted at Lesley University as an addition to the usual programme of professional development courses. Teachers were recruited locally for this course, and no attempt was made to assign participants randomly to the on-line or on-campus courses. The on-campus course was taught by one person and ran in the fall of 2001 (18 participants). The second on-line course was studied for the purpose of replication and because one of its facilitators was the teacher of the on-campus course, thus giving some control of the teacher variable. The courses studied are summarised in Table 1.

Table 1: The courses studied

Course format / Time / Facilitators / Participants
On-line (1) / Spring 2001 / 2 / 15
On-campus / Fall 2001 / 1 / 18
On-line (2) / Spring 2002 / 2 (one being the teacher of the on-campus course) / 13

The extent to which the on-line and on-campus courses were similar in all but mode of delivery is problematic. Try Science was 'purpose built' for the on-line environment, and translating the course material for face-to-face delivery was not straightforward. The weekly course material was spread across the week for the on-line participants, whilst the bulk of the work was concentrated into a three-hour weekly session for the on-campus participants. There were, of course, differences in the mode of interaction (synchronous for the on-campus course and asynchronous for the on-line course), in the flexibility to adjust the pace of activities to suit participants within and between weeks, and in the role of the facilitator (the subject of a separate paper). Thus the research did not deal with a simple independent variable of course mode, and is best thought of as illuminating learning in the two environments.

Data collection

In order to answer the research questions data were collected for both course formats about:

  • The participants’ learning experiences (nature of interactions, time spent, reflection, etc.).
  • The participants’ learning outcomes (understanding of the science, perceptions of inquiry, of learning through inquiry, etc.).
  • The facilitators’ (instructors’) experience.

Decisions about how to collect these data were made so that the research interfered as little as possible with the processes of the course. Parts of the course were used as sources of information and the minimum additional demands were made on participants and facilitators. Thus, for example, the ‘thought experiment’ that was introduced at the end of the ‘science’ weeks was used as evidence of understanding of the science, and the lesson plans that participants were required to produce at the end of the course were used as data about the extent to which they were applying parts of the course relating to inquiry teaching. For the purposes of the research, participants were also asked to complete a thought experiment before the start of the course. The main additional time required on account of the research was for completing a questionnaire before the beginning of the course and again at the end, for responding to weekly questions about how much time had been spent on various activities required by the course, such as reading, journal work, hands-on investigation, etc., and for the end of course interview. Table 2 summarises the types of data gathered and the methods of data collection.

Table 2: Types and methods of data collection

Data collected / On-line / On-campus
Personal background / Pre-course questionnaire / Pre-course questionnaire
Course experience / Reading and analysis of postings; weekly record of time spent kept by participants; post-course questionnaire and post-course interview / Observation and analysis of each session; video recording of 3 sessions; weekly record of time spent kept by participants; post-course questionnaire and post-course interview
Change in understanding of the science presented in the course / Pre- and within-course responses to 'thought experiment'; post-course questionnaire / Pre- and within-course responses to 'thought experiment'; post-course questionnaire
Change in understanding of the meaning of inquiry in science / Pre- and post-course questionnaire / Pre- and post-course questionnaire
Change in view of inquiry teaching / Pre- and post-course questionnaire / Pre- and post-course questionnaire
Change in confidence in teaching science / Pre- and post-course questionnaire; post-course interviews / Pre- and post-course questionnaire; post-course interviews
Application of strategies for inquiry teaching and learning / Lesson plans developed within the course / Lesson plans developed within the course
Change in classroom practice / Sample class observations; self-report in post-course interviews / Self-report in post-course interviews
Facilitators' experience / Post-course interview; time log / Post-course interview; time log

We cannot report on all the findings, but focus here on three only: the course experience, change in understanding of the science and change in confidence in teaching science.

Findings

Course experience

Collecting data to document the learning experiences of participants presented the greatest challenge. Collecting the posts made by on-line participants was relatively simple and provided a complete record of interactions. However, turning these records into data for analysis and interpretation broke new ground; no examples could be found in existing work in this area. The scheme eventually devised involved coding all posts of each participant, session by session, using a set of categories reflecting the goals of Try Science. Comparisons of posts across participants for the two on-line courses (spring 2001 and spring 2002) showed no significant differences, despite the courses having different facilitators. Subsequently, comparisons were made only between the first on-line course and the on-campus course, partly because some data for the second on-line course were incomplete.
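The tallying step of such a coding scheme can be sketched as follows. This is a minimal illustration only: the category labels, participant identifiers and counts below are hypothetical, not the actual Try Science coding scheme or data.

```python
from collections import Counter

# Each post is coded with one or more categories, session by session.
# Category labels and data here are illustrative assumptions only.
coded_posts = [
    # (participant, session, categories assigned to the post)
    ("P01", 1, ["uses_inquiry_skills", "reflects_on_science_understanding"]),
    ("P01", 2, ["reflects_on_own_learning"]),
    ("P02", 1, ["uses_inquiry_skills"]),
    ("P02", 2, ["reflects_on_inquiry_process", "uses_inquiry_skills"]),
]

def incidence_by_category(posts):
    """Count how often each category occurs across all coded posts."""
    counts = Counter()
    for _participant, _session, categories in posts:
        counts.update(categories)
    return counts

counts = incidence_by_category(coded_posts)
# e.g. counts["uses_inquiry_skills"] == 3
```

Tallies of this kind, computed per course, are what allow the incidence of each category to be compared across the on-line and on-campus formats.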

For the on-campus course, the equivalent record of the course interactions and experiences was made by at least one of the researchers attending each session. Notes of class events were made throughout against a timeline, and three entire sessions were video-taped. However, these records could not capture individual experiences in the way that the posts did for the on-line course. Moreover, they referred only to the experiences in the weekly sessions and not to any individual work done between sessions, whilst the posts of the on-line participants covered all the interactions during each week. Records of time spent, kept by the participants, suggested that there was little interaction or practical work outside the class sessions. For the purposes of comparison with the on-line courses, the observer's notes were summarised using the same categories as the posts.

The results showed that in both on-line and on-campus courses there was frequent experience of using science inquiry skills in the first half of the course. In both, the incidence of 'extending the investigations' was limited, and explicit 'use of evidence to test predictions' was less frequent than anticipated. There were marked differences between the on-line and the on-campus courses in three categories related to reflection: reflecting on their own conceptual understanding of the science, reflecting on their own learning processes, and reflecting on the inquiry process.

In all cases these kinds of reflection occurred more often among the on-line participants. A possible reason is that asynchronous communication both prompted and enabled participants to take time and answer more thoughtfully; indeed, the on-line materials often asked participants to 'share their thoughts' in their posts. There was less evidence of such explicit encouragement in the observations of the on-campus course, although the face-to-face situation could well be considered to make it unnecessary.

Other differences, where a higher incidence was recorded for the on-line course, concerned the teacher's role in asking questions to promote inquiry, identifying inquiry skills in action, and applying aspects of the course to their own thinking or experience, again related to reflection.

Finally, differences in the categories 'recognising collaborative learning' and 'valuing first-hand inquiry', again with a higher incidence for the on-line course, were interesting, given that the on-campus participants were more obviously learning collaboratively. It may be that the on-campus participants did not find it necessary to comment on something that seemed so much a part of their normal course experience. It may also be a further indication of the greater reflection on their learning processes that was evident among the on-line participants.

Time spent by participants

A more readily quantified part of the participants' experience was the time spent on their studies. Each week, on-line participants were sent by e-mail a list of questions about their use of time on the course. On-campus participants were given similar questions to answer about the time they spent between weekly meetings; these records were collected by the observer each week. A notable feature of the results for the on-line course was the very wide range of time spent each week across participants. Rough inspection of posts showed that those who spent longer tended to post more often. When averaged, however, the times for the second on-line course were very similar to those for the first.

The overall average across the whole course showed that the on-line participants spent just over two hours more per week (a little over 40% more) than the on-campus participants. That is, the on-line learners spent just over 7 hours per week on average, whilst the on-campus learners spent just over 5 hours (including the three hours of the class sessions). Comparisons of reported time spent on particular activities, where comparable data existed, showed that on-line participants spent about three times as long as on-campus participants on reading course materials (not including posts) and about twice as long on communicating with fellow course participants by telephone or e-mail (apart from one week where the activity required such communication in order to produce a combined report).
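The percentage comparison above is straightforward arithmetic; a short sketch makes it explicit. The weekly figures of 7.1 and 5.0 hours used here are illustrative assumptions, standing in for the "just over 7" and "just over 5" hours reported:

```python
def percent_more(online_hours, campus_hours):
    """Percentage by which on-line time exceeds on-campus time."""
    return 100.0 * (online_hours - campus_hours) / campus_hours

# Illustrative weekly averages (the paper reports "just over 7" and
# "just over 5" hours; the exact figures here are assumptions).
online_avg = 7.1   # hours per week, on-line participants
campus_avg = 5.0   # hours per week, incl. the 3-hour class session

difference = online_avg - campus_avg            # about 2.1 hours more per week
extra_pct = percent_more(online_avg, campus_avg)  # about 42, "a little over 40%"
```

Note that the percentage is taken relative to the on-campus figure, which is why a two-hour gap on a five-hour base yields "a little over 40% more".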

Change in understanding of the science

As a measure of understanding of the science concepts covered in the first six weeks of the course, two 'thought experiments' were introduced in session 7 as part of the course material (see Figure 1).

Figure 1: Thought experiments

Note: This is a mental exercise. You do not actually conduct the suggested experiment; you merely think it through.

Choose one of the following scenarios:

Scenario 1: Iced Salt Water

You put a glass of salty water in the refrigerator and let it sit undisturbed overnight. In the morning, you take the glass out of the refrigerator, place it on your kitchen table, and carefully slip in three fresh-water ice cubes.

Your 'thought experiment':

Describe what you would see happen in the glass over the next two hours. (Assume that the solution is mixed to the same concentration as in the Adding Salt to the Picture investigation.) Justify the details of your description using evidence gathered from our investigations over the last six weeks.

Scenario 2: Below-zero Tide Pool

Imagine that you live near the Atlantic Ocean in Maine, where you can easily observe a closed tide pool that contains about one cubic metre of water. It’s January, and the weather has been unusually mild, averaging 17°C (63°F). But for the past three days the air temperature has been –5°C (23°F), and even colder temperatures are expected for the next seven days.

[A diagram of a tide pool cross section indicates that it is one yard wide and one foot deep at the middle]

Your 'thought experiment':

Describe what you think you would see if you visited the tide pool today. Then describe what you think will happen in the tide pool over the next seven days. Justify the details of your description using evidence gathered from our investigations over the last six weeks.

In order to establish change in understanding of the concepts developed in the course, the same two problems were sent to participants before the start of the course, at the same time as the pre-course questionnaire. On the second occasion, for the purposes of the research, participants e-mailed their answers to the 'thought experiments' to the facilitator, so that the understanding of individual participants could be assessed. They also sent their answers to two other participants in their group, so that responses could be discussed and modified before being shared across all participants in session 9. Thus, the 'thought experiments' were used formatively, to help learning, as well as summatively, to measure learning.