The Higher Education Academy Annual Conference July 2006 – Session papers
Automated Support for Writing Development: A Case Study
Norma Pritchett
University of Luton
Steve Briggs
University of Luton
Mark Gamble
University of Luton
Volker Patent
University of Luton
INTRODUCTION
Highly developed language skills are an important asset for higher education students (Ahmed and McMahon, 2006). During their HE career they will be expected to engage with complex ideas which are predominantly mediated in linguistic form, and to communicate extended arguments around these ideas in linguistic form. Language proficiency is possibly the most important influence on academic development since it affects the way students understand, assimilate and use the information they encounter. Language proficiency has implications not only for intellectual development and academic achievement, but also for subsequent employment at an appropriate graduate level, since the employment market expects high levels of communication skills in graduate recruitment (Madamba and De Jong, 1994; Davila and Mora, 2004; Patterson, 2006).
Some students entering higher education will already possess sophisticated language skills, but many will enter with language skills which are insufficiently developed for them to exploit their higher education opportunity to the full. It is possible to speculate about a number of reasons which might account for this (Carey and Weiner, 2006). Some secondary school teaching has tended to reward communicative intent while de-emphasising the teaching of grammar and related language topics. In addition, popular social communication is increasingly telephone, e-mail or text-message based. The cultural background of many students tends to support oral communication rather than formal language skills.
The changing nature of the HE student body means that the stereotypical notion of a university student as being an articulate, middle class 18-21 year old, may no longer apply (McMahon, 2004; Wall, 2006). The expanding international market for UK higher education and widening participation initiatives have encouraged entry from more socially and ethnically diverse sources so that some students will come from homes where English, in its traditional form, is not the first language for them, their parents or their community. Such students may therefore need some development of their language and literacy skills in order to facilitate their achievement at university (Cownie and Addison, 1996).
Support for such language development is best embedded within the curriculum, yet few academic departments will have the time, resources or expertise to work individually and intensively with students in developing language skills. An alternative is to employ writing support technology which can supplement the formal support provided and can be accessed independently by the student when required.
To date there is very little available in the way of suitable technological support for developing language and writing competence in HE students. At the University of Luton (UoL) it was decided to evaluate one possible option, a commercially available online writing tool called Criterion, which appeared to offer enhanced opportunities for technologically assisted support in academic writing. Although Criterion was developed for a different educational context, namely North American high school and first year college students, the features of the tool were thought to be adaptable to UK higher education students. At UoL it was intended that Criterion would be made widely available across the university if it was found to be effective and appropriate.
CRITERION
Criterion is a web based writing support tool which was developed at MIT and is marketed by Educational Testing Services. It was designed to be used with an element of teacher supervision and involvement but it allows the student to work independently and submit their work to Criterion to get immediate feedback and advice.
Criterion provides feedback to students through two language feedback systems, both based on natural language processing methods:
The first of the Criterion feedback systems is the e-rater, which provides an overall holistic score for the quality of a student's essay. A holistic score ranges from one (very poor) to six (very good). By using the second feedback system within Criterion (discussed next) a student should be able to improve the overall score awarded to an assignment.
The second feedback system within Criterion is the critique, which is an umbrella name for five trait-specific areas on which Criterion provides feedback. The trait feedback addresses areas of language error covering grammar (issues such as pronoun errors, ill-formed verbs and subject-verb agreement); usage (mistakes involving, for example, confused words, preposition errors and non-standard verb or word forms); and mechanics (concerns such as spelling, missing commas and missing apostrophes). Criterion also provides students with feedback relating to overall essay style (issues such as repetition of words or sentence length) and essay organisation and development (highlighting where within the essay the introductory material, thesis statement, main ideas, supporting ideas and conclusion can be found). When a student enters an assignment into the Criterion system they receive feedback on each of these areas within seconds.
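Criterion's underlying natural language processing models are proprietary and far more sophisticated than anything that could be reproduced here. Purely as a hypothetical illustration of the kind of surface-level checks that trait feedback reports (for example repeated words or over-long sentences under the style category), a minimal Python sketch might look as follows; the function name, thresholds and messages are invented for illustration and are not Criterion's implementation.

    # Toy illustration only - NOT Criterion's actual method.
    # Flags two simple surface-level issues of the kind a trait feedback
    # system might report under "style": repeated adjacent words and
    # sentences that exceed a (hypothetical) word-length threshold.
    import re

    def simple_style_feedback(text, max_sentence_words=40):
        feedback = []
        # Repeated adjacent words, e.g. "the the"
        for match in re.finditer(r"\b(\w+)\s+\1\b", text, flags=re.IGNORECASE):
            feedback.append(f"Repeated word: '{match.group(1)}'")
        # Over-long sentences (split crudely on sentence-ending punctuation)
        for sentence in re.split(r"[.!?]+", text):
            words = sentence.split()
            if len(words) > max_sentence_words:
                feedback.append(f"Long sentence ({len(words)} words)")
        return feedback

    print(simple_style_feedback("The the essay argues that results were clear."))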
Criterion also permits tutor access to scrutinise and comment on students' writing as they submit each version of their work. This is supported by Criterion's archive feature, which automatically saves a student's first and most recent assignment submissions.
(For full details about the technological set up and evaluative effectiveness of the Criterion language programme please see Burstein, Chodorow and Leacock, 2003).
METHOD
DESIGN
This project was intended to evaluate the effectiveness of an online language development and support tool in the promotion of student written language proficiency. The investigation adopted both a quantitative and a qualitative methodology. How these approaches were utilised is discussed below.
Quantitative approach
It was originally intended that this evaluation would employ a robust experimental design. It was planned that approximately 250 student participants would be drawn from the Business Faculty at UoL. These were to include a group of non-UK students with poor English language skills; a group of UK students who would benefit from language support; and a mixed group, the latter acting as a control. It was then intended that pre- and post-intervention language proficiency data would be collected, from which inferences about the effectiveness of the language development tool could be made.
Although there was early interest from the lecturing staff involved, as time went on the participating staff were insufficiently involved to make the initial experimental design feasible. There were a number of reasons for this, around time pressures, workload and difficulties in accommodating the use of the software in their teaching time. The experimenters were forced to rethink the evaluation process and to find different teaching groups to involve and alternative means of accessing students, for example via study support. The plan of separate experimental conditions amenable to statistical analysis was therefore modified, and data from participant groups were pooled to give one overall sample amenable to some statistical investigation.
Qualitative approach
It was intended that quantitative evaluation would be supplemented with qualitative follow-up work. Therefore interviews and open-ended questionnaires were used to ascertain student and staff perspectives of Criterion. Qualitative exploration provided a useful insight into differences between the effectiveness of the induction processes and also valuable information around usage and perceptions of the effects that Criterion had on students’ written English.
PARTICIPANTS
Staff
In total 36 members of UoL staff participated in different aspects of the Criterion evaluation, four of whom were responsible for inducting students on to Criterion.
Students
Although access was provided for 250 students to be enrolled on to Criterion, only 78 students received detailed explanation and induction. Of those students who were inducted only 56 provided demographic data. All students that participated were drawn from across a range of subjects / disciplines.
The following is a breakdown of demographic data provided by the 56 students.
Students that participated in the Criterion project were predominantly 'Home' (UK) students. However, 22 participants reported coming to the UoL from outside the UK. The following table shows whether participants were Home, European or International students:
Are you a Home (UK), European or International student?
Home (UK): 60.7% (34)
European: 16.1% (9)
International: 23.2% (13)
Within this investigation, slightly more than half of the participants (53.6%) spoke a first language other than English. The following table details what participants described as their first language.
First language / Frequency
English / 26
Bengali / 2
Chinese / 6
Greek / 3
Yoruba / 3
Other / 16
The ‘other’ category included students who spoke, for example, ‘Afrikaans’; ‘Danish’; ‘Estonian’; ‘French’; ‘Hungarian’; ‘Japanese’; ‘Kutchi’; ‘Lithuanian’; ‘Luganda’; ‘Mandarin’; ‘Polish’; ‘Portuguese’; ‘Spanish’; ‘Swahili’; ‘Tamil’; and ‘Twi’.
The majority of participants (69.7%) were studying for a Bachelors qualification, as can be seen in the following table, which indicates the level at which participants were studying.
What is your level of study?
Access Degree: 5.4% (3)
Bachelors (BSc, BA, LLB): 69.7% (39)
Masters: 16.1% (9)
Other (please specify): 8.9% (5)
Just over half of the participants (55.4%) in this investigation were in their first year of study at the UoL. The following table indicates the proportion of participants in each year of study.
What is your year of study on this award?
1: 55.4% (31)
2: 28.6% (16)
3: 14.3% (8)
4: 1.8% (1)
The following table shows that the majority of participants (96.4%) were studying for their qualification on a full-time basis.
Are you a full-time student or part-time?
Full-time: 96.4% (54)
Part-time: 3.6% (2)
MATERIALS AND APPARATUS
Criterion
Details of the Criterion program are given above.
Questionnaires
Questionnaires were developed to ascertain student demographic information; student reflections on using Criterion; staff assessment of student language development needs; and impressions of Criterion usability.
Semi-structured interviews
Semi-structured interviews were used to explore UoL staff perceptions of Criterion in the areas of benefits, limitations and usability.
PROCEDURE
Staff were introduced to and shown how to use Criterion by the UoL Criterion project officer.
Students were introduced and enrolled on to Criterion in one of two ways. (1) Students were inducted in a class-wide setting by a lecturer who was willing to participate in the investigation. (2) Students were referred to Criterion by the UoL study support co-ordinator and then inducted on to the program in a small group or on a one-to-one basis by the UoL Criterion project officer.
RESULTS
The following is a brief account of preliminary findings. Continuing work is being undertaken to explore how students use Criterion over an extended period of time.
UoL STUDENT LANGUAGE NEEDS
All staff who responded to an online questionnaire around student language needs affirmed that there was a need for additional written language support.
Academic staff perceived the biggest areas of student language difficulty to be grammar, sentence construction and spelling. This is exemplified in the following comments received:
‘Grammar, including punctuation, sentence construction, and spelling’.
‘Writing in English in general - so sentence construction so what they write makes sense. They often hand in work at the last minute so don't take the time to read through to see if it makes sense or there are no typos etc. Some students miss the point of the task and thus struggle to write relevant material’.
All staff felt that there was a need to improve feedback on written work. When asked to elaborate about feedback on students’ writing three key themes emerged:
(A) Differences between lecturers - in that some will correct poor English whilst others will ignore it.
(B) Feedback on content, not quality of writing, is most important.
(C) The nature of a course format will affect the type and quality of feedback given.
Some respondents also commented on their own ability to give good feedback on written language.
‘I can comment on the quality of their writing (and do so), but I do not have the specialist knowledge to correct it. I am not an English specialist. Students need to understand that the quality of their written language is closely bound to the clarity of their argument’.
A number of staff felt that feedback should be given as soon as possible in order for students to utilise it.
The majority of staff respondents indicated that they could see a use for an automated written language feedback tool.
ASPECTS OF CRITERION USAGE
Usage by students after induction
Students inducted by / Total number of students inducted / Number of students who used Criterion after their induction
Staff instructors / 38 / 0
Project officer / 40 / 21
It is clear from the above table that usage of Criterion was closely linked to the nature of student enrolment and induction. Inductions carried out by the project officer were either one-to-one or small groups whereas staff instructors performed class wide inductions.
How many times did students self-report using Criterion?
This table shows how many times students reported they used Criterion whilst completing their assignments.
1 time: 14.3%
2 - 5 times: 57.1%
6 - 10 times: 21.4%
11 - 15 times: 7.1%
16 times plus: 0.0%
This shows that the vast majority of students used Criterion more than once, which would suggest that they found it useful. Some caution is needed in interpreting these figures, however, given inevitable differences in the length of work students entered into Criterion and in students' written language competence.
What did students and staff perceive to be the most useful parts of Criterion?
The following table shows what students and staff reported to be the most useful features of Criterion. (Please note that some respondents raised more than one issue per category.)
Issue / From students / From staff interviewees
Trait feedback categories / 5 / 2
Everything / 4 / 0
Easy to use / access / 2 / 3
Helps student personal development / 5 / 1
Helps improve quality of assignments / 8 / 0
Macro benefits of an online language support tool / 3 / 1
Other / 1 / 0
These findings identified that both students and staff perceived there to be a range of benefits associated with using Criterion.
What did students see as the most useful Criterion trait feedback categories?
Grammar: 10
Usage: 8
Mechanics: 7
Style: 11
Organisation and development: 7
This table shows that students' perceptions of their language support needs were consistent with staff perceptions (see above).
Quantitative findings
A repeated-measures t-test was used to ascertain whether access to the Criterion feedback facilities affected the holistic scores students received for their first and last submissions of a preset Criterion assignment. A statistically significant increase was found in holistic scores from the first submission (M = 2.94, SD = .759) to the last submission (M = 3.22, SD = .706), t(31) = -3.044, p < 0.005.
A repeated-measures t-test was used to determine whether access to the Criterion feedback facilities affected students' error coefficients (calculated by dividing the total number of trait errors by the total number of words) for their first and last submissions of a preset Criterion assignment. A statistically significant decrease was found in the coefficient from the first submission (M = .189, SD = .122) to the last submission (M = .109, SD = .093), t(35) = 5.398, p < 0.0001.
These findings would initially suggest that Criterion helped students to improve the quality of their use of written English. However, these results should be treated cautiously for two reasons. (1) These results were obtained from a relatively small sample. (2) It is possible that the improvements are largely artifactual in that students learn to use the program to minimise errors but it is unclear whether this would translate into a more general improvement in writing. It should be noted that these results were obtained after a comparatively short period of using Criterion. Prolonged use could well produce more significant and generalisable results.
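To make the preceding analysis concrete, the following sketch shows how an error coefficient and a repeated-measures (paired) t-test comparing first and last submissions could be computed. The data values are invented and the use of Python's SciPy library is an assumption made purely for illustration; the figures reported above come from the project data, not from this sketch.

    # Illustrative sketch only: hypothetical values, not the project's dataset.
    from scipy import stats

    # Hypothetical per-student trait error and word counts
    first_errors = [12, 9, 15, 7, 11]
    first_words = [420, 380, 510, 300, 450]
    last_errors = [7, 6, 9, 5, 8]
    last_words = [430, 390, 505, 310, 460]

    # Error coefficient = total number of trait errors / total number of words
    first_coeff = [e / w for e, w in zip(first_errors, first_words)]
    last_coeff = [e / w for e, w in zip(last_errors, last_words)]

    # Repeated-measures (paired) t-test comparing first and last submissions
    t_stat, p_value = stats.ttest_rel(first_coeff, last_coeff)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")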
DISCUSSION
As evidenced by the views of a sample of lecturing staff, there is a perceived need for writing development among their students. An electronic tool would be particularly helpful since it would encourage students to become more autonomous in developing their writing skills. A web-based tool has the additional advantage that students may use it at a time and place convenient to them.
Criterion has many advantages but it may not be the best option for general implementation across an HE institution, for a number of reasons, as discussed below.
The cost of Criterion is a disincentive to large scale use. The costs relate to the number of students registered to use the program. For the cost to be feasible in cost benefit terms, it is necessary for each student to use the facility assiduously and to show significant improvement. It was apparent that many students were not prepared to do this. Equally it is necessary for lecturers to implement and use the facility in their courses.
Each student using the facility needs to be registered in class groups and inducted on to Criterion. While this can be done in larger numbers it is best explained and demonstrated one-to-one or in small groups. Although the interface is relatively transparent, many students reported that even though they had understood how to use the tool initially, they had forgotten some procedures when they tried to use it independently. This was particularly the case where students had left some time before coming back to use Criterion.