A Pilot Study of Computer Literacy in Word Processing for Incoming College Students

Dr. Dan Bennett

Edinboro University of Pennsylvania

Mr. David Tucker

Edinboro University of Pennsylvania

ABSTRACT

This study attempts to determine whether a college freshman computer skills class emphasizing the proper use of a word processor such as Microsoft® Word is needed. We administered a test involving document formatting tasks ranging from simple to complex. The results indicate that while most students can perform simple tasks, few can perform the tasks required for preparing a term paper, and only a minority of students fully employ Microsoft® Word to accomplish these tasks automatically. Therefore, if long documents such as term papers are required, word processing skills still need to be taught.

KEY WORDS

Computer skills assessment, computer literacy.

1. Introduction

In this study we are interested in addressing two basic questions: (A) Do students who have not taken a computer skills class at the college have the ability to perform word processing tasks commonly associated with the production of a term paper? (B) Do these students fully utilize the features of Microsoft® Word that automate such tasks? For example, can they format a document, and do they build a bibliography by hand or with the built-in tool? Answering these questions will allow us to make more informed curriculum decisions about course design, student placement, and general education requirements.

There have been several studies of the skills of incoming freshmen on college campuses. One study surveyed students in Freshman Seminar classes about the types of technology they used and their skills with that technology [1]; in that study, students self-identified their ability to use Microsoft Office products. Another study investigated the ability of freshman business students to complete tasks involving Microsoft Office products by having them take a placement exam [2]; it focused on basic skills and determined that, for a significant portion of the population, a computer skills class is necessary. Wallace and Clariana [3] studied the difference between students’ actual and perceived ability with basic computer skills: 140 incoming business freshmen were tested on both Excel skills and computer concepts, and only 36% of the participants possessed the minimum skill level considered necessary for success in the business program.

2. Background

Edinboro University’s general education requirements include a skills section. Classes in this section are to be taken within the first 45 credit hours of study and are intended to provide students with the skills needed to successfully complete college-level classes. This category includes English, Mathematics, and Computer Skills. The focus of the Computer Skills requirement is to ensure that students can adequately employ office productivity tools to accomplish tasks required for college success, such as formatting term papers. Though many departments offer qualifying computer skills courses, the Mathematics and Computer Science department offers the course most frequently taken to fulfill the requirement: over the past five semesters, half of all students taking a computer competency course enrolled in the Computer Science skills course, of which ten sections are offered each semester [4].

In the 2011 academic year, the university-wide curriculum committee critically examined the general education requirements for Edinboro University students. In a number of open meetings, the possibility of removing the computer skills requirement was discussed. These discussions centered on a belief that students are computer literate and already possess the computer skills needed for success in college and beyond. The purpose of this study is to confirm or refute this proposition as it relates to word processing. In particular, we wish to determine whether students can employ Microsoft® Word to format a small report, and whether they accomplish the tasks by hand or employ the more efficient and practical methods provided by the software.

The solution to the study’s assessment is nine pages long, so it is possible to complete tasks such as creating a table of contents manually; however, the instructions emphasized the use of automated tools: “We are attempting to determine the level to which you use the automated tools within Microsoft® Word so please use these tools to accomplish the following tasks.”

This study examines whether students have the proficiency to create and maintain a large-scale word processing project, such as a term paper of fifty pages or more that requires many edits and updates before the final paper is complete. In such a paper, a table of contents, footnotes, and citation management, for example, would be very difficult and time consuming to create and manage by hand [5].

3. Methodology

A word processing skills assessment was created which required participants to format a document using Microsoft® Word. We chose not to use an automated assessment tool such as MyIT® Lab or SAM® 2010, which simulate the Microsoft® Word environment, because our classroom experience with these tools suggested they could bias the results: the tools do not always work correctly, and participants can become frustrated. The assessment tasks were chosen from our own experience teaching and assessing a computer literacy course, and were informed by tasks used in similar studies [6]. Furthermore, all tasks selected were related to the production of a term paper.

The assessment included 16 fundamental areas measuring skills from basic to complex. Tasks classified as simple can all be performed by highlighting text and pressing a button on the home tab; these include making text bold or italic, double spacing text, and justifying a paragraph. Medium tasks involve a button press (not necessarily on the home tab) plus additional text entry, or additional knowledge of the implications of using a particular feature; for this study, creating a title page, inserting a footnote, and changing section headings were considered medium tasks. Complex tasks involve multi-step procedures or knowledge of advanced formatting standards; these included building a table of contents, building a bibliography, inserting and using a citation, and numbering pages.

Of the 16 measures tested, we evaluated 14 for this pilot study. The tasks were selected from all three difficulty levels to ensure that all skill levels were measured. The grouped tasks assessed are listed in Table 1 along with the results. The assessment was designed to take no more than 50 minutes. This allowed the assessment to be administered during any standard class time.

A request was sent to all Computer Science faculty teaching face-to-face sections of the Computer Skills course (CSCI 104) on the main campus in the spring of 2011. All of the faculty agreed to administer the assessment in their sections within the first week of the course, before any word processing instruction had occurred. This course was chosen because it was likely to include a large percentage of recent high school graduates, and because it is taught in a computer lab, giving us a convenient way to administer the assessment. In addition, students in the course come from a variety of majors, providing a reasonable representation of the student body.

Each student was given an instruction sheet that included a unique identification number, a web address linking to the assessment tool, and a short demographic questionnaire. The web page contained an unformatted report, formatting instructions, and screen shots of some completed tasks. Students were instructed to download the report, complete the formatting tasks, and, when finished, e-mail the formatted report to a specified e-mail address. Students were also instructed to seek assistance, if needed, with tasks such as accessing the web page, saving the file, starting the word processor, and emailing the completed report; space was provided on the instruction sheet for proctors to note when such assistance was rendered. Finally, students were asked to employ automated tools whenever possible to accomplish the specified tasks.

Each of the nine sections of CSCI 104 is limited to 40 students, for a maximum of 360 possible responses. We received 337 completed demographic questionnaires. Students were asked to email the final formatted report to a specified address, a task completed by 293. They were also instructed to include the unique identification number from the instruction sheet both in the report and in the email, which 260 did successfully. The remaining 33 reports contained the example identification number from the instructions, the student’s university identification number, or some other invalid identifier. These tests were assigned a new, unique identification number and stored.

Sixty of the 293 submitted reports were randomly selected and graded for this pilot study. To ensure consistency in keying the data, the graders collaborated when creating the rubric, which was designed so that results were keyed with a Y or N for tasks such as justifying a paragraph. Five tasks, including the creation of a table of contents, were graded as correct, partially correct, or incorrect. The remaining six tasks, including building a bibliography, were judged as having been accomplished by hand or by using a built-in tool. Graders worked together on evaluation standards for each measure and graded a sample assessment together, thereby resolving any keying inconsistencies.
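The three keying schemes described above could be represented as a small rubric table that validates each grader's entry before it is recorded. The sketch below is hypothetical: the task names and labels are ours for illustration, not the study's actual instrument.

```python
# Hypothetical sketch of the three keying schemes described above.
# Task names and labels are illustrative, not the study's actual rubric.
RUBRIC = {
    # Keyed simply yes/no
    "justify_paragraph": ("Y", "N"),
    # Keyed correct / partially correct / incorrect
    "table_of_contents": ("correct", "partial", "incorrect"),
    # Keyed by the method the student used
    "bibliography":      ("tool", "by_hand", "not_done"),
}

def key_response(task: str, value: str) -> tuple[str, str]:
    """Validate a grader's entry against the rubric before recording it."""
    allowed = RUBRIC[task]
    if value not in allowed:
        raise ValueError(f"{value!r} is not a valid key for {task}; expected one of {allowed}")
    return (task, value)

print(key_response("table_of_contents", "partial"))
```

Centralizing the allowed labels this way is one simple guard against the keying inconsistencies the graders resolved by hand.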

Demographic data was also collected at the time of assessment. Each student was asked to provide gender, age group, and high school attended; to indicate whether they had taken word processing in high school and whether they had taken any college course that teaches word processing; and to self-evaluate their skill level with Microsoft® Word.

4. Conclusion

4.1 Summary of Findings

4.1.1 Demographics

Eight of the sixty selected reports did not have a correct identification number, one participant failed to submit a form, and one submitted a blank form. Therefore, the demographics for this pilot study are based upon 50 responses. Of these, 29 respondents identified as male and 19 as female. 38 reported having taken word processing in high school, 10 had not, and one was unsure. Three students reported having taken a college-level class that taught word processing.

Respondents expressed a high level of confidence in their ability to use Microsoft® Word. When asked “On a scale of 1 to 5, how well do you know Microsoft® Word? (1 - Never Used, 5 - Expert),” students in this group gave an average response of 3.6: one respondent rated themselves an expert (5), 27 reported a score of 4, 14 a score of 3, and three students reported a score of 2 or 1. This average is slightly higher than the 3.4 average across all submitted forms. It is also in line with other studies, which found a perceived degree of proficiency of 3.988 on a five-point Likert scale, where 1 is negligible or no skill and 5 is expert skill [6].

4.1.2 Task Completion

Table 1 shows a summary of our findings. On average, 89% of the students could complete the easy tasks. The one task with an exceptionally low completion rate was fully justifying a paragraph, which lowered the easy-category average by 8 percentage points. Based on our experience teaching computer literacy, we suspect this is because participants did not know what full justification means.

Level of Difficulty / Task / % Correct
Easy / Make text bold / 98%
Easy / Justify a paragraph / 63%
Easy / Italicize text / 97%
Easy / Double space text / 97%
Medium / Correctly insert footnote / 58%
Medium / Build title page / 88%
Medium / Section headings / 83%
Hard / Table of contents / 68%
Hard / Insert citation into database / 35%
Hard / Correctly set reference format / 13%
Hard / Build a bibliography / 42%
Hard / Cite a statement in the paper / 27%
Hard / Use page numbers / 38%
Hard / Use of section break for page numbering / 2%

Table 1: Summary of Task Completion Rate
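The category averages and between-category drops quoted in the text can be reproduced from Table 1 with a short calculation. This is a sketch in Python; the task groupings mirror the table, and small rounding differences from the figures quoted in the text are possible.

```python
# Completion rates transcribed from Table 1, grouped by difficulty.
rates = {
    "easy":   [98, 63, 97, 97],             # bold, justify, italic, double space
    "medium": [58, 88, 83],                 # footnote, title page, headings
    "hard":   [68, 35, 13, 42, 27, 38, 2],  # TOC, citation db, ref format,
                                            # bibliography, cite, page numbers,
                                            # section break
}

# Mean completion rate per difficulty category
averages = {cat: sum(vals) / len(vals) for cat, vals in rates.items()}
for cat, avg in averages.items():
    print(f"{cat}: {avg:.1f}%")

# Drops between categories, in percentage points
print(f"easy -> medium: {averages['easy'] - averages['medium']:.1f}")
print(f"medium -> hard: {averages['medium'] - averages['hard']:.1f}")
print(f"easy -> hard:   {averages['easy'] - averages['hard']:.1f}")
```

Note that the category means are unweighted averages over the tasks in each category, so a single outlier task (such as paragraph justification in the easy group) noticeably shifts its category's figure.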

Task completion falls 12 percentage points from the easy to the medium category, with a medium-category average of 77%. With the exception of footnote insertion, completion rates in this category were in the eighties. We also noticed that, of the participants who successfully added some form of in-text referencing, 12% created endnotes instead of the required footnote.

Within the category considered hard, we noticed a wide distribution of completion rates, ranging from 68% down to 2%. The average in this category is 32%, a 45 percentage point drop from the medium level and a 57 percentage point drop from the easy category. A surprising 68% created a table of contents, but no other hard task ranked above 50%. Only 2% were able to number the pages as specified: this task required the participant to insert a section break and unlink the sections so that each section could have its own page number format. We gave credit for page numbering if any form of page number appeared on all pages, which 38% of respondents achieved. Also notable is the students’ inability to correctly cite a source. They were asked to enter given citation information into the document’s citation manager, cite it at the correct location, and generate a bibliography at the end of the document in the MLA 6th edition style. Only 13% could correctly complete this task.

Within the group of tasks considered easy, the students overwhelmingly could complete the tasks. Problems arose when students faced the medium and hard tasks. Although at this stage of the study we considered only whether tasks were completed, not whether they were done with an automated tool or by hand, it is apparent that more instruction is needed in these areas if mastery is the goal.