Assessment Report, Information Literacy Project
Joanna Tillson, Elena Bianco, and Gary Parks
6-21-02

In this project we launched a pilot assessment related to some of the college's new Information Literacy (IL) General Education outcomes. Although we did not collect as much information to analyze as we had hoped, the project was successful in that we developed a survey instrument and tested implementation and reporting processes that can serve as starting points for future information literacy projects.

Our first step was to determine, through discussion among ourselves, which IL outcomes to assess. We selected a set of outcomes related primarily to identifying and accessing relevant resources: the kinds of outcomes taught in courses such as English 102 and Library 150, as well as in the library orientation given by librarian faculty.

Next, we devised a 16-question survey, with questions in multiple-choice and short-answer formats, that could be used to assess student knowledge and learning related to these outcomes. The survey was created in Word and then mounted in the survey function of the Blackboard course management system. Mounting the survey in Blackboard gave us a user-friendly, web-based interface for students as well as the ability to quickly collect percentage summaries of responses to the multiple-choice questions.

Many faculty feel inundated with survey requests and other information, so our invitation to take part (voluntarily) in the information literacy survey, broadcast to all English 102 instructors, drew responses from only three faculty members. These three participants (Parks, Murphy, and Kerns) nevertheless allowed the survey to be implemented in four sections of English 102, one of them an online course.

Meanwhile, Elena Bianco was able to implement the survey in a Health Occupations class that she visited for a library orientation. This gave us five classes in which to pilot the survey. However, because of inconsistencies in faculty participation, pre- and post-survey (i.e., early-quarter and late-quarter) information was collected in only three classes (two of Parks's sections and one of Kerns's).

Response percentages for the multiple-choice questions are immediately available for the two sections that used Blackboard (Parks's), and we will compare pre- and post-survey data for these courses. We also devised a point-based scoring rubric for the short-answer questions and have applied it to the same two sections. We have already discussed these preliminary findings among ourselves and will report and discuss them with the appropriate programs (English and the LMC) in the fall, without, however, any illusion that these sections speak to the general campus situation.

Jim James of the Office of Institutional Research and Assessment is working with us to set up a data collection template for the paper-based surveys, and we will report those data as they become available.

Based on our discussions of the initial findings and responses, as well as information students provided in a final "feedback" question on the survey, we have drafted revised survey questions to be used in the next phase of the project.

The most important accomplishments of this project were creating and piloting the survey, revising survey questions based on the findings, generating preliminary findings that will be shared as such with the appropriate programs, and identifying issues to resolve, such as the difficulty of gaining wider faculty participation in relevant classes.

If you have any further questions, please contact Gary Parks at gparks@ctc.edu.