Evaluating Specific Modules in a Digital Library Curriculum
Barbara M. Wildemuth
JCDL 2007, Education Workshop, Vancouver, June 18, 2007
*** Title slide
Educational evaluation can be conducted from two perspectives
Process: Is the process of instructional delivery and learning effective?
Product/outcomes: Did the learner learn something from participating in the process?
Today, I’ll focus on the evaluation methods we’re planning for the Virginia Tech/UNC collaborative project on developing a modular digital library curriculum that can be implemented in either CS or ILS graduate programs
*** Development and Evaluation Process (overview)
Spiral model of the development lifecycle
Beginning at the center
Vision/plan
Based on:
Perspectives of the research team
Input from our Advisory Board
Preliminary analysis of needs and context
Analysis of Computing Curricula 2001, developed jointly by the ACM and the IEEE Computer Society
Analysis of the syllabi of current digital library courses (both CS and ILS)
Consideration of curricular needs in CS and ILS
Consideration of student background and prior knowledge in CS and ILS
Design modules
Over the past year
Intended for use in the classroom by an instructor; not as independent learning modules
Strongly based on the DL courses now being taught
Current modules drafted:
History of digital libraries and library automation (1-b)
Architecture overviews/models (5-a)
Applications (5-b)
Information needs and relevance (6-a)
Search strategy, information seeking behavior, user modeling (6-b)
Reference services (7-b)
DL evaluation (9-c)
Evaluate via inspection (a focus for today’s discussion)
Feedback on the specific strengths and weaknesses of each module, as identified through inspection
Revise and implement modules
At UNC and Virginia Tech
At additional universities (in CS and ILS programs)
Still looking for volunteers for this fall and next spring
Evaluate in the field (the second focus for today’s discussion)
Final products: modules that can be re-used in a variety of curricular contexts
*** Overview, with “Evaluate via inspection” highlighted
The first focus for today is the procedures we’ll use for evaluation through inspection
*** Evaluate via inspection: Criteria
We’ve developed an inspection worksheet that includes the criteria listed here
We’re interested in your feedback on whether the criteria are adequate
Criteria:
Objectives: Are the objectives appropriate for the topic?
Are the objectives observable?
Will students be able to achieve the objectives, given the content in the body of knowledge?
Body of knowledge: Does the module address all areas of the topic that need to be addressed?
Will the body of knowledge enable students to achieve the objectives?
Are there any topics that are critical to add to the body of knowledge?
Are there any topics that should be removed from the body of knowledge?
Readings: Are the readings the best and most appropriate for the topic?
Are there any readings that are critical to add to the list?
Are there any readings on the list that should be removed?
Learning Activities: Are the activities appropriate for the topic?
Will students be able to accomplish the activities, given the content in the body of knowledge?
Will the activities enable students to achieve the objectives?
Can you think of any other class activities appropriate for this module?
Logistics: Is it feasible to teach the module as it is currently constructed?
Is the level of effort required, both in class and prior to class, appropriate to the scope of the body of knowledge?
Is the specified prerequisite knowledge sufficient for students to comprehend the body of knowledge?
Overall structure of the module: Is the module well structured?
Can the topics and their corresponding resources be easily divided?
Is there a clear mapping between the objectives and the content of the body of knowledge section?
Would it be better to use a table to clearly map the objectives to the contents of the body of knowledge section?
*** Evaluate via inspection: Participants
We’re still looking for volunteers to participate in the evaluation
It will be conducted this summer and fall
Members of the project Advisory Board
Participants in the JCDL Doctoral Consortium
Doctoral students with particular expertise in DLs
Others with particular expertise
Teachers of the DL courses we’ve identified
Authors of DL articles and books in particular areas covered by specific modules
Members of the ASIS&T SIG on Digital Libraries
*** Evaluate in the field (overview diagram)
After the initial evaluations, the modules will be revised and ready for implementation in the field
*** Evaluate in the field
Three types of data will be used in the evaluation
Teacher perceptions
The instructors using the modules will be individually interviewed
The interview will cover the same criteria as were used in the inspections
More valid, because based on actual experience using the modules within the context of a course
Student perceptions
Will try to disambiguate their perceptions of the modules from their perceptions of the instructor and the student-teacher interactions
Will not use a standard course evaluation questionnaire
Will use a questionnaire focused on students’ evaluations of the module content and their effort and learning during the module
Possibilities:
Snare (2000), Student Opinion Survey of the Learning Experience
22 items, Likert scale
McGorry (2003)
Perceived learning, adapted from Alavi (1994), 6 items
Student satisfaction, adapted from several studies from the late 1990s
Questionnaires will be administered immediately after each module used in a course, to capture the students’ perceptions of their learning upon completion of that module
Student outcomes
Student perceptions of how much they learned are only one perspective
We will also examine the work they complete during each module
Learning activities and assignments are suggested with each module
The instructor may also assign other work
We will ask instructors to provide us with the student work completed, based on whatever assignments were made
*** Development and Evaluation Process (overview)
We are hopeful that these two rounds of evaluation will provide sufficient feedback to make these modules useful to CS and ILS instructors teaching in the area of digital libraries
*** Additional information
I would like to acknowledge the other members of the project team
Ed Fox and Seungwon Yang at Virginia Tech
Jeff Pomerantz and Sanghee Oh at UNC
For more information, please visit our website