IST@IUB

R561: Evaluation in the Instructional Development Process (Section 6662)

Spring 2010

Class Time: Thursdays at 4:00 – 6:45 pm (1/14 – 4/29) (No class - 3/18, 4/1 & 4/22)

Tuesdays at 4:00 – 5:15 pm (Mar 2, Apr 6 & 27)

Location: Education 2275

Instructor: Dr. Yonjoo Cho (Education 2232)

Office Hours: 1:30 – 3:00 pm Tuesdays (appointment requested for other times)

Communications: 812-856-8144

TAs: Yeol Huh () and Dabae Lee ()

Course Access:

Course Description

R561 centers on evaluation as an integral element of the instructional technology (IT), human performance technology (HPT), and human resource development (HRD) processes. Training, performance improvement, and HR professionals need information about the impact and effectiveness of programs in terms of: (1) the degree to which program results achieve intended objectives, (2) whether or not results are desirable, and (3) evidence that results are achieved in a cost-effective manner. Principles and methods for evaluating instructional and performance improvement programs during the stages of analysis, design, development, implementation, and utilization are covered. Frameworks and models for planning and conducting evaluations are discussed and applied.

Course Objectives

At the completion of the course, students will be able to:

  1. Understand basic concepts and terminology associated with instructional, performance improvement, and HRD evaluation.
  2. Explain the purposes and uses of evaluation within different instructional, performance improvement, and HRD environments.
  3. Use qualitative and quantitative data collection techniques in evaluation activities.
  4. Analyze and interpret evaluation data and information.
  5. Report the results of evaluation activities.

Course Outline

The course is divided into six units to reflect the importance of major evaluation perspectives.

Unit 1: Basics of Evaluation

Develop common understandings of basic concepts and definitions, underlying principles and theories, and perspectives of the field of evaluation.

Units 2 to 5: Four Levels of Evaluation

Address four well-known perspectives for evaluating instructional, performance improvement, and HRD programs:

  • Gain knowledge of purposes, concepts, frameworks, theories, cases, and major issues in using different levels of evaluation in varied types of organizations.
  • Gain skill in developing the instruments required for data collection at each level of evaluation, including interview questions, questionnaires, test items, observation checklists, and ROI calculations.
  • Gain skill in analyzing collected data using both qualitative and quantitative methods.
  • Gain fieldwork experience by applying knowledge and practical skills in varied organizations, including business, education, military, and non-profit settings, as well as simulated situations.

Unit 6: Evaluation Synthesis

Create an evaluation framework that is applicable to a variety of organizational settings. This is developed through an action learning approach to the student group’s fieldwork evaluation project.

Course Requirements

Participation

Students are expected to: (1) submit bullet discussion points (at least one on a required reading) by Thursdays at 9:00 am and (2) actively participate in class activities, such as watching and discussing two movies (12 Angry Men and To Kill a Mockingbird). The instructor and TAs will assign grades by reviewing both the quality AND the quantity of participation.

Unit Exercises

Students will complete four unit exercises covering evaluation levels one through four: survey, test development, transfer, and impact. Each exercise is worth 10 points and assesses students' ability to analyze and synthesize what they have learned in each unit. Specific details will be provided in class.

Case Study Presentation

Teams of two students will present a case study of evaluating instructional, performance improvement, or HRD programs as practiced in various organizations. In their presentations, students are required to show their mastery of basic concepts and knowledge of evaluation.

Final Project

Students will work as a team to complete a project on one of the four options below:

1. A comprehensive organization-specific evaluation project

2. A predetermined organization-specific evaluation project (e.g., IST's Distance Education)

3. An evaluation of pre-existing evaluation programs:

  1. quality awards (e.g., Malcolm Baldrige Award)
  2. company-specific evaluation programs (e.g., Session-C at GE)
  3. accreditations (e.g., AACSB)
  4. performance standards (e.g., ISPI's CPT standards)

4. An EXTENSIVE literature review of a research topic. No more than two students may work on this option; it is highly recommended for doctoral students.

Reflection Paper

Students will write a three-page, single-spaced reflection paper. The purpose of this assignment is to help students reflect on what they have learned in the course in the context of their prior and present experiences as well as their future plans.

Grading Criteria and Due Dates

Assignment / Points / Due Date

1. Weekly Participation / 20 / weekly
2. Unit Exercises (Units 2 – 5) / 40 (10 x 4) / 2/11, 3/8, 3/25, 4/15
3. Case Study Presentation / 10 / your group's choice
4. Final Project and Presentation / 20 / 4/27
5. Reflection Paper / 10 / 4/29

Total: 100 points

Grading Policy

The following grading policy has been adopted for graduate courses in the School of Education. The percentages in parentheses were added by the instructor.

A (95-99%) = Outstanding achievement. Unusually complete command of the course content.
A- (90-94%) = Excellent achievement. Very thorough command of course content.
B+ (86-89%) = Very good achievement. Thorough command of course material.
B (83-85%) = Good achievement. Solid, acceptable performance.
B- (80-82%) = Fair achievement. Acceptable performance.
C+ (76-79%) = Not wholly satisfactory achievement. Marginal performance on the course requirements.
C (73-75%) = Marginal achievement. Minimally acceptable performance on course assignments.
C- (70-72%) = Courses with a grade of C- or lower may not be counted in graduate programs.
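The letter-grade cutoffs above are a simple threshold lookup. As one illustration, they can be encoded and checked mechanically (a minimal sketch; the function name and the label returned for scores below 70% are illustrative, not part of the official policy):

```python
def letter_grade(percent: float) -> str:
    """Map a course percentage to a letter grade per the School of Education scale."""
    # (lower bound, grade) pairs, checked from highest to lowest
    scale = [
        (95, "A"), (90, "A-"), (86, "B+"), (83, "B"),
        (80, "B-"), (76, "C+"), (73, "C"), (70, "C-"),
    ]
    for cutoff, grade in scale:
        if percent >= cutoff:
            return grade
    return "below C-"  # C- or lower may not count toward graduate programs

print(letter_grade(92))  # A-
print(letter_grade(84))  # B
```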

Be certain that you understand the evaluation criteria before you begin any of the projects. The evaluation guideline sheets (to be distributed in class) serve both as the grading criteria for the projects and as a checklist of the content to be included in all project assignments.

Plagiarism and Original Work

We expect that you will turn in original work (your own or your team's) for every part of every deliverable in this course. We also expect you to make every effort to acquaint yourself with the IU Code of Student Rights, Responsibilities and Conduct; with the concept of plagiarism (start with the required departmental tutorial "Understanding Plagiarism"); and with the ways in which you must credit the work of others and avoid presenting that work as your own (start with the resources from the Campus Writing Program and reference the APA style guide).

Team project work containing plagiarized material will be awarded a grade of F. At the discretion of the instructor, the project may be turned back to the team for correction of the problem before a specified deadline and re-graded for a grade equivalent to or lower than the grade the project would have otherwise received. If your individual work is discovered to be plagiarized or to contain plagiarized material, you will receive a failing grade for the course. These policies cover written and graphical work, and all work assigned in the course.

Required Textbook

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco: Berrett-Koehler.

Recommended (Optional)

Pershing, J. A. (Ed.). (2006). Handbook of human performance technology (3rd ed.). San Francisco: Pfeiffer.

Required Readings

Week 2

Cho, Y., Park, S., Jo, S. J., Jeung, C.-W., & Lim, D. H. (2009). Developing an integrative evaluation framework for e-learning. In V. C. X. Wang (Ed.), Handbook of research on e-learning applications for career and technical education (pp. 707-722). Hershey, PA: IGI Global.

Patton, M. Q. (2008). Utilization-focused evaluation: Processes and premises. In Utilization-focused evaluation (4th ed.) (pp. 559-582). Thousand Oaks, CA: SAGE.

Werner, J. M., & DeSimone, R. L. (2009). Evaluating HRD programs. In Human resource development (5th ed.) (pp. 196-234). Mason, OH: South-Western.

Week 2 (Optional)

Baldwin, T. T., & Ford, J. K. (1988). Transfer of training: A review and directions for future research. Personnel Psychology, 41, 63-105.

Holton, E. F. III. (2005). Holton’s evaluation model: New evidence and construct elaborations. Advances in Developing Human Resources, 7, 37-54.

Week 3

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.) (chapters 3-4). San Francisco: Berrett-Koehler.

Lee, S. H. (2006). Constructing effective questionnaires. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed.) (pp. 760-779). San Francisco: Pfeiffer.

Ritter, L. A., & Sue, V. M. (2007). Introduction to using online surveys. New Directions for Evaluation, 115, 5-14.

Week 4

Gilmore, E. R. (2006). Using content analysis in human performance technology. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed.) (pp. 819-836). San Francisco: Pfeiffer.

Thomas, M. N. (2006). Quantitative data analyses. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed.) (pp. 837-872). San Francisco: Pfeiffer.
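The Gilmore and Thomas chapters cover content analysis and quantitative data analysis for evaluation data, such as Level 1 reactionnaire responses. As a minimal illustration of the kind of descriptive summary a reactionnaire analysis produces (the item responses below are invented for illustration):

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical 5-point Likert responses to one reactionnaire item
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

freq = Counter(responses)  # frequency distribution of ratings
pct_favorable = 100 * sum(r >= 4 for r in responses) / len(responses)

print(f"n={len(responses)}, mean={mean(responses):.2f}, sd={stdev(responses):.2f}")
print(f"frequencies: {dict(sorted(freq.items()))}")
print(f"% rating 4 or 5 (favorable): {pct_favorable:.0f}%")
```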

Week 5

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.) (chapter 5). San Francisco: Berrett-Koehler.

Morrison, G. R., Ross, S. M., Kemp, J. E., & Kalman, H. K. (2007). Designing effective instruction (5th ed.) (chapters 5 & 11). Hoboken, NJ: John Wiley & Sons.

Week 6

Shrock, S., & Coscarelli, W. (2007). Criterion-referenced test development (2nd ed.) (chapters 7-8). San Francisco: Pfeiffer.

Week 7 (No class – I will present my study at the 2010 Academy of HRD Conference in Knoxville, TN.)

Week 8

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.) (chapter 6). San Francisco: Berrett-Koehler.

Pershing, J. L. (2006). Interviewing to analyze and evaluate human performance technology. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed.) (pp. 780-794). San Francisco: Pfeiffer.

Week 9

Pershing, J. A., Warren, S. J., & Rowe, D. T. (2006). Observation methods for human performance technology. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed.) (pp. 795-818). San Francisco: Pfeiffer.

Week 10 (Spring Break – Enjoy!)

Week 11

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.) (chapters 7-8). San Francisco: Berrett-Koehler.

Parry, S. B. (1996). Measuring training’s ROI. Training & Development, 50(5), 72-77.

Phillips, J. A., & Phillips, P. P. (2010). The business of program evaluation: ROI. In J. L. Moseley & J. C. Dessinger (Eds.), Handbook of improving performance in the workplace: Measurement and evaluation (pp. 219-239). Silver Spring, MD: ISPI.
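A recurring calculation in these Week 11 readings is return on investment: in the Phillips approach, ROI (%) = (net program benefits / fully loaded program costs) x 100, and the related benefit-cost ratio is BCR = benefits / costs. A minimal sketch with invented figures:

```python
def roi_percent(benefits: float, costs: float) -> float:
    """Phillips-style ROI: net program benefits as a percentage of fully loaded costs."""
    return (benefits - costs) / costs * 100

def bcr(benefits: float, costs: float) -> float:
    """Benefit-cost ratio: total monetary benefits per dollar of program cost."""
    return benefits / costs

# Hypothetical training program: $240,000 in monetary benefits, $150,000 in costs
print(f"ROI = {roi_percent(240_000, 150_000):.0f}%")
print(f"BCR = {bcr(240_000, 150_000):.2f}")
```

Note the distinction the readings emphasize: a BCR of 1.6 means $1.60 returned per dollar spent, while the corresponding ROI of 60% counts only the net benefit above cost.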

Week 11 (Optional)

Phillips, J. A., & Phillips, P. P. (2007). Project costs and calculating ROI. In Show me the money: How to determine ROI in people, projects, and programs (pp. 183-197). Houston, TX: Gulf.

Russ-Eft, D., & Preskill, H. (2005). In search of the holy grail: Return on investment evaluation in human resource development. Advances in Developing Human Resources, 7, 71-85.

Week 12 (No class – I will present my study at the International Action Learning Conference in U.K.)

Week 13

Chevalier, R. (2010). The changing role of evaluators and evaluation. In J. L. Moseley & J. C. Dessinger (Eds.), Handbook of improving performance in the workplace: Measurement and evaluation (pp. 354-374). Silver Spring, MD: ISPI.

Kaplan, R. S., & Norton, D. P. (1992). The balanced scorecard: Measures that drive performance. Harvard Business Review, 70(1), 71-79.

Parsons, J. G. (1997). Values as a vital supplement to the use of financial analysis in HRD. Human Resource Development Quarterly, 8, 5-13.

Week 14

Greene, J. C. (2007). Contested spaces: Paradigms and practice in mixed methods social inquiry. In Mixed methods in social inquiry (pp. 49-65). San Francisco: Jossey-Bass.

Russ-Eft, D., & Preskill, H. (2008). Improving the quality of evaluation participants: A meta-evaluation. Human Resource Development International, 11, 35-50.

Schwandt, T. A. (2007). Expanding the conversation on evaluation ethics. Evaluation and Program Planning, 30, 400-403.

Week 14 (Optional)

Ostrom, E., & Nagendra, H. (2006). Insights on linking forests, trees, and people from the air, on the ground, and in the laboratory. PNAS, 103(51), 19224-19231.

Case Studies

Arthur, W., Jr., Bennett, W., Jr., Edens, P. S., & Bell, S. T. (2003). Effectiveness of training in organizations: A meta-analysis of design and evaluation features. Journal of Applied Psychology, 88(2), 234-245.

Brinkerhoff, R. O. (2005). The success case method: A strategic evaluation approach to increasing the value and effect of training. Advances in Developing Human Resources, 7, 86-101.

Goodall, A. H. (2009). Highly cited leaders and the performance of research universities. Research Policy, 38, 1079-1092.

Kellogg, D. L., & Smith, M. A. (2009). Student-to-student interaction revisited: A case study of working adult business students in online courses. Decision Sciences Journal of Innovative Education, 7, 433-456.

Lee, Y.-F., Altschuld, J. W., & Hung, H.-L. (2008). Practices and challenges in educational program evaluation in the Asia-Pacific region. Evaluation and Program Planning, 31, 368-375.

Mabry, L. (2008). Consequences of No Child Left Behind on evaluation purpose, design, and impact. In T. Berry & R. M. Eddy (Eds.), Consequences of No Child Left Behind for educational evaluation. New Directions for Evaluation, 117, 21-36.

Nemanich, L., Banks, M., & Vera, D. (2009). Enhancing knowledge transfer in classroom versus online settings: The interplay among instructor, student, content, and context. Decision Sciences Journal of Innovative Education, 7, 123-148.

Rodriguez, H., Trainor, J., & Quarantelli, E. L. (2006). Rising to the challenges of a catastrophe: The emergent and prosocial behavior following Hurricane Katrina. The Annals of the American Academy of Political and Social Science, 604, 82-101.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59, 623-664.

Sleezer, C. M., & Spector, M. (2006). Assessing training needs of HIV program providers: A mixed methods evaluation. Performance Improvement Quarterly, 19(3), 89-105.

Tushman, M. L., O’Reilly, C. A., Fenollosa, A., Kleinbaum, A. M., & McGrath, D. (2007). Relevance and rigor: Executive education as a lever in shaping practice and research. Academy of Management Learning & Education, 6(3), 345-362.

R561 Course Schedule

Intro: Course Introduction and Overview

Unit 1: Basics of Evaluation
Unit 2: Reaction – Level 1 Evaluation
Unit 3: Learning – Level 2 Evaluation
Unit 4: Transfer – Level 3 Evaluation
Unit 5: Impact – Level 4 Evaluation
Unit 6: Evaluation Synthesis

Week 1 (Jan 14) – Intro
Topics: Personal introductions; course introduction; Movie 1: 12 Angry Men

Week 2 (Jan 21) – Unit 1
Topics: Overview; basics of evaluation (concepts and definitions; models and frameworks)
Readings: Cho et al. (2009); Patton (2008); Werner & DeSimone (2009)

Week 3 (Jan 28) – Unit 2
Topics: Reaction, Level 1 evaluation (purposes and uses; developing questions, dos and don'ts (Yeol))
Readings: Kirkpatrick (chs. 3 & 4); Lee (2006); Ritter & Sue (2007)
Assignments: Case study 1

Week 4 (Feb 4) – Unit 2
Topics: Reaction, Level 1 evaluation (assessing the reactionnaire; reactionnaire data analysis); case study presentation 1
Readings: Gilmore (2006); Thomas (2006); case study 1
Assignments: Case study 2

Week 5 (Feb 11) – Unit 3
Topics: Learning, Level 2 evaluation (fundamentals of testing; instructional objectives); case study presentation 2
Readings: Kirkpatrick (ch. 5); Morrison et al. (chs. 5 & 11); case study 2
Assignments: U2 exercise due; case study 3

Week 6 (Feb 18) – Unit 3
Topics: Learning, Level 2 evaluation (practical tests; learning data analysis); case study presentation 3
Readings: Shrock & Coscarelli (chs. 7-8); case study 3
Assignments: Case study 4; one-page proposal

Week 7 (Feb 25)
Topics: Movie 2: To Kill a Mockingbird (I will be presenting two studies at the 2010 AHRD Conference)

Week 8, Tuesday session (Mar 2) – Unit 4
Topics: Final project, one-page proposal (team-based clinic)

Week 8 (Mar 4) – Unit 4
Topics: Transfer, Level 3 evaluation (interviews); case study presentation 4
Readings: Kirkpatrick (ch. 6); Pershing (2006); case study 4
Assignments: U3 exercise due; case study 5

Week 9 (Mar 11) – Unit 4
Topics: Transfer, Level 3 evaluation (observation); case study presentation 5
Readings: Pershing et al. (2006); case study 5

Week 10 (Mar 18)
Spring Break – ENJOY!

Week 11 (Mar 25) – Unit 5
Topics: Impact, Level 4 evaluation (implementing the four levels; how to calculate ROI (Dabae))
Readings: Kirkpatrick (chs. 7-8); Parry (1996); Phillips & Phillips (2010)
Assignments: U4 exercise due

Week 12 (Apr 1)
No class: I will be presenting an action learning study at the International Action Learning Conference in Henley-on-Thames, UK

Week 13, Tuesday session (Apr 6)
Topics: Final project, pilot (team-based clinic)

Week 13 (Apr 8) – Unit 5
Topics: Impact, Level 4 evaluation (balanced scorecard (BSC); values)
Readings: Chevalier (2010); Kaplan & Norton (1992); Parsons (1997)

Week 14 (Apr 15) – Unit 6
Topics: Evaluation synthesis (meta-evaluation; mixed methods; evaluation ethics)
Readings: Greene (2007); Russ-Eft & Preskill (2008); Schwandt (2007)
Assignments: U5 exercise due

Week 15 (Apr 22) – Wrap-Up
No class: final project in progress (team-based work)

Week 16, Tuesday session (Apr 27) – Wrap-Up
Topics: Final project presentations
Assignments: Final project due

Week 16 (Apr 29) – Wrap-Up
Topics: Final project presentations (cont'd); reflections; let's celebrate!
Assignments: Reflection paper; online course evaluation

END
