
HRNS 247

Program Evaluation for Jewish Professional Leaders

Spring 2017

Thursdays, 12:30-2:00

Lown 202

Course Instructor:

Fern Chertok, 781-736-2079 / / Lown 106

Office Hours: By appointment, but you are always welcome to drop by Fern’s office to see if she is available to help you.

Learning Goals

Evaluation research is conducted to aid decision-making, planning, and policy analysis. It focuses on assessing and understanding how programs and policies affect individuals and communities. Jewish organizations sponsor and use evaluation research to enhance decision-making, both to assess ongoing initiatives and as part of developing new programs. The course focuses on methodological issues, but does so in the service of applying theory to practice. Along with a discussion of principles, exemplars will be drawn from a variety of Jewish communal policy areas.

As a result of participation in this course, students will:

  • Be conversant with the theory, terminology, techniques, and practice of evaluation research.
  • Develop a working understanding of the basic concepts of evaluation research and their application to a diverse set of program and policy problems relevant to the Jewish community.
  • Learn about the development and use of logic models as tools for understanding program inputs, outputs and outcomes.
  • Understand best practices in the use of qualitative and quantitative data collection strategies.
  • Gain a working knowledge of the ethical issues involved in evaluation research with special attention to the use of informed consent, voluntary participation and protection of participant privacy.
  • Gain the ability to critically review and use existing program evaluation studies to improve the effectiveness of decision-making.

Requirements

Students are expected to have completed a graduate-level statistics course and at least one graduate course relevant to the contemporary Jewish community. Course participants will be expected to attend class sessions and to participate actively in discussions. Each class will require prior preparation by critically reading the assigned materials.

Success in this two-credit course (with 1½-2 hours of class time per week) is based on the expectation that students will spend a minimum of 4½ hours of study time per week in preparation for class (readings, papers, homework, etc.).

Grading and Assignments

Grading will be based on class participation and an evaluation of each of the written assignments. The most weight will be given to the Logic Model and Research Brief assignments. Following are brief descriptions of each major assignment. More complete details and instructions will be part of future handouts.

  • Constructing a Logic Model – March 2

Develop a logic model that explains a program or policy in your area of interest. You may select a real program or policy or one that you devise. First, give a brief narrative description of the problem(s) or issue(s) that the program seeks to ameliorate or change (1 page). Then describe the theory of change on which the program or policy is based, including reference to supporting research (1 page). Using a diagram, lay out the program/policy target groups, components, actions or activities, the outputs, and the expected short- and long-term outcomes (2-3 pages). Finally, describe any gaps or missing links between program components and desired outcomes (1 page).

  • CITI Certification—March 23

For this assignment you will need to complete the Collaborative Institutional Training Initiative (CITI) training offered online through the Brandeis Office of Research Administration. You can access the training at:

This training program will take several hours to complete, and at the end you will receive a document indicating your successful completion. Email this document to us or print it out and bring it to class no later than March 24.

  • Research Briefing—May 4

The task is to develop a short (3-5 page) briefing summary about the available evaluation research related to a specific intervention/innovation/issue. You will need to critically review previously conducted research with an eye toward internal, external and construct validity, summarize the major findings and indicate the priority that should be given to each study. You will also need to outline the gaps in available research.

  • Additional brief assignments (will be discussed during class):

Oral response to one assigned reading

Required Text

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage Publications.

Note: Assignments are based on the 7th edition, but any edition is acceptable.

Additional readings will be posted on LATTE.

Academic Integrity: Violations of University policies on academic integrity, described in Section 3 of Rights and Responsibilities, may result in failure in the course or on the assignment and could result in suspension from the University. You are expected to be familiar with and to follow the University’s policies on academic integrity. Please see the full policies.

Disability Notice: If you are a student with a documented disability on record at Brandeis University and wish to have a reasonable accommodation made for you in this class, please see me during the first week of the semester.


Course Overview

Date / Topic / Assignment
January 19 / Intro to course; What is evaluation research/action research?
January 26 / Logic models
February 2 / Logic models continued; Building program theory
February 9 / Needs assessment; Assessing anti-Israel/antisemitic hostility on campus
February 16 / Outcome research; Quasi-experimental design; Threats to validity
February 23 / NO CLASS (BREAK)
March 2 / Implementation research / Logic Model Assignment
March 9 / Collecting qualitative data: observation/interviews/focus groups
March 16 / Qualitative data continued; Coding and analyzing qualitative data
March 23 / Ethics in evaluation / CITI certification
March 30 / Mixed methods research; Designing surveys
April 6 / Dissemination of findings
April 13 / NO CLASS (BREAK)
April 20 / Evaluation and the policy process
April 27 / The future of evaluation in the Jewish communal context
May 4 / Research Briefing Assignment

January 19 Intro to the Course/What are Evaluation and Action Research?

January 26 Logic Models

Rossi chapter 3

W.K. Kellogg Foundation. (December 2001). Logic Model Development Guide. Battle Creek, MI: W.K. Kellogg Foundation.

*Kaplan, Sue A. and Garrett, Katherine E. (2005). The use of logic models by community-based initiatives. Evaluation and Program Planning, 28(2): 167-172.

February 2 Logic Models Continued/Building Program Theory

Rossi chapter 5

*Chertok, F., Samuel, N., & Saxe, L. (2006). You shall tell your children: An evaluation of the ICHEIC Service Corps. Waltham, MA: Cohen Center for Modern Jewish Studies, Brandeis University.

*Sasson, T., Mittelberg, D., Hecht, S., & Saxe, L. (2011). Guest-host encounters in Diaspora-heritage tourism: The Taglit-Birthright Israel mifgash (encounter). Diaspora, Indigenous, and Minority Education, 5(3), 178-197.

February 9 Needs Assessment/Assessing Anti-Israel and Antisemitic Hostility on Campus

Rossi chapter 4

Saxe, L., Wright, G., Hecht, S., Shain, M., Sasson, T. & Chertok, F. (2016). Hotspots of antisemitism and anti-Israel hostility on US campuses. Waltham, MA: Steinhardt Social Research Institute.

*Shain, M., Chertok, F., Wright, G., Hecht, S., Koren, A., Gelles, R.J. & Saxe, L. (2016). Diversity, pressure and divisions on the University of Pennsylvania campus. Waltham, MA: Steinhardt Social Research Institute.

February 16 Outcome Research/Quasi-experimental Design/Threats to Validity

Rossi Chapters 7 & 9

Bamberger, M., Rugh, J., Church, M., & Fort, L. (2004). Shoestring evaluation: Designing impact evaluations under budget, time and data constraints. American Journal of Evaluation, 25, 5-37.

*Saxe, L., Sasson, T., Phillips, B., Hecht, S., & Wright, G. (2007). Taglit-Birthright Israel evaluation: 2007: North American cohorts. Waltham, MA: Cohen Center for Modern Jewish Studies, Brandeis University.

*Sherwin, J., Irie, E., Orenstein, N., Wilcox, S., & Gattozzi, E.A. (2013). The people of the book: An evaluation of the PJ Library program. Berkeley, CA: Informing Change.

*Birkeland, S., Murphy-Graham, E. & Weiss, C. (2005). Good reasons for ignoring good evaluation: The case of the drug abuse resistance education (D.A.R.E.) program. Evaluation and Program Planning, 28, 247-256.

*UJA-Federation of New York (2016). Insights and strategies for engaging Jewish Millennials.

March 2 Implementation Research

Rossi Chapter 6

Durlak, J. & DuPre, E.P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327-350.

*Chertok, F., Tobias, J., Boxer, M., & Rosin, S. (Winter/Spring 2012). Opening the black box: Lessons from research on immersive Jewish service-learning programs for young adults. Journal of Jewish Communal Service, 87(1/2), 31-43.

*Chertok, F. & Koren, A. (2014). Understanding the Israel Fellows Program: Program Theory and Implementation Challenges. Waltham, MA: Cohen Center for Modern Jewish Studies, Brandeis University.

*SRI Education. (March 2014). Research on the Use of Khan Academy in Schools.

March 9 Collecting Qualitative Data—Observation/Interviews/Focus groups

Maxwell, J. A. (2009). Designing a qualitative study. In L. Bickman & D. J. Rog (Eds.), Handbook of applied social research methods (pp. 182-213). Thousand Oaks, CA: Sage.

Patton, M.Q. (1999). Enhancing the quality and credibility of qualitative analysis. Health Services Research, 34(5), 1189-1208.

*Rector-Aranda, A. & Rader-Roth, M. (2015). ‘I finally felt like I had power’: Student agency and voice in an online and classroom-based role-play simulation. Research in Learning Technology, 23(1).

March 16 Qualitative Methods Continued/Coding and Analyzing

Sesno, F. (2017). Ask More: The Power of Questions to Open Doors, Uncover Solutions, and Spark Change. New York, NY: AMACOM. Chapters 4 & 5.

Lofland, J. & Lofland, L.H. (1995). Analyzing social settings: A guide to qualitative observation and analysis. Belmont, CA: Wadsworth Publishing Company. Chapters 4 & 5.

*Belzer, T. (2009). 2009 March of the Living Los Angeles Delegation: Report 1: Ethnography of the Journal.

March 23 Ethics in Evaluation

Rossi, pp. 404-411

Brydon-Miller, M., Rector-Aranda, A., & Stevens, D.M. (2015). Widening the circle: Ethical reflection in action research and the practice of structured ethical reflection. In H. Bradbury (Ed.), The SAGE Handbook of Action Research (3rd ed., pp. 596-607).

Collaborative Institutional Training Initiative (CITI) offered online through the Brandeis Office of Research Administration:

*McMurtrie, B. (January 26, 2014). Secrets from Belfast. Chronicle of Higher Education.

*Shaw (2003). Ethics in qualitative research and evaluation. Journal of Social Work, 3(1), 9-29.

March 30 Mixed Methods/ More on Designing Surveys

Bledsoe, K. L. & Graham, J. A. (2005). The use of multiple evaluation approaches in program evaluation. American Journal of Evaluation, 26, 302-319.

*Koren, A., Samuel, N., Boxer, M., & Aitan, E. (2013). Teaching and Learning about Israel: Assessing the Impact of Israeli Faculty on American Students. Waltham, MA: Cohen Center for Modern Jewish Studies, Brandeis University.

*Cohen, S.M., et al. (2010). Assessing the Impact of Senior Jewish Educators and Campus Entrepreneurs Initiative Interns on the Jewish Engagement of College Students.

April 6 Dissemination of Findings

Rossi, pp. 369-392

Weiss, C.H. (1998). Have we learned anything new about the use of evaluation? American Journal of Evaluation, 19(1), 21-33.

*Saxe, L. & Chertok, F. (2016). URJ blog about Millennial children of intermarriage.

*Chertok, F. (2015). The Power of Research on Birthright Israel: Millennial Children of Intermarriage. Presentation to the Birthright Israel Foundation Conference.

April 20 Evaluation in the Policy Process

Rossi, pp. 411-418

Weiss, C.H. & Bucuvalas, M.J. (1980). Truth tests and utility tests: Decision-makers’ frames of reference for social science research. American Sociological Review, 45(2), 302-313.

*Chertok, F. & Parmer, D. (2013). Living on the Edge: Economic Insecurity among Jewish Households in Greater Rhode Island. Waltham, MA: Cohen Center for Modern Jewish Studies, Brandeis University.

April 27 The Future of Evaluation in the Jewish Communal Context