
PART I: Grant Application Form

Title of the project: Advancing Ongoing Program Review as Proactive Pedagogy by

Creating Learning Communities that Foster Assessment

Institution requesting the grant: Saint Louis University

Name of project director:

Jay M. Hammond

Assistant Professor

Department of Theological Studies

Humanities Building #124     Phone: (314) 977-2986

3800 Lindell Boulevard     FAX: (314) 977-2947

St. Louis, MO 63108-3414     E-mail:

Amount of the grant request: $70,000

Time period to be covered by the grant: Three years (09-01-05 to 08-31-08)

List other sources of funding for this project to which you have made application:

In August DTS will submit a three-year ~$200,000 NEH materials development grant proposal for the department’s new Institute for Digital Theology. While the general purposes of the two proposals do not overlap precisely, there is substantial overlap in the area of electronic delivery of undergraduate and graduate assessment and evaluation tools that serve theological education. Sponsor: Grants for Teaching and Learning Resources and Curriculum Development, National Foundation for the Arts and Humanities, National Endowment for the Humanities, Division of Education Programs

How did you learn of this program? I first learned of the Wabash Center for Teaching and Learning in Theology and Religion at a reception at the AAR, where I had a very informative conversation with Raymond Williams.

Person responsible for narrative reports:

Jay M. Hammond

See contact information under project director above.

Person responsible for receiving the check and providing financial reports:

Douglas Leavell

Director, Sponsored Programs

Salus Center, Rm 509     Phone: (314) 977-2380

3545 Lafayette Avenue     FAX: (314) 977-1894

St. Louis, MO 63104     E-mail:

Signature of the official authorized to sign grant requests for the institution:

______

Carole Knight     Date

Associate Dean

Verhaegen Hall, Rm 305     Phone: (314) 977-3925

3634 Lindell Blvd.     Fax: (314) 977-3943

St. Louis, MO 63108-3414     Email:

Signature of the project director:

______

Jay M. Hammond     Date

PART II: Project Description

1. Title: Advancing Ongoing Program Review as Proactive Pedagogy by

Creating Learning Communities that Foster Assessment

2. Proposal Abstract: Although the Department of Theological Studies at Saint Louis University commits itself to the assessment of student learning, there remains no standardized model for ongoing program review that clearly links classroom learning with program outcomes.[1] Thus, annual reviews of the programs’ structures and curricula are not accompanied by effective assessment exercises that provide actionable feedback enabling teachers to improve their pedagogical skills and thereby increase the quality of student learning. To address this problem, the department seeks release time for all interested DTS faculty over three years. Four faculty per semester will form learning communities/cohorts and, through ten training seminars and three yearly all-faculty retreats, will increase their familiarity with, facility with, and participation in the department’s work to formalize a standardized assessment model for ongoing program review. In short, the cohorts will use the release time from teaching to better understand how to assess student learning.

3. Goal Statement: To form learning communities that will train our own faculty leaders so that there is faculty ownership of the assessment processes as we inaugurate the third phase of the department’s strategic plan:[2] formalize a standardized model for ongoing program review that is learning-centered, pedagogically driven, and practically actionable. To accomplish this goal our plan is sevenfold. The first three points address the objective of “Advancing Ongoing Program Review as Proactive Pedagogy,” the next three points concern the complementary objective of “Creating Learning Communities that Foster Assessment,” and the last point concerns the means by which the department will sustain an ongoing assessment of the assessment processes themselves. Accordingly, the project’s desired outcomes are to:

1) Embed clear assessment rubrics in the department’s program review that make the reflexivity between learning outcomes and teaching methods more visible and practical.

2) Design a protocol for ongoing revised program review that more explicitly provides constructive feedback for advancing a departmental culture of assessment through critical, creative, careful and collaborative reflection about the practices of teaching and learning.[3]

3) Integrate the new electronic delivery and collection of the department’s undergraduate and graduate assessment and evaluation tools (to be fully implemented in fall 2005) into an inclusive MySQL database platform that closes the assessment/evaluation loop by directly linking formative assessment (pedagogical methodology) and summative operations (program review and re-accreditation).[4] (An illustrative database sketch follows this list of outcomes.)

4) Foster proactive assessment in the department by conducting three weekend retreats for the faculty where they can reflect on the scholarship of teaching and share their insights and critiques as they devise and implement a standardized assessment model for ongoing program review. Three questions will focus the three retreats: Why are we doing what we are doing? What are we doing? How do we know how we are doing?

5) Offer all interested faculty one-course release time so they can increase their familiarity with, facility with, and participation in the assessment process.[5] The formation of such learning communities will facilitate collective understanding of how the program review begins, transpires, and culminates as a collaborative pedagogical enterprise that exists to increase the quality of student learning.

6) Compose a departmental handbook on assessment (student learning) and evaluation (faculty teaching) that outlines the model for cultivating ongoing departmental reflection on pedagogical excellence and its implementation. The handbook will be published on the department’s web site: theology.slu.edu.

7) Manifest the department’s commitment to ongoing assessment of student learning by creating a departmental assessment committee to help ensure that the DTS faculty own all assessment processes and that those processes are both practical and pedagogically relevant. In effect, the committee’s charge is to assess regularly the department’s assessment processes.
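By way of illustration only, the following sketch suggests one way the MySQL platform named in outcome 3 might relate course-embedded (formative) measures to program-level (summative) outcomes. The table and column names here are hypothetical, not the department’s settled design; the actual schema will be worked out with ITS during the fall 2005 planning.

    -- Hypothetical MySQL sketch: links each assessment item to a targeted learning outcome
    -- and records pre/post student responses by course section.
    CREATE TABLE learning_outcome (
        outcome_id  INT PRIMARY KEY,
        program     VARCHAR(40),   -- e.g., 'core', 'major', 'MA', 'PhD'
        description TEXT
    );

    CREATE TABLE assessment_item (
        item_id     INT PRIMARY KEY,
        outcome_id  INT,
        prompt      TEXT,
        FOREIGN KEY (outcome_id) REFERENCES learning_outcome (outcome_id)
    );

    CREATE TABLE student_response (
        response_id INT PRIMARY KEY,
        item_id     INT,
        section_id  INT,                  -- course section; ties results back to teaching
        phase       ENUM('pre', 'post'),  -- pre/post administration of the instrument
        score       TINYINT,              -- 1 = outcome met on this item, 0 = not met
        FOREIGN KEY (item_id) REFERENCES assessment_item (item_id)
    );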

4. Rationale: “Why” the Project Is Needed

For the past six years the Department of Theological Studies at Saint Louis University has conducted program reviews of its undergraduate (core, minor, major) and graduate (three MA and PhD) programs. A SWOT analysis of these review processes reveals a serious weakness: there is still no standardized model for ongoing program review that clearly provides (1) measurable learning-outcome matrices that use both (2) formative, course-embedded assessment measures and (3) summative rubrics for retrieving and interpreting those measures, in order to (4) assess student learning in a way that (5) provides feedback for improving teaching objectives and methods. Three facts contribute to this weakness: (1) the existing assessment protocol is an administrative shuffle that is largely perfunctory; (2) as an “external” administrative directive, the main focus of assessment is on summative accountability, with little or no emphasis on “internal” formative responsibility on the part of the faculty; and (3) both problems stem from the fact that time is the great thief of effective assessment. The resulting dilemma is threefold: (1) little faculty ownership, or even understanding, of the department’s assessment processes; (2) an accumulation of disparate assessment data without meaningful interpretation; and (3) no timely, systematic, or applicable feedback, i.e., no closure of the assessment data loop resulting in actionable information.

Case in point: as the department conducted a program review of the core curriculum, particular focus fell on assessing student learning in the introductory course, 100: Theological Foundations. In the fall of 2002 the department (1) adopted a general description of the course, (2) targeted 14 learning outcomes for the course (seven cognitive/conceptual outcomes, four values outcomes, three skill-set outcomes), (3) wrote a 41-question pre/post assessment tool, and (4) began to administer it in all 100-level courses. However, for various reasons, over the past five semesters no effective assessment has been done at the 100 level. The accumulation of data has not translated into effective assessment because data analysis has not occurred. While the web-based delivery of the assessment tool and its collection via a MySQL database, beginning in fall 2005, will make retrieving and collating data much more manageable, without summative rubrics for interpreting the outcome measures, actionable feedback resulting in formative/summative assessment of teaching and learning will remain elusive.[6]
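To make the missing summative step concrete, a query along the following lines, again only a hedged sketch that assumes the hypothetical tables outlined after the list of desired outcomes above, could report pre/post gains per learning outcome once the 100-level data reside in the MySQL database. The interpretive thresholds for those gains would still have to be set by the faculty cohorts.

    -- Hypothetical summative query: average pre vs. post scores per learning outcome,
    -- listed from smallest to largest gain so that weak outcomes surface first.
    SELECT lo.outcome_id,
           lo.description,
           AVG(CASE WHEN sr.phase = 'pre'  THEN sr.score END) AS pre_mean,
           AVG(CASE WHEN sr.phase = 'post' THEN sr.score END) AS post_mean,
           AVG(CASE WHEN sr.phase = 'post' THEN sr.score END)
             - AVG(CASE WHEN sr.phase = 'pre'  THEN sr.score END) AS gain
    FROM student_response sr
    JOIN assessment_item  ai ON ai.item_id    = sr.item_id
    JOIN learning_outcome lo ON lo.outcome_id = ai.outcome_id
    GROUP BY lo.outcome_id, lo.description
    ORDER BY gain;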

Nevertheless, the faculty rightfully say that the students are being educated, for the department has a very strong record of teaching excellence. But without plain assessment measures, how can the department collectively know, and show, that the students are learning? Grade distribution alone is not an adequate indicator. If assessment is about measuring and recording student learning in order to improve student learning, then over the past several years the department has not done a good job of practicing what it preaches in a transparent and organized manner. Thus the crux of the problem: the department presently cannot accurately assess student learning. This inability has reciprocal consequences. The department misses the opportunity to understand (1) how successful student learning informs effective teaching, and (2) how effective teaching ensures successful student learning. In other words, poor assessment obscures the pedagogical circle of teaching and learning, which hinders the department’s mission of educating students.[7]

In response, the learning communities project contributes to ongoing reflection on this critical issue: by offering faculty release time to converse about the inherent interplay among assessment, learning, and pedagogy, the department will become collectively more aware of what it does so that it can better serve its students. Allowing the faculty to work through a “curriculum”[8] as collaborative cohorts will enable individual faculty to build upon and hone the assessment practices they already employ and to share those practices with colleagues so that assessment becomes more uniform, understood, and transparent department-wide. The learning communities’ “curriculum” is underwritten by the research of Charles Glassick et al., Barbara Walvoord, and Ken Bain,[9] which collectively details how student learning and teacher pedagogy mutually enhance one another. Such critical reflection will support the department’s commitment to develop an assessment protocol that seamlessly integrates assessment measures into teaching and learning processes at every step: from the classroom, through program reviews, to administrative reports for accreditation.

5. Project Design and Outline: The Project’s “Who, What, When, Where, and How”

A. “Who” will run the project? Jay Hammond will serve as the project director and will run its day-to-day operations. He will work with a superb project team whose expertise will ensure that the five learning cohorts receive progressive, comprehensive, and multifaceted training in assessment as proactive pedagogy. From within DTS,[10] Wayne Hellmann, chair, and James Ginther, director of graduate studies, will provide logistical support and critical/creative project critique. Their services will include, inter alia: (1) coordination of course offerings in light of faculty release time, (2) scheduling project updates to the faculty during regular faculty meetings, (3) assistance in planning, implementing, and evaluating the project over three years, (4) oversight of the writing of the departmental handbook on assessment and evaluation, and (5) membership on the new departmental assessment committee. The project will also draw on personnel from the wider SLU community, especially Mary Stephen, director of SLU’s Reinert Center for Teaching Excellence (CTE),[11] Julie Weissman, assistant provost in the Office of Planning and Decision Resources, which conducts SLU’s assessment efforts,[12] and Andrew Wimmer, academic IT services manager, who also serves as an instructor in DTS. Their expertise will be indispensable in designing the “curriculum” for the ten training seminars that will structure the activity of the DTS learning communities. Further details are provided below under Resources.

B. “What” will happen? The project’s goal of formalizing a standardized model for ongoing program review, one that is practically applicable and pedagogically relevant across seven targeted outcomes, will be accomplished through the following design, activities, and resources.

Design [see Table 1, p. 8]: The project’s general design consists of two parts: (1) five learning cohorts of four faculty each and (2) three all-faculty weekend retreats. Each of the five learning cohorts will engage in ten seminars that provide training specifically geared toward realizing ongoing assessment as proactive pedagogy. Five of the seminars will be uniform, in that all the cohorts will cover the same basic “curriculum” (readings, projects, presentations), and five will be tailored to the specific focus of each cohort, that is, activities that address different dimensions of standardizing an assessment model for ongoing DTS program review. The customized seminars will direct the cohorts toward the three retreats, where they will address the three basic questions of those retreats: Why are we doing what we are doing? What are we doing? How do we know how we are doing? The cohorts’ “reports” will thereby frame and lead a departmental dialogue so that issues can be identified, discerned, and evaluated, allowing for collectively enacted decisions. The design is intentionally collaborative on two levels. First, the four-member cohorts collaborate to (1) learn assessment as proactive pedagogy generally, (2) consider one of the three questions for the faculty retreats specifically, (3) work on specific action items related to those questions, and (4) present their findings at the retreats. Second, the all-faculty retreats will then provide collaborative feedback so that when the next learning cohort takes up the baton, it can reflect upon and incorporate colleagues’ concerns and comments as it moves the project forward. The retreats will also provide evaluation of the project as it progresses.

Activities: While the specific “curriculum” for the learning cohorts will be devised in fall 2005 and adapted as the cohorts provide evaluative feedback, the ten seminars will cover such subjects as: (1) keeping assessment simple, (2) DTS assessment and The Five Dimensions of the Saint Louis University Experience,[13] (3) effective classroom assessment techniques, (4) embedding assessment outcomes and measures in the syllabus, (5) synthesizing micro/formative and macro/summative assessment outcomes, (6) technology and assessment, (7) designing effective rubrics, (8) understanding how SLU’s assessment apparatus works, from the classroom to accreditation visits, (9) using the new DTS electronic assessment and evaluation tools for pedagogical improvement, and (10) differentiating between grading and assessment. Whatever the final seminar topics, they will all reinforce the view that ongoing assessment is proactive pedagogy. In fall 2005 a questionnaire will survey the faculty to discern what they would like to learn, and their feedback will be incorporated as the seminar topics are formalized.

Resources: The project’s resources consist of two parts: materials and personnel. (1) The cohorts’ “curriculum” will contain readings from a designated bibliography that will also be determined in fall 2005 and amended as needed throughout the three-year project. (2) Mary Stephen and the CTE staff will help organize the learning communities, devise their readings and projects, and approach the topic of assessment as proactive pedagogy from the perspective of the classroom. Julie Weissman and the University Assessment Committee will share best practices and resources for assessment methods, identify assessment experts among SLU’s faculty who can work with the learning communities, and approach the topic of assessment as proactive pedagogy in terms of program evaluation from the administrative perspective. Andrew Wimmer and the ITS staff will develop a pilot program for DTS that combines several software applications into a fully integrated web platform with a backend database for administering and collecting assessment instruments, and will instruct the cohorts in the technology’s characteristics and capabilities, adding a technical component to the topic of assessment as proactive pedagogy. Given these resources, the cohorts’ “curriculum” will be interdisciplinary.

C. “Where” and “When” will the activity take place?

Since the learning communities will largely utilize existing resources available at SLU, their activities require no special external location. The all-faculty weekend retreats, however, will require special accommodations; the Pere Marquette Lodge and Conference Center, approximately 40 miles from SLU, provides them.[14] To ensure their success, the all-faculty retreats need gravitas, that is, appropriate time and space allotted to the endeavor. The three retreats will each occur in April, with the specific weekends to be determined at a later date.

D. “How” will the project accomplish all this?

Table 1 and the accompanying narrative explain the plan for achieving the project’s objectives.