OUTCOMES ASSESSMENT SUMMARY REPORT FOR 2010-2011

College of Liberal Arts and Sciences, University of Colorado Denver

Submitted by Jeff Franklin, Associate Dean for Undergraduate Curriculum and Student Affairs, June 2011

The College of Liberal Arts and Sciences (CLAS) has now established an active culture of assessment in academic programs and other areas. This is the result of major efforts in recent years by chairs/directors, their faculty, and College administrators. This is not to say that everyone who should be engaged is engaged or that there are no improvements to be made, but it is an occasion for congratulations, especially for the chairs/directors and participating faculty. The College has come a long way, not merely in compliance but in coming to use assessment for the right reasons: improving our programs, curricula, pedagogy, and student learning.

To enhance the culture of assessment in the College, CLAS created an assessment web page -- /academics/colleges/CLAS/faculty-staff/faculty-resources/teaching/Pages/OutcomesAssessment.aspx -- containing information on college-wide assessment activities as well as information on best practices in assessment. Assessment plans and resulting reports for every degree program (plus some minors), as well as assessments of the CLAS general-education (Core + CLAS graduation requirement) courses, are on file with the University Director of Assessment, Dr. Kenneth Wolf. Additionally, assessment activities occurred in the Advising Office and the Writing Center. Based on the assessment activities in 2010-2011, modifications to courses and course sequencing are occurring, as are modifications to services in the Writing Center and the Advising Office.

The purpose of this report is twofold: 1) to provide an overview of assessment activities within CLAS, along with examples of how different units have used assessment to revise their practices; 2) to provide feedback and suggestions to chairs/directors and other faculty for making our assessments increasingly useful to us.

College-Level Assessment Activities in 2010-2011

Cross-College assessment activities occurred in three areas in 2010-2011: CLAS general-education assessment, the Writing Center, and the Advising Office. Each of these activities is summarized below.

General-Education Assessment: In fall 2010, 18 of the 26 CLAS academic units or programs (including majors, minors, graduate-only programs, and the Writing Center) participated in gen-ed-course assessments. Sixty CLAS courses and 141 sections of those courses were assessed (out of the 400 sections offered by CLAS at all levels in fall 2010). This level of participation is in itself a major accomplishment for the College. The learning of an estimated 3,807 students was assessed. To summarize at the broadest level, 70-90% of our students, on average, demonstrated that they are learning what we want them to learn, performing at the "exceeding expectations" or "meeting expectations" levels.

Details of this assessment appear in the "CLAS General-Education Assessment, Summary Report for 2010." That report includes examples of assessment practices and of how assessment results have been used to revise pedagogy and curriculum. It also includes recommendations for future gen-ed assessment within CLAS. These results were sent to the faculty of CLAS during the spring semester through the CLAS listserv. Additionally, they were sent to the Director of Assessment, the CU Denver Core Curriculum Oversight Committee (CCOC), and the CU Denver downtown campus Assessment Committee for comment. The report is (or soon will be) available on the CLAS assessment website.

The Writing Center: During the spring 2009 semester, the Writing Center defined a series of learning goals, which it pilot-tested during summer 2009. Based on the results of the pilot test, the goals were revised and the learning outcomes were clarified. The Writing Center uses a variety of data-collection methods, including student surveys, instructor surveys, individual-session data management, and analysis of student projects for which multiple drafts of a paper were available to determine improvement. An assessment committee collected data for a full assessment in the fall 2009 semester for goals 1-3 (students can compose a clear, concise thesis statement; students can organize information into distinct paragraphs that support an argument; and students can develop ideas to support an argument). Specifically, for each of these goals, each member of the committee read papers and scored them without knowing whether a paper was an early draft, written before consultation with the Writing Center, or a later draft, written after consultation. A comparison of the scores for the early versus the late drafts demonstrates an improvement in student papers.

Advising Office: The CLAS Advising Office provides a broad range of valuable services to students and to the College, contributing to everything from orienting new students and checking transfer credit to semester-by-semester advising, serving as staff liaisons to College committees, working with the major advisors, degree auditing, and graduation checking, to name a few. Since their primary responsibility of advising students throughout their academic progress begins with the initial contact, their current "Assessment Plan" focuses on the effectiveness of their efforts to orient students about topics such as what a liberal arts education is, how to use the advising system, and what the various degree requirements are. They therefore designed a survey to be administered to students after attending one of the "Advising 1001" orientation sessions. The survey was piloted in summer 2009, refined, and administered in full in the summer and fall 2010 orientations. The data, details of which are available from Assistant Dean Carol Morken, show that the orientations are already highly effective. Several years of data will allow longitudinal analysis and consideration of where improvements are taking place and could take place.

Program/Department Assessment Activities in 2010-2011

As Table 3 below shows, nearly all degree-granting programs within CLAS, as well as the Composition Program and the Religious Studies minor, generated program assessment reports for 2010-2011. This level of participation is in itself an indicator of engagement in assessment.

Table 3: Status of Program Outcomes Assessment Reports 2010-2011

Department and degree / Current Chair/Director / 2010-2011 assessment reports submitted
Anthropology (BA, MA) / Steve Koester / (none)
Biology (BS, MS) / Diana Tomback / (none)
Biology, Health Careers / Charlie Ferguson / (none)
Chemistry (BS, MS) / Mark Anderson / Department reports, BS, MS
Communication (BA, MA) / Stephen Hartnett / Department reports, BA, MA
Economics (BA, MA) / Buhong Zheng / Department reports, BA, MA
English (BA English, BA Writing major, MA) / Nancy Ciccone / Department reports, BAs, MAs (Lit/Film, Writing, CW, APL)
Composition Program / Amy Vidali / Program report
Ethnic Studies / Donna Langston / (none)
Geography and Environmental Sciences (BA) / Brian Page / Department report, BA
GES: Environ. Sci. (MS) / John Wyckoff / Program report, MS
Health and Behavioral Sciences (PhD) / Debbi Main / Department report, PhD
History (BA, MA) / Marjorie Levine-Clark / Department report, BA, MA
Individually Structured Major (BA) / n.a. / n.a.
Humanities & Social Sciences (MA & MS) / Omar Swartz & Margaret Woodhull / Program reports, MA, MS
Integrated Sciences (MIS) / Mary Coussons-Read / Program report, MIS
International Studies (BA) / Greg Whiteside / Program report, BA
Mathematical and Statistical Sciences (BS, MS, PhD) / Mike Jacobson / Program reports: BS, MS, PhD, gen-ed courses
Modern Languages (BA French, BA Spanish, MA Spanish) / Devin Jenkins / French report, German report, Spanish report, BA, MA
Philosophy (BA) / Rob Metcalf / Department report, BA
Religious Studies (minor) / Sharon Coggan / Program report, minor
Physics (BS) / Weldon Lodwick / Department report, BS
Political Science (BA, MA) / Jana Everett / Department reports, BA, MA
Psychology (BA, BS, MA, PhD) / Peter Kaplan / Department reports, BA, BS
Social Justice (minor) / Chad Kautzer / (none)
Sociology (BA, MA) / John Freed / Department reports, BA, MA
Sustainability (minor) / John Brett / (none)
Women & Gender Studies (minor) / Gillian Silverman / (none)

Rather than summarize each program's report here, I have chosen to take three alternative actions: 1) to excerpt highlights from the CLAS program assessment reports and provide them in the appendices below; 2) to invite those interested in more detail to request exemplary program reports, which either Kenny Wolf or I would be glad to send; and 3) to take the current opportunity to offer chairs/directors and other faculty some recommendations for conceiving of and undertaking next year's program assessment.

Recommendations for CLAS Academic Program Assessment in 2011-2012

The following recommendations come from two sources. The first is the set of examples provided by this year's assessment reports. Different academic units and programs are at different stages in their knowledge about and execution of program assessment. Some departments are fully engaged and making very constructive use of assessment for pedagogical and curricular improvement. Others are still struggling with the process. The CLAS Dean's Office would like all departments/programs to come up to the level exemplified by the best of this year's program assessment reports. The second source is the body of scholarship on assessment; many scholars in this field say the same things, some of which I will now repeat or elaborate on.

1. Building engagement with assessment: Are your faculty aware of the program's learning goals/outcomes? Do they incorporate relevant learning goals in their syllabi? Do you distribute the assessment reports to faculty and then hold discussions of them? Do your faculty participate in deciding how to use assessment outcomes for revising teaching and curriculum? Has your faculty recognized that the assessment effort is paying off in terms of improving delivery of your learning goals? Is your faculty's involvement reflected in your program assessment report? See "Appendix A" below for selected examples from this year's program assessments.

2. Integrating your curriculum: This means that specific learning goals are tied to specific courses and that the curriculum is considered holistically, as a sequence that builds student learning through 2000-, 3000-, and 4000-level courses, for example. Here is a generic example:

Table 4: Alignment Matrix (Curriculum Map)

Course / Outcome 1 / Outcome 2 / Outcome 3 / Outcome 4 / Outcome 5
100 / I, D / I
101 / I / D
102 / D / D / D
103 / D
200 / D / D
229 / D
230 / D, M / M
280
290 / M / D, M / M

"I = Introduced, D = Developed & Practiced with Feedback, M = Demonstrated at the Mastery Level Appropriate for Graduation. Some variations [to consider adding]: R = Review; review of basics added to junior-level courses to ensure that all students have the background for upper-division work, or review of basics for beginning graduate students. C = Consolidation; students given opportunities to consolidate their learning of outcomes that have been previously mastered in the curriculum." Also, one could add "A"s to this matrix to indicate where key assessments are being administered. (Source: Mary Allen, author of Assessing Academic Programs in Higher Education)

Is your curriculum designed to deliver your learning goals? Does an alignment matrix appear in your program assessment report? See "Appendix B" below for selected examples from this year's program assessment reports.

3. Including learning goals in syllabi: You and your students know that your learning goals are integrated with your curriculum when those goals appear in course syllabi at several course levels. Different courses will emphasize different learning goals, depending on the level of the course and its role in delivering particular learning at a particular time in your students' progression through your curriculum. Is anyone checking the syllabi of your program's courses to see that they are delivering the learning outcomes that your faculty has agreed they should? See "Appendix C" below for selected examples from this year's program assessments.

4. Using rubrics: I used to hate the very word "rubric" until I reluctantly designed one for an essay assignment and realized that I now understood the assignment and my own grading criteria for the first time, and that I owed it to my students to give them that level of clarity about what I expect them to do. Has your program written a rubric for the senior project? For the MS or MA exam? For the MA thesis? See "Appendix D" below for selected examples from this year's program assessment reports.

5. Closing the loop: Is your faculty discussing the assessment outcomes? Beyond conversation, which in this case is inherently good, are you using the assessment results to make revisions to how learning goals are taught and how the curriculum is structured? Does your program assessment report identify specific changes you have made or will make to teaching or curriculum? Are you tracking those changes to determine whether they are having the desired effect on student learning? See "Appendix E" below for selected examples from this year's program assessment reports.

6. Assessing the assessment: Is your program assessment providing you with the most useful information you can imagine it providing? Has your program ever assessed the assessment? Are you making revisions to the assessment process itself in order to improve its usefulness to you? Is it time to revisit and rewrite your learning goals/outcomes? See "Appendix F" below for selected examples from this year's program assessment reports.

7. Practicing the full assessment cycle: The primary purpose of program assessment is not to produce a post-hoc report in May showing that students did well enough last year. The purpose is to improve teaching, curriculum, and student learning now and next year. This is best facilitated by practicing a full assessment cycle: i) revisiting last year's assessment and the changes you told yourself you were going to try; ii) cementing consensus about learning goals and agreement on how to put them into practice in teaching and curriculum; iii) incorporating learning goals into syllabi, teaching, and curriculum; iv) agreeing upon assessment methods and planning the assessment. These are among the first steps that lead, next spring or summer, to using the 2011-2012 program assessment to make revisions to your courses and curriculum. The cycle is continuous, and next year's assessment begins with revised learning goals, assessment plans, and curricular adjustments now, this August.

"Appendix G" contains a table that may be useful next year for checking off the above activities in which your department/program has engaged.

Assessment Activities Planned for CLAS in the 2011-2012 Academic Year

Assessment data will be recorded, used for program improvements, and reported on in all CLAS departments in the 2011-2012 academic year. Those reports will be submitted to the Director of Assessment in May 2012. Data will be collected for the CLAS graduation requirements during the fall 2011 semester and will be the subject of faculty conversations in spring 2012. Additionally, assessment activities will continue to expand in both the Writing Center and the Advising Office.

Appendix A: Building Engagement with Assessment

"The department faculty discussed the outcomes assessment process at the October 2010 faculty meeting. We decided that this year we would focus on the specific objective of "an ability to integrate and apply multiple social and behavioral science theoretical perspectives to particular health and health care problems" among our first year students. The faculty formed a committee of three people: David Tracer, Jean Scandlyn and Debbi Main (department Chair). To assess the outcomes for each student, we rated students' performance on three criteria taken from the core competency; we evaluated their final papers from each of the two required courses they take in their first semester in the doctoral program: HBSC 7031 Human Ecology and Environmental Adaptation and HBSC 7011 Theoretical Perspectives in Health and Behavioral Sciences 1. Together the committee developed a rubric based on the three criteria and then graded student final papers; the results and interpretation of our findings are presented below. -- Heath and Behavioral Sciences, Debbi Main

"Dr. Sonja Foss prepared a detailed draft document [in response to last year's assessment] for faculty discussion indicating the skills students should be able to demonstrate cumulatively in the writing produced in the Department’s 1000, 2000, 3000 and 4000 level courses. This document is now in the hands of Dr. Stratman and Adjunct Instructor Mary Domenico who will attempt to draft some required and discretionary writing assignment design and evaluation methods for use in faculty classes at each of these levels. This document will attempt to incorporate and be guided by what we have been learning from the use of the Department’s base scoring rubric in our annual assessments." -- Communication, Jim Stratman

"Over the last few years, we have been having discussions about the number of weak comprehensive exams in the department. We decided to institute a new comprehensive exam process. Rather than having students complete three essays in one day, we now give them one week, which means they receive the exam questions on a Monday morning and return the exam on a Friday afternoon. We also now limit students to 1500-1800 words for each essay question. The results have been much better exams across the board. Students have the time to think more productively, edit their writing, and concentrate on creating arguments and analysis." -- History, Marjorie Levine-Clark