
Core Curriculum Assessment Report

Spring 2010 Project:

Evaluating Writing Intensive Course Papers

Using the AAC&U VALUE Rubric

Learning Objective for Core Communication Competency:

Students will be able to write and speak to specific purposes, audiences, and contexts.

I. Writing Evaluation Project Overview and Methodology

Overview

Spring 2010 marked the fifth year of a writing evaluation project in which a group of faculty used a writing rubric to evaluate student writing samples collected from courses during the academic year. The use of existing student course work in assessment is called for in the Daemen Core Assessment Plan. The 2010 evaluation session took place on May 26, from 9:30 a.m. to 2:30 p.m. in the RIC building.

In Fall 2008, a new Daemen Writing Committee was formed, composed of the Writing Coordinator, English department faculty, faculty from other H&HS and A&S division departments, the Core Director, and the Director of Institutional Assessment. The Writing Committee has helped guide writing evaluation efforts for the last two years.

Based on recommendations from the Writing Committee, CIS, CAC, and the English faculty, 2010 marked the first year that the core writing evaluation project used the AAC&U VALUE Written Communication Rubric. In previous years, the Daemen College writing rubric was used. Faculty reviewing the two rubrics concluded that the VALUE rubric is more comprehensive, aligned with national standards for writing, and better meets Daemen College needs (see Appendix and “Evaluation Process”).

Collecting Student Writing Samples

The May 2010 Core Writing Evaluation project focused on papers from WI courses at all levels, including CMP 101. An ongoing challenge of Core Assessment efforts is the collection of existing student work samples to be used for evaluation. Rather than administering a standardized test or requiring students to prepare a writing sample for assessment purposes, the Core Assessment Plan uses work submitted by students in their regular classes. For this project, the Core Director put out a call to both CMP and WI instructors requesting electronic copies of ungraded writing, along with the assignment instructions. Student work samples were collected from CMP 101 courses and upper and lower level WI courses. Instructor and student names were removed from all papers. This is the second year that an electronic storage system was used. The process asks each WI instructor to require that students submit their papers to the electronic storage system (part of the Core e-portfolio), for the use of both the instructor and Core assessment efforts.

Writing Intensive Work Samples:

Table 1. 2009-2010 Daemen Writing Intensive Course Sections that Submitted Student Work Samples
Semester / Total WI Sections Offered / Total RP Sections Offered / Total CMP 101 Sections Offered / WI Sections that Submitted Student Work / RP Sections that Submitted Student Work / CMP 101 Sections that Submitted Student Work / Total Sections that Submitted Student Work (All Categories)
Fall 09 / 20 / 2 / 17 / 12/20 / 2/2 / 11/17 / 25/39
Spring 10 / 21 / 13 / 6 / 8/21 / 4/13 / 4/6 / 16/40
Total / 41 / 15 / 23 / 20/41 / 6/15 / 15/23 / 41/79
2009-10 Overall Instructor Participation in Submitting Work Samples: 15/31 (48%) Writing Intensive instructors submitted work samples.

Writing samples from several of the available classes did not fit the evaluation criteria of the Daemen Writing Rubric (for example, they did not incorporate documented research, or they were too brief to demonstrate the 4 areas of competency outlined in the rubric), and were therefore excluded from the 2009-10 Writing Evaluation project.

The total number of papers eligible for sampling from all offered WI courses for both semesters was 454, representing papers submitted by 15 out of 31 instructors, or 48%.

Sampling for Writing Intensive Papers

Forty-one (41) out of 79 (52%) approved WI core course sections provided papers. The 41 sections provided 454 papers available for sampling: 163 at the 100-200 level and 291 at the 300-400 level. Using stratified random sampling, 15 papers were selected from the 163 available from 100-200 level approved WI courses (including CMP 101), and 27 papers were selected from the 291 available from 300-400 level approved WI courses (including RP courses). The papers were numbered from 1-43 and randomly divided into 14 sets of approximately 3 papers each. Each set of three papers was rated by a pair of faculty evaluators. Faculty did not know which papers were selected from 100-200 or 300-400 level WI courses. Papers numbered 16-43 were from 300-400 level WI courses. Paper number 40 was used for norming, and one paper was not rated, resulting in a total of 42 rated papers.
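The stratified selection described above can be sketched in Python. This is only an illustrative sketch of the general technique (the function name, paper identifiers, and seed are hypothetical, not part of the actual procedure used):

```python
import random

def stratified_sample(papers_by_level, quotas, seed=None):
    """Draw a fixed number of papers from each course-level stratum.

    papers_by_level: dict mapping a stratum label (e.g. "100-200")
                     to the list of available paper IDs in that stratum.
    quotas:          dict mapping the same labels to the sample size wanted.
    """
    rng = random.Random(seed)
    # Sample without replacement within each stratum independently.
    return {level: rng.sample(papers, quotas[level])
            for level, papers in papers_by_level.items()}

# Example mirroring the 2010 project: 15 of 163 lower-level papers
# and 27 of 291 upper-level papers (IDs are made up).
available = {
    "100-200": [f"P{i}" for i in range(1, 164)],
    "300-400": [f"P{i}" for i in range(164, 455)],
}
selected = stratified_sample(available, {"100-200": 15, "300-400": 27}, seed=1)
```

Stratifying before sampling guarantees that both course levels are represented in fixed proportions, which a single simple random draw over all 454 papers would not.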

Samples were selected from the following courses:

AC430 (1) / CMP311 (3) / LIT 309 (1) / PSY302 (3)
BA211 (2) / CMP315 (4) / LIT112 (3) / PSY311 (3)
BA220 (1) / EDU319 (1) / LIT320 (1) / PSY321 (2)
BA443 (1) / FOR101 (3) / LIT329 (1) / SOC303 (5)
CMP101 (6) / HCS300 (1) / NSC331 (1)

Evaluation Participants

The following 30 faculty from both the H&HS and A&S divisions participated as evaluators on May 25, 2010: Bob Morace, Brenda Young, Bridget Niland, Bruce Shields, Cheryl Nosek, Chris Brandjes, Claudiu Mihai, Colleen Kashino, Ellen Banks, George Siefert, Jeff Arnold, Joel Patterson, Karl Terryberry, Kevin Telford, Kim Charmatz, Kristin Fries, Laurie Walsh, Linda Kuechler, Melissa Fiori, Michele Flint, Norollah Talebi, Penny Messinger, Robert Gunther, Ron Schenk, Shawn Kelley, Shirley Peterson, Susan Krickovich, Tae Hyung Kim, Zena Hyman, and Mimi Steadman. The group represented a variety of departments: Business Administration, English, History and Government, Foreign Language, Physician Assistant, Physical Therapy, Accounting, Education, Mathematics, Natural Sciences, Nursing, Psychology, Social Work, and Philosophy and Religion. As indicated in the table below, faculty participation has tripled since the initial writing evaluation project in 2007.

Evaluator Preparation

To prepare for the evaluation process, each evaluator received a copy of the rubric and instructions, along with the AAC&U publication on VALUE rubrics. After a brief overview of the Core Writing Evaluation project by Intisar Hibschweiler and Mimi Steadman, Erica Frisicaro-Pawlowski provided an hour-long norming session on the VALUE rubric, linking it to Daemen College's earlier rubrics.

As noted previously, this year, assessment groups were asked to rate student papers using the AAC&U VALUE rubric in Written Communication. The VALUE rubric differs from the Daemen College rubric in three significant ways:

■The Daemen College Writing rubric ranks students’ writing products, seeking out evidence of particular features in the final texts. The AAC&U Written Communication rubrics measure the effectiveness of students’ writing processes, seeking out evidence of particular “moves” the writer makes in fulfilling a purpose.

■The Daemen College Writing rubric quantifies the quality of 5 skills: Unity; Coherence; Voice, Diction, and Tone; Grammar and Mechanics; and Research & Information Literacy. The AAC&U Written Communication rubric quantifies the quality of 5 processes – namely, considerations of: Context and Purpose for Writing; Content Development; Genre and Disciplinary Conventions; Sources and Evidence; and Control of Syntax and Mechanics.

■The Daemen College Writing rubric aligns skill rankings and overall assessment with external measures of student progress (from high-school level writing to professional writing). The AAC&U Written Communication rubric does not equate evidence of writing skill with such external measures. Instead, it evaluates student writing in terms of progress toward mastery (from benchmark to capstone). Thus, while the two rubrics are somewhat aligned at their highest and lowest levels (1=benchmark or college entry-level skills and 4=capstone or college exit-level skills), there is some difference between levels 2 & 3 (between AAC&U’s “Milestone” categories and the Daemen Writing Rubric’s “First-Year College Student” and “Upper-Division College Student” categories).

A brief introduction to these distinctions was provided prior to the session, and sample papers were normed before small groups began assessment. Each of the 42 student papers was then scored by a pair of raters, who were encouraged to discuss their scoring decisions as part of the ongoing orientation and calibration process to promote consistent application of the four-level VALUE Written Communication rubric. The raters were not told whether their writing samples came from CMP 101, 100-200 level, 300-400 level, or RP courses.

Evaluation Process

To determine the score from the ratings of two evaluators, faculty were asked to compare their scores. If the scores were the same, they entered that score on the rating sheet. If their scores differed, they were asked to discuss the rationale for their ratings and see whether the discussion changed either rater's opinion. If the two raters' scores still differed, a third rater was brought in to re-read the paper. Each group had a leader who could help the evaluators in that group locate a third rater to assess such papers.

It is worth noting that this was the first time the evaluation process used online grading: papers were not printed but were stored online for faculty to access, although printing remained available to faculty who wished to grade using hard copies. Electronic review of papers allowed groups to spread out throughout the RIC building using available laptop or desktop computers, and only one faculty member opted to print student papers, thereby saving college resources. However, spreading out across the library in this way may also have posed challenges to intra-group communication.

II. Results

Scoring Consistency

As indicated in Table 2 below, the scoring consistency within a pair of raters increased greatly from the pilot year of the writing evaluation in 2005-06 to subsequent years. In the first pilot year (2005-06), evaluators rated papers independently on their own time and in their own offices, and no introduction to the rubric or norming session was provided. Starting in 2006-07, evaluators worked together on the same day in the same room, following an orientation session. Inter-rater consistency was highest in 2007-08, when the session organizers encouraged pairs of raters to discuss their ratings and attempt to reach consensus on the score given. In 2008-09, the orientation session did not emphasize that pairs of raters should attempt to reach consensus, so raters whose scores were only one level apart simply marked their scores on the scoring sheet (see appendix) and moved on to the next paper. This year, faculty were asked to discuss the writing components in each paper and agree on the overall score, or a third rater was used. Some evaluators, instead of providing a single comprehensive score (1-4), rated each of the five categories separately and reported a cumulative total (0-20). In cases where a cumulative score was provided, the comprehensive score was determined by dividing the cumulative score by 5 and cross-checking against the individual category ratings for that paper.
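The cumulative-to-comprehensive conversion described above amounts to averaging the five category ratings. A minimal Python sketch (the function name is hypothetical; the report's additional cross-check against individual category ratings is not modeled here):

```python
def comprehensive_score(category_scores):
    """Collapse five per-category ratings (each 0-4, summing to 0-20)
    into one overall 0-4 score by dividing the sum by the number of
    rubric categories."""
    if len(category_scores) != 5:
        raise ValueError("expected one score per VALUE rubric category")
    return sum(category_scores) / 5

# e.g. a paper rated 2, 2, 3, 2, 1 across the five categories:
comprehensive_score([2, 2, 3, 2, 1])  # -> 2.0
```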

Rater pairs agreed on the overall score for ninety-five percent of the papers, and only 2 papers needed a third rater. After tabulating the data, some inconsistencies were noticed between the overall score and the individual category scores. For example, one paper was rated at 2 in most individual categories but received a score of 1 overall. Six such papers were assigned to veteran assessors for a third or, at times, fourth rating before the overall score was assigned. The data were then updated and are shown in Table 3.

Table 2. Consistency in Ratings of Same Student Papers by Two Evaluators
Both Faculty Rated Paper the Same / Scores Differed by One Level / Scores Differed by Two Levels / Scores Differed by Three Levels
2009-2010 WI Papers
Number / 40 / 2 / 0 / 0
Percent / 95% / 5% / 0 / 0
2008-09 (25 CMP 101 Papers Rated with New 3-Level First-Year Writing Rubric)
Number / 19 / 6 / 0 / N/A
Percent / 76% / 24% / 0 / N/A
2008-09 (35 CMP Papers)
Number / 29 / 6 / 0 / 0
Percent / 83% / 17% / 0% / 0
2008-2009 (36 WI Papers)
Number / 27 / 9 / 0 / 0
Percent / 75% / 25% / 0% / 0
2007-08 (30 WI Papers)
Number / 23 / 7 / 0 / 0
Percent / 76% / 24% / 0% / 0
2007-08 (30 CMP Papers)
Number / 30 / 0 / 0 / 0
Percent / 100% / 0% / 0% / 0
2006-07 (40 CMP Papers)
Number / 32 / 7 / 1 / 0
Percent / 80% / 17.5% / 2.5% / 0
2005-06 (26 CMP Papers)
Number / 9 / 13 / 4 / 0
Percent / 34.6% / 50% / 15.4% / 0
Table 3. Consistency in Ratings of Same Student Papers by Two Evaluators, per Category
Context & Purpose / Content Development / Disciplinary Conventions / Sources & Evidence / Syntax & Mechanics
Differed by 0 / 35 (83%) / 38 (90%) / 38 (90%) / 36 (86%) / 36 (86%)
Differed by 1 / 5 / 3 / 3 / 4 / 5
Differed by 2 / 2 / 1 / 1 / 2 / 1

Results: Evaluation of Writing Intensive Papers with the VALUE Written Communication Rubric

The 2010 evaluation included a sample of forty-two (42) papers from all levels of Writing Intensive courses. WI papers were scored using the four-level AAC&U VALUE Written Communication Rubric. Following the rating session, most evaluators commented that they found the VALUE rubric appropriate for rating WI papers. One concern expressed by the evaluators was the lack of assignment instructions provided with the writing samples: evaluators found it easier to evaluate papers that were accompanied by a description of the assignment. Assignment descriptions were requested, but not submitted by all instructors who provided student work samples. Scores for the two years that WI papers were evaluated are displayed in Table 4 and Figure 1.

Table 4. Student Scores on Writing Intensive Papers
Papers Evaluated Using AAC&U VALUE Rubric
Score Level: / 1 (Benchmark) / 2 (Milestone) / 3 (Milestone) / 4 (Capstone) / Total
2009-10 100-200 Level WI Courses
Number / 13 / 2 / 0 / 0 / 15
Percent / 87% / 13% / 0% / 0% / 100%
2009-10 300-400 Level WI Courses
Number / 12 / 10 / 4 / 1 / 27
Percent / 44% / 37% / 15% / 4% / 100%
Papers Evaluated Using Daemen College Writing Rubric
Score Level: / Level 1 (1 or 1.5) / Level 2 (2 or 2.5) / Level 3 (3 or 3.5) / Level 4 (4) / Total
Description of Rubric Level / Meets expectations of an incoming student / Meets expectations of a first-year student who has completed CMP 101 / Meets expectations of an upper-classman's writing / Writing similar to that of a college graduate or professional
2008-09 300-400 Level WI Courses
Number / 14 / 14 / 8 / 0 / 36
Percent / 39% / 39% / 22% / 0% / 100%
2007-08 100-200 Level WI Courses
Number / 11 / 4 / 0 / 0 / 15
Percent / 74% / 26% / 0% / 0% / 100%
2007-08 300-400 Level WI Courses
Number / 5 / 9 / 1 / 0 / 15
Percent / 33% / 60% / 7% / 0% / 100%

Figure 1. Percentage of 300-400 Level WI Course Papers Rated at Levels 1-4 on the Daemen Writing Rubric (2007-08 and 2008-09) or the AAC&U VALUE Rubric for Written Communication (2009-10)

[Figure: percentage of student scores at Levels 1-4.]

The results of previous evaluations of 300-400 level Writing Intensive course papers suggest that students' writing improves over time during college. Results this year were consistent with prior results: 52% of writing samples from 300-400 level Writing Intensive courses were rated at Level 2 or 3, compared with only 13% of writing samples from 100-200 level Writing Intensive courses.

The 44% (12) of 300-400 level WI papers rated at Level 1 is an area for concern. However, because this was the first time the VALUE rubric was used, data from additional years are needed to draw meaningful conclusions. In addition, this year marked the first time that evaluators did not know whether they were reading first-year writing or 300-400 level writing. In previous years, groups were divided into either CMP 101 or WI clusters, which may have influenced readers' perceptions of students' grade or skill level.

III. Recommendations

Writing Committee, Evaluators, English Department, and Others:

Thank you for reviewing this draft report!

Please suggest any recommendations or next steps to follow up on the results of the writing evaluation.

Also, please feel free to offer any observations, interpretations, or rationale for the results.

Don’t hesitate to point out typos, errors, or areas where you may disagree with what is written.

Your edits and recommendations will be included in the next draft of this report, which will be circulated campus-wide.

Thank you.

Prepared by Intisar Hibschweiler, Core Director; Mimi Harris Steadman, Director of Institutional Assessment; and Erica Frisicaro-Pawlowski, Writing Coordinator


Daemen College Core Assessment Project: Communication Competency

Scoring Record and Instructions for Writing Evaluation

May 19, 2009

STUDENT PAPER #______

Overall (4-Level) Score: Reader #1______Reader #2 ______

If the paper receives a score of 1, please use the green rubric to assess the paper, and then fill out the scoring form below:

Please enter sub-scale scores below:
Dimension of Student Writing / Rater 1
Score / Rater 2
Score
Clarity of Purpose
Organization
Academic Conventions
Grammar and Mechanics

Scoring instructions for overall score (Table below):

Compare the two scores:

  • If the two scores agree, enter that score in the “final score” box.
  • If the two scores disagree:

1. Discuss reasons for ratings, and see if discussion results in any changes of opinion by one of the raters.

2. If the two scores are only one level apart (i.e., 1 and 2, or 2 and 3), average the two scores and enter the average score in the “final score” box.

3. If the two scores are more than one level apart, a third rater is needed. Please notify Intisar or Mimi that a third rater is needed.

  • If a third rater is brought in:

1. Compare the three scores.

2. If two scores agree, enter that score in the “final score” box.

3. If the three scores are all different, use the middle score, or call Erica for further discussion.
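Under the assumption that scores are numeric levels 1-4, the reconciliation rules above can be sketched in Python (the function name is hypothetical, and the in-person discussion step is not modeled):

```python
def final_score(r1, r2, r3=None):
    """Apply the scoring-record rules: matching scores stand; scores one
    level apart are averaged; wider gaps require a third rater, whose
    agreement with either rater decides, else the middle score is used."""
    if r1 == r2:
        return r1
    if abs(r1 - r2) == 1:
        return (r1 + r2) / 2          # e.g. 2 and 3 -> 2.5
    if r3 is None:
        raise ValueError("scores more than one level apart: third rater needed")
    if r3 in (r1, r2):
        return r3                     # two of the three scores agree
    return sorted([r1, r2, r3])[1]    # all three differ: take the middle
```

For example, `final_score(3, 3)` returns 3, `final_score(2, 3)` returns 2.5, and `final_score(1, 3, 3)` returns 3.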

Student Paper Number / Rater 1 Overall Score / Rater 2 Overall Score / Rater 3 Overall Score (only if needed) / Final Score
Appendix A: Daemen College Writing Assessment Rubric
For each dimension of student writing (unity, coherence, etc.), please circle below the descriptor at the level that best represents this student’s work.
Level 4 – The paper indicates that the author exhibits all of the following principles of writing: (Meets expectations equal to a professional or college graduate) / Level 3 – The paper indicates that the author does all of the following: (Meets expectations of an upper-classman’s writing level)
□ / Unity: Thesis or purpose is clearly stated and supported in body of paper by a variety of relevant facts, examples, and illustrations from experience, references to related readings, examples, detail. / □ / Unity: Thesis or purpose is presented and well supported in body of paper by facts, examples, and illustrations from experience, references to related readings, examples, detail.
□ / Coherence: Major points are organized and divided into paragraphs and signaled by use of transitions and sentence variety. Introduction and conclusion effectively related to the whole. / □ / Coherence: Most major points are organized and divided into paragraphs and signaled by use of logical transitions and consistent sentence variety. Introduction and conclusion effectively related to the whole.
□ / Voice, Diction, and Tone are consistent and appropriate to the college-level audience. / □ / Voice, Diction, and Tone are consistent and appropriate to a college-level audience although somewhat generic or predictable in places.
□ / Few, if any, minor errors in sentence construction, usage, grammar, punctuation, or mechanics. / □ / Minor or major errors in sentence construction, grammar, punctuation, usage, or mechanics do not detract from the essay’s mission or create obstacles for the reader.
□ / Research and Info Literacy: Source material is incorporated logically and insightfully, and sources are documented fully and accurately. / □ / Research and Info Literacy: Source material is incorporated logically, but in some cases, may create disconnectedness. Sources are documented accurately.
Level 2 – The paper indicates that the author does all of the following: (Meets expectations of a first-year college student) / Level 1 – The paper indicates that the author does many or all of the following: (Does not meet expectations of a first-year college student)
□ / Unity: Thesis or purpose is clearly or implicitly stated and topic is partially limited. Thesis or purpose is minimally supported in body of paper by facts, examples, and details. / □ / Thesis or purpose is unclear and/or inadequately supported in body of paper by few facts, examples, details. More than one paragraph with inadequate support.
□ / Coherence: Essay is generally organized with major points divided into paragraphs and signaled by use of logic and transitions. Sentence variety is limited and monotonous. Introduction and conclusion are somewhat effective. / □ / Coherence: Only some major points are organized and/or set off by paragraph. Transitions are abrupt, illogical, and weak. Sentence variation is limited and monotonous. Introduction and conclusion may be lacking, misdirected, or ineffective.
□ / Voice, Diction, and Tone are adequate although often generic or predictable, informal, and conversational. / □ / Voice and Tone noticeably generic or inappropriate (e.g. first person narrative may predominate in an analysis assignment). Diction dominated by conversational language/slang or inaccuracies.
□ / Errors in sentence structure, usage, grammar, punctuation, and mechanics do not interfere with writer’s ability to communicate the purpose but present obstacles. / □ / Consistent major and minor errors in sentence construction, grammar, punctuation, usage, or mechanics that disrupt the writer’s ability to communicate the purpose.
□ / Research and Info Literacy: Source material is incorporated adequately and usually is documented accurately. / □ / Research and Info Literacy: Source material incorporated but sometimes inappropriately or unclearly, creating coherence breaks. Documentation is accurate occasionally.
Developed by Dr. Karl Terryberry / Revised Draft: Spring 2008

Appendix A2: Background Information on the Daemen College Writing Assessment Rubric