Assessment Report

for the

General Education Quadrennial Review 2016-2019

Please submit the completed electronic copy to

Instructions:
  • For questions 6 – 9, cut and paste the text from the approved QRII Assessment Plan in the appropriate sections.
  • Contact Academic Affairs and Research if you need a copy of the approved plan.
  • Please clearly indicate if changes have been made to the methods outlined in the approved plan.

1. Course Prefix and Number: ENG1003

2. Course Title: COMPOSITION I

3. Contact Persons (Department Chair Name, Department, Email Address, Phone Number)

  • Janelle Collins, Ph.D., Chair of English, Philosophy, and World Languages, , (870) 972-2429
  • Kristi Costello, Ph.D., Director of the Writing Program and Writing Center, , (870) 972-2429
  • Airek Beauchamp, Assistant Director of the Writing Program and Writing Center, , (870) 972-2222

Framework

View the General Education Goals and associated SLOs and courses here: http://www.astate.edu/a/shared-governance/shared-governance-committees/general-education-committee-/documents/Gen-ed-goals-with-outcomes-and-associated-courses-final-Spring-2014.pdf

4. Please mark with an X the Student Learning Outcome (SLO) you chose to evaluate.

Quadrennial Review Assessment Timetable

Review Year / General Education Goal / Student Learning Outcome / Due for submission
2016 / Communicating effectively / Students will be able to: Construct and deliver a well-organized, logical, and informative oral or written presentation, accurately documented, that demonstrates proficiency in standard American English ☒ / Monday, October 3, 2016
2016 / Using mathematics / Students will be able to: (1) Interpret and analyze quantitative/mathematical information (such as formulas, graphs, and tables) ☐ (2) Apply mathematical methods to solve problems ☐ / Monday, October 3, 2016
2017 / Developing a life-long appreciation of the arts and humanities / Students will be able to: (1) Recognize works of literature or fine arts and place them in their historical, cultural, and social contexts ☐ (2) Interpret works of fine arts or literature ☐ / Monday, October 2, 2017
2018 / Developing a strong foundation in the social sciences / Students will be able to: (1) Explain the processes and effects of individual and group behavior ☐ (2) Analyze events in terms of the concepts and relational propositions generated by the social science tradition ☐ / Monday, October 1, 2018
2019 / Using science to accomplish common goals / Students will be able to: Understand concepts of science as they apply to contemporary issues ☐ / Monday, October 7, 2019

5. Connection – Briefly explain how the learning outcome you selected above relates to the discipline of the course being assessed, i.e., how does this course speak to the proposed learning outcome?

Composition as a discipline inherently focuses on communicating arguments and ideas and on using rhetoric intentionally and skillfully. It is the charge of our Composition I offerings to ensure that students are able to construct and deliver a well-organized, logical, and informative oral or written presentation, accurately documented, that demonstrates proficiency in standard American English.


6. Assessment Instrument Description – (Briefly describe the instrument, including how it is a valid measure of the outcome. Submit the actual instrument at the end of the document under the Appendix; this was originally #10 on the previous form.)

As our assessment instrument, we designed a rubric to evaluate Composition I students’ written work. Composition faculty worked together at our pre-semester assessment workshop to decide what constitutes “effective communication [in writing]” and, more specifically, what constitutes a “well-organized, logical, and informative written presentation.” The rubric includes collaboratively written narrative step-down language intended to codify performance indicators. This rubric (used 2014-2015) asked raters to score each piece of writing on a scale of one to four (with four being the highest) in three categories that represent a hierarchy of elements: content and thesis, organization and coherence, and style and mechanics (see Appendix 1 for the rubric).

Then, with the help of ITTC, we created an online repository to collect students’ essays so we can distribute them among faculty to read at our assessment workshop. For the 2016-2017 academic year, based on faculty feedback at the 2015 assessment workshop, we added a section, collaboratively created by faculty members, to assess students’ understanding of MLA. As with the other rubric categories, we used narrative step-down language to codify performance indicators (see Appendix 1).

6.1 – Identify and explain deviations, if any, from the approved assessment instrument.

There was no deviation from the approved assessment instrument.

7. Benchmark (What is the expected level of student proficiency related to the learning outcome?)

In Composition I, we hope at least half of our students will earn scores of three or four in each of the three areas: content and thesis, organization and coherence, and style and mechanics (i.e., that students will be proficient in “communicating effectively [through writing]”). We also hope at least 80% of students will receive scores of 2 or better in each category and thus be considered “approaching” or “emergent.” We hope to see these scores improve in Composition II, while recognizing that the work we assign students in Composition II is more complex and intellectually rigorous.

7.1 – Identify and explain deviations, if any, from the approved benchmark.

While there was no deviation from the approved benchmark, we do hope to use the data from this report to develop more quantitative and defined benchmarks for the QRIII.

8. Data Collection Process (Describe the data collection process and any planned sampling strategies. Consider the following items: term/s, section/s, location/s, modalities, and the sampling process. The data collection process should ultimately include all students taking a general education course or give all students taking the general education course an equal probability (i.e., random sampling) of being included in the data sample. This includes the Paragould campus courses, concurrent credit courses, and online, web-assisted, and traditional course formats.)

For the last two fall semesters, all A-State Jonesboro Composition I instructors, including concurrent faculty members, have been required to state in their Composition I syllabi that students must submit three required essays to the Composition Assessment website (http://ittc-web.astate.edu/comps/) at the same time they submit each final draft to the course instructor. Every fall, Composition I students submit a narrative essay, an analysis essay, and an argument essay, each ranging from 750 to 1,200 words (see Appendix 2 for expectations). One genre of these essays is then rated the following summer by Composition faculty using the common rubric that accounts for “fundamentals of written communication”: Content/Thesis, Organization/Coherence, and Style/Mechanics.

Based on feedback from the Gen. Ed. Committee, we plan to rate each genre for two cycles before moving on to the next genre (see Appendix 3 for a proposed schedule). For the 2014-2015 and 2015-2016 academic years, we assessed narrative. In 2016-2017 and 2017-2018, we will assess analysis. In addition to comparing writing across genres, once the Composition program has assessed a full cycle, we can begin comparative analysis of essays in the same genre over time (e.g., comparing the narrative essays of Fall 2014 and 2015 to the narrative essays of Fall 2020 and 2021). Since Comp I is a multi-genre course, we feel it is important to assess multiple genres so we know that students can write effectively across genres, not just in one. This will help ensure that we are meeting our goal of assessing general "effective communication" (through writing) rather than specifically students’ ability to write narrative essays.

Prior to formally rating essays, faculty members engage in norming to establish inter-rater reliability. Following norming, we calculate the sample size needed for a 95% confidence level with a +/- 10% confidence interval; each essay in the resulting representative sample, stripped of all identifying information, is then randomly distributed to two readers, who each read independently.

Currently, maintaining the online repository is onerous, and the system is too basic to accept submissions from different courses (i.e., Comp I and Comp II) concurrently while keeping their data separate. Our hope is that within the next year or two, through either the purchase of commercial software or the hiring of a full-time software engineer, we can build a system in which Comp I and Comp II students upload their essays to the same website and the Director of the Writing Program can personally load classes, instructors, and students, ensuring the system is ready to go each semester when we need it. Thus far, we have had to rely on other busy professionals with ITTC and IT to do this work for us. Once we have a fully functioning system, we are open to assessing both Comp I and II in the fall and the spring if the committee thinks it best.

In the meantime, however, we are fairly confident that assessing Comp I in the fall semester is the best plan for now, since, between our traditional and concurrent students, our representative sample is drawn from more than 80% of Composition I’s annual enrollment (Fall 2015 Composition I enrollment was 991, versus 208 in Spring 2016). Additionally, several students who take Composition I in the spring are repeating the course after not completing it successfully the previous semester, and we currently do not have a system in place to ensure that we are not evaluating these students twice.

8.1 – Identify and explain deviations, if any, from the approved data collection process.

There was no deviation from the approved data collection process.

9. Planned Number of Observations (To the best of your ability, estimate the number of observations expected from the data collection process for the reporting period. Example: 120 expected observations (30 students per year for 4 years))

For a 90% confidence level with a +/- 10% confidence interval, we needed to rate a representative sample of approximately 60 essays from Fall 2014 Composition I.

9.1 – Identify and explain deviations, if any, from the planned number of observations.

We actually achieved a better confidence level and interval than the Assessment Committee had approved. We reached a 95% confidence level with a confidence interval of +/- 10% by assessing 82 essays (two scores per essay, for a total of 164 observations) from Fall 2014’s 496 submissions and 78 essays (two scores per essay, for a total of 155 observations; one was corrupted) from Fall 2015’s 434 submissions.
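These sample sizes are consistent with the standard Cochran sample-size formula with a finite-population correction. The following sketch is our illustration only, not part of the approved plan; the function name is ours:

```python
import math

def sample_size(population, z, margin, p=0.5):
    """Cochran's sample-size formula with finite-population correction.

    z is the z-score for the desired confidence level (1.96 for 95%,
    1.645 for 90%); margin is the confidence interval half-width;
    p=0.5 is the most conservative (largest-sample) assumption.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# 95% confidence, +/- 10% interval, Fall 2014 population of 496 essays
print(sample_size(496, 1.96, 0.10))   # ~81, close to the 82 essays rated
# 90% confidence, the level in the approved plan
print(sample_size(496, 1.645, 0.10))  # ~60, matching the planned sample
```

Using p = 0.5 maximizes the required sample, so the calculated sizes hold regardless of how scores are actually distributed.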

10. Results and Analysis of Assessment

10.1 – Results (What did you find? As appropriate, report both item and aggregate results)

Table 1: Composition I—Narrative Essay—Fall 2014

Frequencies/Percent

Score / 4 / 3 / 2 / 1
Content and Thesis / 12.2% / 40.9% / 30.5% / 16.5%
Organization and Coherence / 15.2% / 37.2% / 36.0% / 11.6%
Style and Mechanics / 14.0% / 52.4% / 27.4% / 6.1%

Table 2: Composition I—Narrative Essay—Fall 2014

Frequencies/Percent—Total Rubric Score

Total Score / 12 / 11 / 10 / 9 / 8 / 7 / 6 / 5 / 4 / 3
Percent / 4.3% / 6.7% / 9.8% / 20.1% / 13.4% / 17.7% / 14.6% / 6.7% / 3.0% / 3.7%

Table 3: Composition I—Narrative Essay—Fall 2015

Frequencies/Percent

Score / 4 / 3 / 2 / 1
Content and Thesis / 17.4% / 48.4% / 31.0% / 3.2%
Organization and Coherence / 16.8% / 45.2% / 37.4% / 0.6%
Style and Mechanics / 10.3% / 49.7% / 36.1% / 3.9%
MLA / 35.5% / 40.6% / 18.7% / 5.2%

Table 4: Composition I—Narrative Essay—Fall 2015

Frequencies/Percent—Total Rubric Score

Total Score / 16 / 15 / 14 / 13 / 12 / 11 / 10 / 9 / 8 / 7
Percent / 2.6% / 9.0% / 8.4% / 9.7% / 11.0% / 20.0% / 20.0% / 7.7% / 8.4% / 3.2%

10.2 – Analysis (How did you interpret what you found, i.e. what are your conclusions?)

In Composition I, we hoped at least 50% of our students would earn a score of three or four in each of the three areas and at least 80% of our students would score 2 or better in each of the three areas: content and thesis, organization and coherence, and style and mechanics.

We were pleasantly surprised to see that in Fall 2014, 53.1% of students scored 3 or above in content and thesis, 52.4% scored 3 or above in organization and coherence, and 66.4% scored 3 or above in style and mechanics, though the last figure may be high in part because MLA formatting was, at the time, folded into style and mechanics, and it is quite simple to format a narrative essay in MLA style. Overall, in Fall 2014, 54.3% of students received an average score of 3 or above and 93.3% scored 2 or above, which exceeds our benchmark. Though I am concerned by the number of scores of 1 in the Fall 2014 data, I am also aware that the Fall 2014 data was compiled just one year after the Writing Program was re-established, when there were only two Rhetoric and Composition specialists on staff.

We were also pleasantly surprised to see that in Fall 2015, 65.8% scored 3 or above in content and thesis, 62% scored 3 or above in organization and coherence, 60% scored 3 or above in style and mechanics, and 76.1% scored 3 or above in our new category, MLA. Overall, in Fall 2015, 88.4% of students received an average score of 3 or above and 96.8% scored 2 or above. There were very few scores of 1, which is consistent with what we see in the classroom.
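The “3 or above” figures quoted in this section follow directly from Tables 1 and 3 by summing the frequencies for scores of 4 and 3 in each category. A quick check for Fall 2014 (our sketch; the variable names are ours):

```python
# Score frequencies from Table 1, as (4, 3, 2, 1) percentages per category.
fall_2014 = {
    "Content and Thesis": (12.2, 40.9, 30.5, 16.5),
    "Organization and Coherence": (15.2, 37.2, 36.0, 11.6),
    "Style and Mechanics": (14.0, 52.4, 27.4, 6.1),
}

for category, (four, three, two, one) in fall_2014.items():
    # "3 or above" is simply the sum of the 4- and 3-score frequencies.
    print(f"{category}: {four + three:.1f}% scored 3 or above")
# Prints 53.1%, 52.4%, and 66.4%, matching the figures cited above.
```

The same summation over Table 3 reproduces the Fall 2015 figures (65.8%, 62.0%, 60.0%, and 76.1% for MLA).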

I would like to attribute the improved scores between 2014 and 2015 to the newly hired composition specialists (we now have four), the additional time spent norming essays together, the new professional development and resources provided for Composition faculty, the increased standardization of our courses, and, as a result of all of these factors, better teaching; indeed, I do posit that all of these factors have combined to improve the quality of our students’ writing and of the teaching taking place in our Composition courses. I am also aware, however, that the increased participation of our graduate assistants in rating the essays (graduate students were 14% of raters in 2014 and 25% in 2015) might partly account for the higher scores on the narrative essays, since many of our graduate students are creative writers. This possibility is further suggested by raters more often reporting, on the confidential post-assessment workshop survey, that they felt their scores ran higher than their colleagues’ in Fall 2016 than in Fall 2015 (see Appendix 6).

In sum, I believe the data further illustrates what I see in our meetings and workshops and when observing Composition faculty: writing instruction on this campus has improved, and, as a result, the quality of the writing produced in our Composition courses is improving as well.

11. Action Plan

We first plan to encourage more faculty members to use the assessment rubric in classroom settings for each genre, both through in-class norming and for grading purposes. This will not only help faculty members become more comfortable using the rubric and help students understand how assessment works and how they are scored (if their essays are selected); it will also help ensure that Composition faculty keep in mind the shared goals, outcomes, and expectations of the course (i.e., to teach students “effective communication [in writing]” and, more specifically, what constitutes a “well-organized, logical, and informative written presentation”).

As our inter-rater reliability (IRR) numbers suggest, the narrative essays presented a unique challenge for assessment, as they tend to be more creative or subjective. To increase our IRR and come to a more shared understanding of what constitutes effective narrative, we plan to:

  1. engage in further norming for the narrative essay with faculty;
  2. add additional student narrative samples to Pack Prints, our anthology of student writing that serves as a textbook for Comp I and II;
  3. provide more resources for faculty for teaching narrative on the Composition Instructor Network, the online repository we’ve set up on Blackboard that contains sample Composition I and II assignments, readings, and lesson plans;
  4. engage in discussions at our pre-semester workshop about ways to interpret and assess characteristics of a narrative and how they might apply in different ways according to different instructors; and
  5. continue to revisit and, if needed, revise our rubric to better apply to writing across genres, including narratives.
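One standard way to quantify the agreement between our two independent readers is Cohen’s kappa, which corrects raw agreement for the agreement expected by chance. The actual IRR statistics we report are in Appendix 5; this sketch, with hypothetical scores and a function name of our own, is for illustration only:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical scores (e.g., 1-4)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of essays where the two scores match.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal score frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[s] * counts_b[s]
                   for s in set(rater_a) | set(rater_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores from two raters on the same ten essays
a = [3, 3, 2, 4, 3, 2, 1, 3, 4, 2]
b = [3, 2, 2, 4, 3, 2, 2, 3, 4, 3]
print(round(cohens_kappa(a, b), 2))  # → 0.56 (moderate agreement)
```

A kappa near 1 indicates near-perfect agreement; values in the 0.4-0.6 range are the kind of “moderate” agreement that motivates the additional norming described above.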

We already anticipate that the way we had originally conceptualized analysis, which was kept broad to allow for the utmost faculty autonomy, might produce similar IRR issues and dissimilar experiences for students. We discussed this at the 2016 pre-semester workshop and have, in response, standardized aspects of the assignment across faculty. From here on, the analysis assignment will ask students to write a thesis-driven essay focusing on one text (some analyses in the past were process analyses or comparative analyses) and to use this text to demonstrate their ability to craft a well-developed and coherent thesis; effectively use supporting evidence; engage alternative points of view; recognize the audience and exigency of the situation; understand the features of the genre they are analyzing; expose logical fallacies; and demonstrate an understanding of the rhetorical devices in play. Finally, we are also working to include more resources on the Composition Instructor Network and more analysis instruction in Pack Prints.

11.1 – Who has been involved in the action plan?

Ideas for the action plan were generated and agreed upon by the entire faculty present at the pre-semester Composition workshop and then written up by Airek Beauchamp and Kristi Costello for inclusion in this report.

APPENDIX

  1. Assessment Instrument
  2. MLA Section of Rubric
  3. Composition Outcomes, Assignments, and Best Practices
  4. Composition Assessment Schedule
  5. Inter-Rater Reliability Statistics
  6. Essay Rater Survey Data and Analysis
  7. Pre-Semester Workshop Agenda
  8. Screenshot of Assessment Website

APPENDIX 1: ASSESSMENT INSTRUMENT

Table 5: Assessment Rubric

Content and Thesis

Score 4: The writer articulates the thesis clearly and presents cogent evidence in favor of his or her thesis in every paragraph. Or, in the case of narrative essays, the author’s main idea is clearly and expertly woven and supported throughout the essay, though the thesis itself may be implied. The content of the paper challenges the intelligence and sophistication of a college audience and is clear to readers beyond the classroom.

Score 3: The writer states the thesis reasonably clearly—the reader does not need to guess or even to infer the paper’s thesis or main idea—and supports the argument with solid evidence and reasons or pertinent experience/s (in the case of narratives). In one or two spots the evidence may seem flimsy, the argument one-sided, or the student may have included a couple of details that were unnecessary, but overall the writer presents a careful, sound, and convincing argument or main idea. The content of the paper is thoughtful and engaging to a college audience as well as clear to readers beyond the writer’s classroom. Additionally, the student may have tried to take content and/or rhetorical risks that were incredibly close to paying off.

Score 2: