CHEM 351 at UMBC
Interim Report - September 2013

Course Redesign Interim Report II

I. Brief description of the course

·  CHEM 351, Organic Chemistry I at UMBC

·  Annual enrollment: 600-700 students, primarily sophomores

·  Required for multiple majors, notably BIOL, BIOC, CENG, CHEM, CHED

·  Required for pre-health professional students

II. Issues that needed to be resolved

·  DFW rates (ranging from 25% to 50% and generally increasing in recent years)

·  Course drift (in delivery, not content)

·  Student perception (fear)

III. Choice of redesign model = Supplemental

·  Pedagogical model = “flipped” classroom

o  Additional on-line assignments, readings, video for pre-class preparation

o  3 weekly meetings maintained for active problem-solving and discussion

·  Targeted, on-demand help

o  On-line assignments for post-class mastery

o  Undergraduate learning assistants in and out of class

·  Focused effort to support student 3D visualizing ability

o  Increased use of traditional models by students & faculty

o  Increased use of computer models by students & faculty

IV. Description of pilot

·  Pilot, on-semester cohort - completed

o  2 traditional sections (N=454, Fall 2011)

o  2 redesign sections (N=481, Fall 2012)

·  Pilot, off-semester cohort - completed

o  1 traditional section (N=170, Spring 2012)

o  1 redesign section (N=186, Spring 2013)

·  Assessment Plan

o  DFW rates

o  Hourly exam scores (overall and by question)

o  Final exam scores

o  Student Surveys (midpoint & end of semester)

o  Faculty Interviews

V. Results of Pilot I, Fall 2012: anecdotal, qualitative and quantitative

·  DFW rates: The DFW rate did not change significantly (25.6% in Fall 2011 and 24.1% in Fall 2012). There was some shift in the distribution of passing grades; the largest shift was in the number of A grades, which increased from 13.7% to 18.1%. These results are summarized in the following chart.

·  Hourly exam scores: The overall exam averages are represented in the following chart. More detailed analysis of performance on individual questions and learning objectives reinforces the overall trend. The biggest difference is seen in the Exam 2 scores; the students in the pilot Fall 2012 course performed >10% better than the students in the traditional Fall 2011 course. Of note, this exam covered stereochemistry, the material most closely linked to 3D visualizing ability and a content area targeted in the redesign. Not evident from the chart is the trend in the standard deviation: in both courses, the standard deviation increased from Exam 1 to Exam 4, and this effect is more pronounced in the Fall 2012 pilot course.

·  Final exam scores: These are not particularly relevant for comparing the pilot to the traditional course since the exam format was completely different. The pilot course used a standardized national exam which will be useful for comparing the pilot to future offerings of the course.

·  Student Surveys: Students were asked for their feedback at the mid-point and at the conclusion of the semester.

o  The majority of students (76%) indicate that the course takes more time than their other STEM courses, but they do not report that it is too much or more than they expected.

o  Students report that they are interested in the course material (53.6% very or extremely interested), although student confidence in the material lags behind their interest (34.9% very or extremely confident).

o  Among students who were repeating the course, 89.8% indicated that the course structure was better in Fall 2012 than in their previous experience.

o  A majority of students (50.9%) indicated that they enjoy learning about chemistry more than they did before taking CHEM 351, while 32% were neutral on this topic.

·  Faculty Interviews: Formal interviews were conducted by the evaluator.

o  The on-line textbook resources presented several challenges that demanded more time than anticipated

§  The students had trouble learning the molecular drawing software, which generated a large volume of extra e-mail early on and continued to do so throughout the semester.

§  The on-line resources were not well vetted; errors were common, and anticipating and reporting them to the publisher placed an extra burden on the faculty.

o  The quality of student questions in class was better than in previous offerings of the course (this was also noted by the undergraduate Learning Assistants).

o  We were able to work through and discuss problems in class that we had never gotten to in previous semesters except as end-of-chapter questions.

o  The spread between students at the top and bottom is greater than in previous semesters.

§  Some students are taking advantage of the extra resources and flourishing (those who come to class prepared are getting more out of class time).

§  Some students are falling through the cracks (those who don’t read the textbook are missing more than in the past because we are not lecturing and they are unable to understand/contribute to the class discussion).

o  Faculty satisfaction was high - faculty reported being energized and very positive about the level of student interaction & engagement.

o  The Learning Assistants have benefited and learned from this experience; one of them is now a chemistry education major. We collected data from them informally in our weekly meetings, but a more formalized survey would be a good addition.

·  Important Issues to be resolved

o  How can we persuade lower-performing students to engage with the material earlier, before it is too late to catch up?

§  Student maturity and willingness to take responsibility for one's own education are key to success in this “flipped” model; is there something else we can do?

§  What is the appropriate level of accountability in the Pre-Class Assignments (high, low, no stakes)?

§  What is the appropriate level of accountability in the class discussion (high, low, no stakes)?

o  Will we be able to stay with the textbook (will the on-line issues be resolved)?

VI. Results of Pilot II, Spring 2013: anecdotal, qualitative and quantitative

·  Changes based on experiences in the fall semester pilot

o  Increased the scaffolding of concepts to be sure that students have extracted the necessary information from their pre-class preparation.

§  Incorporated more conceptual questions for in-class discussion

§  Incorporated more “lecture” time to reinforce key concepts

o  Increased the stakes for pre-class preparation to be sure that students are ready to participate in class discussion

§  Pre-class assignments were limited to three attempts, compared to the unlimited attempts allowed in the initial fall pilot.

§  The number of reading assessment questions was increased from the two used in the initial fall pilot to four.

·  Results - a direct comparison of the two pilot sections (Fall 2012 versus Spring 2013) is not particularly useful due to the differing student populations. In particular, close to 50% of students in the spring semester are repeating the course, whereas that figure is less than 10% in the fall semester. All comparisons below use the Spring 2012 course as a baseline.

o  Grades awarded and DFW rates are somewhat improved.

§  As: spring ’12 – 13%; spring ’13 – 17%

§  Bs: spring ’12 – 22%; spring ’13 – 21%

§  Cs: spring ’12 – 33%; spring ’13 – 37%

§  DFW: spring ’12 – 33%; spring ’13 – 25%

o  Test scores improved in three out of four midterm exams.

§  Exam 1: spring ’12 – 61.8; spring ’13 – 69.2

§  Exam 2: spring ’12 – 55.9; spring ’13 – 63.4

§  Exam 3: spring ’12 – 57.2; spring ’13 – 56.2

§  Exam 4: spring ’12 – 50.0; spring ’13 – 59.1

§  Final exams cannot be compared because of an error in exam administration

o  Final Exam Scores: These are not particularly relevant for comparing the pilot to the traditional course since the exam format was completely different.

o  Student survey data were very similar to the Fall 2012 pilot.

o  Faculty interviews revealed very similar comments to the Fall 2012 pilot.

VII. Plans for Full Implementation, Fall 2013

·  The instructors for both pilots (Fall 2012 & Spring 2013) are teaching in the fall to maximize the integration of best practices from both pilot semesters

o  Each of the two sections has a designated instructor, graduate technology TA and team of undergraduate Learning Assistants

o  The on-line resources (both before and after class) are uniform for both sections; the instructors are working together to create and/or update materials from the pilot

o  Hourly exams will be written by each instructor for his/her own section to account for natural variations in the in-class discussions with different student populations

o  Both sections will take the same final exam which will be directly comparable to the Fall 2012 pilot

·  Changes planned based on the pilot semesters

o  The changes that were begun in the Spring 2013 pilot will be continued in Fall 2013 implementation

§  The key goal is to continue to refine the questions that are used for in-class discussion, soliciting input on the effectiveness & value of individual questions from the undergraduate Learning Assistants who were students in Fall 2012

o  Mid-semester & end-of-semester evaluations will have 1-2 questions added to allow us to compare survey answers for key demographics

§  Gender – do both genders respond the same way to the class structure?

§  Transfers (both “new” and “old”) – in the pilot courses we did not distinguish between students who were recent transfers and students who had transferred at some point in the past

o  A new survey about student study habits and motivation for enrolling will allow instructors to understand their students earlier in the semester and address relevant issues at the outset of class

o  There is good reason to expect that the on-line resources will be more robust in Fall 2013 due to lessons learned about best practices and corrections made by the publisher and the Khan Academy – this should allow more time to be spent on content by both instructors & students.
