Frostburg State University

Redesign of College Algebra

Carnegie Course Redesign Final Report

Cohort 2

PART ONE: IMPACT ON STUDENT LEARNING

Improved Learning

Our measures of educational quality related to the two types of students taking College Algebra: those taking it only to meet a general education requirement, and those taking it as a prerequisite for a later course.

General Education Students

To assess the effects of the redesign on students of the first type, we used ten questions common to the final exams for all sections of the course, both redesigned and traditional. The ten common questions appear below.

1. Solve the equation |2 – 3x| + 2 = 9.

2. Solve the equation 2x³ + 5x² – 8x – 20 = 0.

3. Solve the equation √(12x + 3) + 4 = 2x.

4. Solve the equation x⁶ – 7x³ – 8 = 0.

5. Solve the inequality for x and graph the solution set. 2/3 ≤ (1 – x)/4 ≤ 6/5.

6. A 1000-acre farm in Illinois is used to grow corn and soybeans. The cost per acre for raising corn is $65, and the cost per acre for growing soybeans is $45. If $54,325 has been budgeted for costs and all the acreage is to be used, how many acres should be allocated for each crop?

7. Callaway Golf Corporation has determined that the cost C of manufacturing x Big Bertha golf clubs may be expressed by the quadratic function

C(x) = 4.9x² – 617.4x + 19,600.

How many golf clubs should be manufactured to minimize cost, and what is that minimum cost?

8. A child's grandparents purchase a $10,000 bond that matures in 18 years to be used for the child's college education. The bond fund pays 4% interest compounded semiannually. How much will the bond be worth at maturity?

9. The function f given by the formula f(x) = (2 – x)/(3 + x) is one-to-one. Find the formula for its inverse, f⁻¹, and verify that your answer is correct by finding and simplifying (f⁻¹ ∘ f)(x).

10. Determine the formula for the quadratic function that has the given graph.

All exams were administered on the same day and at the same time, and the instructors for all sections shared the grading: each instructor graded two of the ten common questions and graded those questions for all students. Each question was graded on a scale from zero to ten. Students’ names appeared only on the first pages of the tests, which contained only two questions, so any instructor bias in grading was minimal.

Students in the redesigned sections who had not passed all four module exams were not eligible to take the final exam, so we have no data for students who failed the course because they did not pass earlier material. Nearly everyone in the redesigned sections who was eligible for the final went on to pass the course, so the comparisons below are restricted to students in each type of section who passed the course. Here are the averages for those students.

Question / Traditional Average / Redesigned Average / Difference / t-test p-value
1. Absolute Value Equation / 7.5 / 8.5 / 1.0 / 0.04608
2. Solve by Factoring / 7.4 / 8.7 / 1.2 / 0.03023
3. Radical Equation / 7.5 / 7.9 / 0.5 / 0.22361
4. Quadratic in Form / 6.3 / 6.9 / 0.5 / 0.24908
5. Inequality / 6.9 / 7.2 / 0.3 / 0.33149
6. Word Problem / 8.2 / 6.6 / –1.6 / 0.02159
7. Max/min / 7.1 / 7.0 / –0.1 / 0.42338
8. Compound Interest / 8.4 / 7.7 / –0.7 / 0.15214
9. Inverse Function / 5.7 / 5.8 / 0.1 / 0.45097
10. Quadratic Function / 5.9 / 6.6 / 0.7 / 0.17260
Totals / 71.0 / 72.9 / 1.9 / 0.32094

So, students in the redesigned sections performed statistically significantly better on two questions and were outperformed significantly on one question. They did outperform students in the traditional sections on five of the seven remaining questions, but those differences could have been due to chance.
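The per-question comparisons above (and the diagnostic comparisons later in this report) are two-sample t-tests on the graded scores. The individual scores are not reproduced here, so the sketch below uses hypothetical placeholder score lists purely to illustrate the computation; the use of scipy and of Welch's t-test is an assumption about tooling, not a description of the original analysis.

    # Illustrative only: a two-sample (Welch) t-test of the kind behind the
    # p-values above. The score lists are hypothetical placeholders; the real
    # inputs would be the 0-10 scores assigned to each student's answer.
    from scipy import stats

    traditional_scores = [7, 8, 6, 9, 7, 8, 10, 5, 7, 8]   # hypothetical
    redesigned_scores = [9, 8, 10, 7, 9, 8, 10, 8, 9, 7]   # hypothetical

    t_stat, p_value = stats.ttest_ind(traditional_scores, redesigned_scores,
                                      equal_var=False)
    print(f"t = {t_stat:.3f}, two-tailed p = {p_value:.5f}")

Whether the tabulated p-values are one- or two-tailed is not stated; if they are one-tailed, the two-tailed value printed above would be halved when the difference falls in the expected direction.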

In summary, there is not much evidence to say that students from the redesigned sections performed better than their counterparts. This is less than we had hoped for but, on the positive side, they definitely did not do any worse.

Students Headed to Courses for which College Algebra is a Prerequisite

To explore the effect of the redesign on students of the second type (those taking the course as a prerequisite for another), we will use several tools: a diagnostic test at the beginning of the term in Precalculus, questions on the final exam in Precalculus, and a diagnostic test at the beginning of the term in Calculus. We will also look at the passing rates in those two courses. So far, only the diagnostic exam in Precalculus has been used. That diagnostic test had the following contents, most of which the careful reader will recognize.

Solve the following equations.

1. |2 – 3x| + 2 = 9

2. 2x³ + 5x² – 8x – 20 = 0

3. √(12x + 3) + 4 = 2x

4. x⁶ – 2x³ – 8 = 0

5. Solve for x and graph the solution set. 2/3 ≤ (1 – x)/4 ≤ 6/5.

6. A 1000-acre farm in Illinois is used to grow corn and soybeans. The cost per acre for raising corn is $65, and the cost per acre for growing soybeans is $45. If $52,000 has been budgeted for costs and all the acreage is to be used, how many acres should be allocated for each crop?

7. Construct the quadratic function that has the given graph (that is, find its formula).

A wide variety of comparisons could be made based on the results of this exam. We will limit ourselves to comparing (1) students from redesigned courses (FSU-R) with students from traditional courses taught at FSU (FSU-T), (2) FSU-R with students who took a traditional course at FSU only one term ago (FSU-T1), to control for the effect of elapsed time on retention, and (3) FSU-R with all other students. Here are the pertinent results.

FSU-R versus FSU-T
Question averages / 1 / 2 / 3 / 4 / 5 / 6 / 7 / Total
FSU-R (n = 10) / 8.2 / 6.8 / 7.6 / 6.4 / 7.6 / 6.6 / 3.2 / 46.4
FSU-T (n = 23) / 5.1 / 4.6 / 4.4 / 4.6 / 4.0 / 5.9 / 1.8 / 30.3
Difference / 3.1 / 2.2 / 3.2 / 1.8 / 3.6 / 0.7 / 1.4 / 16.1
p-value (t-test) / 0.00579 / 0.07694 / 0.00017 / 0.12078 / 0.00054 / 0.32484 / 0.17711 / 0.00152

On three of these seven questions (p < 0.05), as well as overall, FSU students from redesigned sections did statistically significantly better than FSU students from traditional sections.

FSU-R versus FSU-T1
Question averages / 1 / 2 / 3 / 4 / 5 / 6 / 7 / Total
FSU-R (n = 10) / 8.2 / 6.8 / 7.6 / 6.4 / 7.6 / 6.6 / 3.2 / 46.4
FSU-T1 (n = 14) / 5.8 / 5.4 / 5.6 / 5.5 / 4.0 / 6.9 / 2.1 / 35.2
Difference / 2.4 / 1.4 / 2.0 / 0.9 / 3.6 / -0.3 / 1.1 / 11.2
p-value (t-test) / 0.03014 / 0.21170 / 0.02157 / 0.29634 / 0.00344 / 0.43559 / 0.24428 / 0.02841

Adjusting for the time elapsed since taking College Algebra, FSU students from redesigned sections still outperformed FSU students from traditional sections, but significantly on only three of the seven questions and on the total. Note that FSU-T1 students actually outperformed FSU-R students (though not at a statistically significant level) on the word problem. This is consistent with the results from the common questions on the final exam.

FSU-R versus All Others
Question averages / 1 / 2 / 3 / 4 / 5 / 6 / 7 / Total
FSU-R (n = 10) / 8.2 / 6.8 / 7.6 / 6.4 / 7.6 / 6.6 / 3.2 / 46.4
Other (n = 44) / 5.1 / 3.7 / 3.9 / 3.6 / 2.9 / 4.4 / 1.8 / 25.5
Difference / 3.1 / 3.1 / 3.7 / 2.8 / 4.7 / 2.2 / 1.4 / 20.9
p-value (t-test) / 0.004764 / 0.018516 / 0.0000001 / 0.029281 / 0.000005 / 0.074812 / 0.172018 / 0.000082

In summary, the redesigned course seems to have done a noticeably better job of preparing students for subsequent work in mathematics.

Improved Retention

During both the pilot and the full implementation, the redesigned course fared worse than the traditional course in terms of DFW rates. Here are the details.

The passing rate over the period from Fall term of 2000 through Spring term of 2011 was 54.25%, with a minimum rate of 36.76% in Spring 2007 and a maximum rate of 65.60% in Spring 2000. Grade distributions during the pilot phase were as follows.

Grade Distributions
Grade / Traditional / Redesigned
A / 10 / 3
B / 9 / 10
C / 27 / 25
D / 3 / 0
F / 14 / 29
Other / 19 / 38
Total / 82 / 105

These yield passing rates of 56.10% for the traditional sections and only 36.19% for the redesigned sections. To test for statistical significance the Chi-square procedure was used for three different comparisons.

Traditional Versus Redesigned

Expected values, based on the passing percentage for the traditional sections applied to the 105 students in the redesigned sections, are (46/82) × 105 ≈ 58.90 passing and (36/82) × 105 ≈ 46.10 other. Observed values were 38 and 67. The p-value for the Chi-square test was 0.00004.
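This result can be checked directly from the counts above; the following sketch (using scipy, which is an assumption about tooling rather than the software used in the original analysis) reproduces a p-value of about 0.00004.

    # Chi-square goodness-of-fit: redesigned-section outcomes versus what the
    # traditional-section passing rate would predict for 105 students.
    from scipy.stats import chisquare

    observed = [38, 67]                             # passing, other (redesigned)
    expected = [(46 / 82) * 105, (36 / 82) * 105]   # about 58.90 and 46.10
    chi2, p = chisquare(f_obs=observed, f_exp=expected)
    print(f"chi-square = {chi2:.2f}, p = {p:.5f}")

The same two-line computation, with the expected values swapped in, applies to the two comparisons that follow.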

Historical Versus Redesigned

Expected values, based on the passing rate from Fall 2000 to Spring 2011 and the 105 students in the redesigned sections, are 0.5425 × 105 = 56.9625 passing and 0.4575 × 105 = 48.0375 other. Observed values were 38 and 67. The p-value for the Chi-square test was 0.0002.

Historical Versus Traditional

Just for completeness, we compared expected values based on the passing rate from Fall 2000 to Spring 2011 with the observed values from the traditional sections. Expected values in that case are 44.4850 passes and 30.1432 other. The observed values were 46 and 33, respectively, yielding a p-value of 0.57.

In summary, the redesigned sections had a much lower passing rate, and it is very unlikely that this difference is due to chance.

For full implementation, we introduced a few changes intended to address this poor passing rate. The grade distributions at the end of each module looked very promising going into the final exam.

We gave essentially the same final exam at the end of the full-implementation term as we had during the pilot, and the results were surprising: the overall passing rate was a very disappointing 46.25%. It seemed as if hardly anyone had prepared for the final exam.

So, to summarize, although we did raise the passing rate during full implementation, that rate still needs considerable improvement. We have made some changes aimed at addressing this problem and hope to see real improvement by the end of this term.

PART TWO: IMPACT ON COST SAVINGS

Savings to the Institution

We have not yet fully implemented our cost-savings plan. Until we have a redesign that meets our original goal of an improved passing rate, we intend to staff the course with the original full-time faculty members who were part of the redesign team. (The adjunct faculty member who was originally part of the team had to withdraw for unrelated medical reasons.) As a result, we have yet to save the institution any money. The included Course Planning Tool (see the appendix) was prepared under the assumption that we will eventually meet this goal. If and when that happens, we will begin saving the institution approximately $26,000 per year.

Savings to the Students

Unless we can get that passing rate up, we will have saved our students next to nothing. At the latest DFW rate of 54%, we have actually raised our students’ average cost by $113. If we can get the passing rate up to the long-term average of 54.25%, we will save our students $70 on average. If, as is our goal, we can raise the passing rate to a modest 65%, we will save our students approximately $260 per term. The appended MD Cost Saving spreadsheet shows the actual loss to our students.
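The dollar figures above come from the MD Cost Saving spreadsheet; the sketch below only illustrates the kind of arithmetic involved. The $70 materials saving and $600 per-attempt cost are hypothetical placeholders rather than the spreadsheet's inputs, so the outputs will not match the amounts quoted above.

    # Illustrative model: the redesign saves on course materials, but a lower
    # passing rate raises the expected number of paid attempts (about 1/pass rate).
    def per_student_saving(materials_saving, cost_per_attempt,
                           old_pass_rate, new_pass_rate):
        extra_retake_cost = cost_per_attempt * (1 / new_pass_rate - 1 / old_pass_rate)
        return materials_saving - extra_retake_cost

    # Hypothetical inputs: $70 materials saving, $600 cost per attempt.
    print(per_student_saving(70, 600, 0.5425, 0.4625))  # current rate: a net loss
    print(per_student_saving(70, 600, 0.5425, 0.65))    # target rate: a net saving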

For the sake of both the institution and the students, we plan to continue making adjustments to our redesign for a while until either we succeed in this goal or it becomes apparent that we have failed.

PART THREE: LESSONS LEARNED

Pedagogical Improvement Techniques

·  Requiring students to complete all assignments at specified mastery levels. We believe this is by far the main contributor to improved student learning. Students who probably would have passed anyway did better in later courses (see above).

Cost Reduction Techniques

·  Use of lab assistants. This made it possible to increase section sizes, as described in the next item.

·  Increasing the size of sections. This enabled us to deliver the material to the same number of students using about half as many full-time and adjunct faculty.

·  Use of software. Although this did not lower costs for every student, those who would otherwise have purchased a new textbook saved a substantial amount.

Implementation Issues

·  Giving too much help to students. Initially, we made all of the software's online assistance available to students on their homework assignments. We found this to be counterproductive, as many students got into the habit of relying too heavily on the examples provided by the software: they would request an example and work their exercise in close parallel with it, and after getting the right answer they would remember nothing of the process. Currently, online help on homework is limited to access to the textbook and emailing the instructor for assistance.

·  Allowing an unlimited number of attempts at homework exercises. Although this would seem to mirror the traditional setting, wherein a student could attempt a homework exercise as many times as needed to get it right, we found this not to be the case. In the worst cases, students would knowingly enter incorrect answers just to see (and write down) the actual answer. Then they would cycle through all the available variations of the exercise until they recognized one for which they had the answer written down. We have changed to allowing only a small number of attempts at any exercise, after which students are required to see the instructor for individual help.