Running head: LONGITUDINAL EFFECTS OF CMP
A Longitudinal Outcomes Evaluation of the Connected Mathematics Project in Maryland Middle Schools
Michael J. Walk
University of Baltimore
A Longitudinal Outcomes Evaluation of the Connected Mathematics Project in Maryland Middle Schools
Part 1 - Introduction and Purpose
Gauging the effectiveness of new curricula is an important and ongoing task in today’s educational system. In particular, new programs based on the standards of the National Council of Teachers of Mathematics (NCTM; Bay, Reys, & Reys, 1999) must provide evidence that they are more effective than traditional programs in order to support their inclusion in mainstream education.
One such standards-based program, the Connected Mathematics Project (CMP), was started under a grant from the National Science Foundation, is housed at Michigan State University, and is authored by Fey, Fitzgerald, Friel, Lappan, and Phillips (CMP Website, 2006). The Maryland State Department of Education’s (MSDE) Deputy State Superintendent for Instruction and Academic Acceleration is sponsoring an evaluation of the effectiveness of CMP in ten randomly selected middle schools in the MSDE system. This study will assess whether or not CMP provides adequate benefit over traditional teaching methods to justify its large-scale implementation in Maryland’s public schools. Results from this evaluation will be used by the MSDE department of Instruction and Academic Acceleration, the teachers providing math instruction to MSDE students, and parents and families of Maryland students.
Because of Maryland’s varied distribution of geographic regions, ethnic groups, and socio-economic conditions, the state serves as an excellent proving ground in which to assess the benefit provided by CMP. The MSDE serves 851,640 students in grades K-12, and only 48% of its students are white (2007 Maryland Report Card). Maryland could also be well served by a program such as CMP, provided the program can improve mathematics performance among Maryland’s students. For example, in 2007, only 56.7% of Maryland’s eighth graders performed above the basic level of proficiency on the Maryland State Assessments (MSAs) (2007 Maryland Report Card). Even though this value is an increase over previous years, there is still a performance gap between what is desired by state standards (i.e., Adequate Yearly Progress, AYP) and the level of achievement of certain student groups in various middle schools. For example, in 2007, while 56.7% of eighth graders showed at least a proficient level of mathematics achievement (the AYP goal was 50%; School Improvement in Maryland website, 2007), only 35.2% of African American students and 43.3% of Hispanic students reached a proficient level of performance (2007 Maryland Report Card). (The cut-off score for proficient performance for eighth graders is 407 out of 800.) In addition, only 34.0% of free-and-reduced-lunch students and 24.7% of Title I students reached the proficient level of performance. To meet ever-increasing AYP targets, MSDE schools must quickly address the mathematics needs of their student populations and bring performance levels up to state expectations.
This evaluation will address several questions while controlling for pre-existing differences in school environment: (1) Do CMP students perform better in math than students taught through standard curricula? (2) Do CMP students gain mathematics ability at a faster rate than students taught through standard curricula? (3) Do CMP students perform better in Algebra 1 than non-CMP students? (4) Are the effects of the CMP math curriculum maintained over time?
Part 2 - Literature Review
Program Description
CMP’s overarching goal is stated on its website: “All students should be able to reason and communicate proficiently in mathematics. They should have knowledge of and skill in the use of the vocabulary, forms of representation, materials, tools, techniques, and intellectual methods of the discipline of mathematics, including the ability to define and solve problems with reason, insight, inventiveness and proficiency” (CMP Website, 2006). To this end, the authors have developed a problem-centered math curriculum for middle-school students (grades 6 through 8).
The problem-centered approach in CMP revolves around three stages: launch, explore, and summarize (What Works Clearinghouse Website, 2007). The What Works Clearinghouse (WWC) intervention report on CMP (WWC, 2007, p. 2) describes the tasks involved in each stage. In the launch stage, “The teacher launches the problem with the whole class, introduces new ideas, clarifies definitions, reviews old concepts, and connects the problem to students’ past experiences.” In the explore stage, “Students work individually, in pairs or small groups, or occasionally as a whole class to solve the problem.” And in the summarize stage, “Students discuss their solutions as well as the strategies that they used to approach the problem, organize the data, and find the solution.”
There are eight units per grade level, and each unit is centered on a key mathematical idea. The units are sub-divided into investigations, each of which contains a series of problems (WWC, 2007). The program addresses all five of the major math topics (i.e., data analysis and probability, number sense and operations, geometry, measurement, and algebra) within each grade level (Cain, 2002). And “Mathematical ideas are connected to each other and to the students’ real world outside of school” (Cain, 2002, p. 225). Currently, CMP is in use in over 2,400 school districts across all 50 states.
Program Effectiveness
The WWC reviews evaluation research in order to summarize and disseminate the level of effectiveness found for a wide variety of social and academic programs. In the WWC Intervention Report on CMP (WWC, 2007), the program was found to have mixed effects on mathematics achievement. Several studies were reviewed to reach this conclusion, some of which are discussed below.
For example, Ben-Chaim, Fey, Fitzgerald, Benedetto, and Miller (1998) investigated the proportional reasoning abilities of seventh graders. They compared the performance of students receiving one year of the CMP curriculum to that of students receiving the traditional curriculum and found that CMP students outperformed non-CMP students.
While Ben-Chaim et al. (1998) conducted a small-N, controlled study, most of the research on the effectiveness of CMP has been conducted across multiple school districts. Analyzing school-level data from standardized assessments, Cain (2002), Reys, Reys, Lapan, and Holliday (2003), and Riordan and Noyce (2001) all found that CMP schools outperformed non-CMP schools. These results generalized across sub-populations and test types.
Research by Ridgeway, Zawojewski, Hoover, and Lambdin (2003, as cited in CMP, 2006) actually found no differences between CMP and non-CMP schools with regard to basic skills (i.e., numbers and operations). However, they did find that CMP schools performed significantly better on problem-solving.
While the bulk of the evidence supports the claim that CMP is as effective as, if not more effective than, traditional mathematics curricula, there remains a great deal of research to be done in order to identify what boundaries limit the program’s benefit and to uncover what other variables might affect the relationship between CMP and mathematics achievement.
For example, when a new mathematics curriculum is introduced in a school, it is often accompanied by an increase in teacher development. That is, teachers have to be thoroughly trained on the new curriculum in order for it to be implemented correctly. Collins (2002) found that the level of professional development affected student outcomes in mathematics achievement. Therefore, it is important that evaluations of mathematics curricula determine how much of the impact of a given curriculum on mathematics achievement is due solely to the increase in teacher training. (For examples of studies that included teacher training as a predictor of achievement, see Collins, 2002; Reys et al., 2003; Riordan & Noyce, 2001.)
Evaluation Criteria
There are many different criteria available to measure mathematics achievement. Options include students’ math grades, a researcher-developed measure of mathematics achievement, or some form of standardized test. The MSDE administers the Maryland State Assessments (MSAs) to all students in grades three through eight in all of Maryland’s public schools (MSDE Website, 2003). The MSA classifies students by their level of proficiency (basic, proficient, or advanced). The cut-off scores for these categories increase as students enter higher grade levels (see Table 1 for all proficiency category cut-off scores by grade level). The MSAs have been administered since 2003, and students’ scores have been consistently increasing since the test’s inception; however, there is still room for improvement (see the earlier discussion of performance differences between demographic groups).
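As a concrete illustration, the Python sketch below maps a raw MSA math score to its proficiency category. The grade-specific cut-offs are placeholders standing in for the values in Table 1; only the eighth-grade proficient cut-off of 407 (out of 800) comes from the figures cited earlier.

```python
# Illustrative sketch only: cut-offs are placeholders for Table 1, except the
# eighth-grade "proficient" cut-off of 407, which is cited in the text above.
MSA_CUTOFFS = {
    8: {"proficient": 407, "advanced": 478},   # the advanced cut-off is a placeholder
}

def classify_msa(score: int, grade: int) -> str:
    """Return 'basic', 'proficient', or 'advanced' for an MSA math score at a given grade."""
    cuts = MSA_CUTOFFS[grade]
    if score >= cuts["advanced"]:
        return "advanced"
    if score >= cuts["proficient"]:
        return "proficient"
    return "basic"

print(classify_msa(412, grade=8))   # -> 'proficient'
```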
The MSA is an excellent criterion for the current study because the MSA is used by the MSDE as an indicator of meeting the requirements of the No Child Left Behind Act. While the test is relatively new, it has gone through several stages of pre-testing and design. Also, it contains selected problems from the Terra Nova test. Lastly, the MSA includes questions that require students to verbally explain their method for solving given problems (MSDE Website, 2003). Since CMP is centered on problem-solving techniques, it may be especially suited for preparing students for MSA success. Table 2 presents the parallels between the topical content of CMP and the testing content of the MSA. If CMP can improve students’ mathematics achievement in meaningful ways (measured by MSA score gains), then it will prove a useful tool in MSDE schools.
For assessing mathematics achievement in Algebra 1, the Maryland High School Assessments (HSAs) are administered to all MSDE high-school students. The Algebra 1 / Data Analysis HSA is taken immediately upon completion of Algebra 1, regardless of the student’s current grade level. Table 2 presents the goal statement of CMP and the expectations tested by the HSA. Examining these statements suggests a high level of correspondence between learning goals and testing outcomes.
Taking the Scholastic Aptitude Test (SAT) is a commonplace event in the life of any college-bound high-school student. In fact, in 2007, almost 1.5 million graduating seniors took the SAT (Collegeboard Website, 2007). The test is composed of three parts: reading, writing, and mathematics. While scores on the SAT-Mathematics test are far from perfect predictors of college-level mathematics performance (Bridgeman & Wendler, 1989), the scores are important in themselves because they often serve as admissions criteria and predictors of overall college success (Collegeboard Website, 2007). Therefore, the SAT-Mathematics will serve as an excellent evaluation criterion for the long-term effects of CMP.
Part 3 - Methodology
Team Members
In order to conduct this evaluation, the cooperation of several individuals will be required. The principal evaluation team will be composed of the author and two researchers from the MSDE department of Instruction and Academic Acceleration. The cooperation of mathematics teachers and school administrators will be needed in order to obtain all necessary data.
School Selection and Matching
CMP will be implemented in 10 randomly selected MSDE middle schools with a sixth- through eighth-grade organization (rather than, for example, a fifth- through seventh-grade or some other variant). Following the methodology of Reys, Reys, Lapan, Holliday, and Wasman (2003) and Riordan and Noyce (2001), school-level archival data (obtained from the Maryland Report Card website) will provide measures of the following characteristics: percentage of students receiving free/reduced lunch, percentage of minority students, and rate of student mobility (i.e., turnover per 100 students). Each school’s geographic region (i.e., distance to the closest city) will be calculated using mapping software. Average hours of math teacher training per year will be provided by each school’s personnel department. In order to make accurate comparisons between schools, the same variables will also be measured in all non-CMP schools. All CMP schools and non-CMP schools will then be ranked according to the combination of these variables, and a matched comparison school will be identified for each CMP school. Therefore, data will be gathered from 20 schools in all (10 CMP schools and 10 comparison schools). These school characteristic data will be gathered annually in order to monitor any substantial changes in the variables over time that would limit the appropriateness of the matched pairs. In subsequent years, if a comparison school is no longer comparable to its pair, the pairs will be re-chosen in order to maintain adequate comparisons.
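One reasonable way to operationalize this ranking-and-matching step is a nearest-neighbor match on standardized school characteristics, sketched below in Python. The column names and the Euclidean-distance rule are illustrative assumptions, not MSDE specifications.

```python
# Hedged sketch of matching each CMP school to its most similar non-CMP school.
# Both data frames are assumed to be indexed by unique school IDs.
import pandas as pd

COVARIATES = ["pct_free_reduced_lunch", "pct_minority", "mobility_rate",
              "miles_to_nearest_city", "teacher_training_hours"]

def match_schools(cmp_schools: pd.DataFrame, pool: pd.DataFrame) -> dict:
    """Greedily pair each CMP school with its most similar unused non-CMP school."""
    combined = pd.concat([cmp_schools[COVARIATES], pool[COVARIATES]])
    z = (combined - combined.mean()) / combined.std()            # standardize covariates
    z_cmp, z_pool = z.loc[cmp_schools.index], z.loc[pool.index]

    pairs, available = {}, set(pool.index)
    for school_id, row in z_cmp.iterrows():
        # Euclidean distance in standardized covariate space to each unused comparison school
        dists = ((z_pool.loc[list(available)] - row) ** 2).sum(axis=1)
        best = dists.idxmin()
        pairs[school_id] = best
        available.remove(best)                                    # match without replacement
    return pairs
```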
Design and Procedure
This evaluation will gather yearly data from six cohorts of students (Cohort A through Cohort F) from 2007 until 2014 from 20 different schools (10 experimental schools and 10 comparison schools). For the purposes of this study, experimental cohorts will be identified by their cohort letter and a subscript indicating the CMP-school number (one through ten). Comparison cohorts will be identified by their cohort letter, a subscript for the comparison school number (one through ten), and a prime mark to designate the cohort as a comparison group. For example, all students in fifth grade in 2007, sixth grade in 2008, etc., will be in Cohort A. Cohort A from CMP-school number three will be identified as A3; the matched comparison cohort will be identified as A3'.
In 2007, MSA scores were gathered from MSDE fifth graders (Cohort A), before implementation of CMP in 2008. In 2008, MSA scores will be collected from Cohort A (now sixth graders) and Cohort B (fifth graders). This process of gathering yearly data will continue, and a new cohort will be added every year through 2012, resulting in six cohorts (A through F). Data collection will end in 2014, when Cohort A will be in twelfth grade and Cohort F will be in seventh grade. (See Table 3 for a presentation of this longitudinal design.)
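The cohort-by-year layout summarized in Table 3 can also be generated programmatically; the short sketch below reproduces the schedule under which each cohort enters as fifth graders in successive years and data collection ends in 2014.

```python
# Sketch of the cohort-by-year schedule: six cohorts (A-F) enter the study as
# fifth graders in successive years from 2007 to 2012.
COHORTS = {letter: 2007 + i for i, letter in enumerate("ABCDEF")}  # cohort -> entry year

def grade_in_year(cohort: str, year: int):
    """Return the grade level of a cohort in a given study year, or None if out of range."""
    grade = 5 + (year - COHORTS[cohort])
    return grade if 5 <= grade <= 12 else None

for year in range(2007, 2015):
    print(year, {c: grade_in_year(c, year) for c in COHORTS})
# In 2014 Cohort A is in grade 12 and Cohort F is in grade 7, matching the design above.
```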
Measures
Mathematics Achievement
Proximal achievement. For students in fifth through eighth grade, mathematics achievement will be measured by the math section of the MSA. Raw scores as well as category of proficiency (i.e., basic, proficient, or advanced) will be gathered from all participating students every year of the study. For students who have taken Algebra 1, mathematics achievement will be measured by the Algebra 1 / Data Analysis HSA. Raw scores as well as category of proficiency will be gathered from all participating students after they take the HSA. MSA and HSA scores will be obtained from each school’s student-score database. An anonymous data file containing only student scores and social security numbers (to be used as student IDs for longitudinal data) will be collected from the database.
Distal achievement. In order to examine the long-term effects of the CMP curriculum, SAT-Mathematics scores will be gathered in 2014 from all students in Cohort A. Since most students take the SAT more than once, an average of all math scores will be calculated for each student. SAT scores will also be obtained from each school’s database.
Mathematics improvement. Yearly gain scores will be calculated for every student by subtracting the past year’s MSA score from the current year’s score.
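For illustration, the following sketch computes these yearly gain scores from a long-format score file (one row per student per year); the column names and values are assumed for the example.

```python
import pandas as pd

# Example long-format score file; student IDs, years, and scores are illustrative.
scores = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 2],
    "year":       [2008, 2009, 2010, 2008, 2009, 2010],
    "msa_score":  [398, 410, 421, 372, 388, 401],
})

scores = scores.sort_values(["student_id", "year"])
# Gain = current year's MSA score minus the previous year's score, within each student
scores["gain"] = scores.groupby("student_id")["msa_score"].diff()
print(scores)
```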
Data Analysis
Research Question 1
Research Question 1 was, “Do CMP students perform better in math than students taught through standard curricula?” In order to examine the effects of CMP on mathematics achievement, several statistical comparisons will be made. Within each year of the study (excluding 2007), the existing data will be examined for significant differences between CMP and non-CMP schools. Hierarchical regression analyses will be used to examine the effects of CMP on mathematics achievement; a separate regression will be run for each CMP and comparison-group pair. Predictors will be entered in the following order: MSA score in fifth grade (to control for elementary mathematics achievement), cohort (to control for maturity effects), and school type (i.e., a code for CMP school vs. comparison school). If school type (i.e., CMP vs. comparison) is a significant predictor of concurrent MSA score even after controlling for school context (through the matched-comparison-group procedure), previous mathematics achievement, and grade level (i.e., cohort), the data will provide evidence of CMP’s effectiveness over and above traditional mathematics curricula.
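A minimal sketch of this analysis for a single CMP/comparison pair is shown below, assuming a student-level data frame with columns for current MSA score, fifth-grade MSA score, cohort, and a CMP/comparison indicator; the column names are illustrative.

```python
# Hedged sketch of the hierarchical (blockwise) regression for one school pair.
import statsmodels.formula.api as smf

BLOCKS = [
    "msa_score ~ msa_grade5",                             # step 1: baseline achievement
    "msa_score ~ msa_grade5 + C(cohort)",                 # step 2: + cohort (maturity)
    "msa_score ~ msa_grade5 + C(cohort) + school_type",   # step 3: + CMP vs. comparison
]

def hierarchical_regression(pair_df):
    """Fit the three nested models and report R-squared and its change at each step."""
    results, prev_r2 = [], 0.0
    for formula in BLOCKS:
        fit = smf.ols(formula, data=pair_df).fit()
        results.append((formula, fit.rsquared, fit.rsquared - prev_r2))
        prev_r2 = fit.rsquared
    return results  # the step-3 change in R-squared reflects CMP's effect beyond the controls
```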
In addition, a Chi-square test of independence will be conducted every year to examine the effects of CMP on mathematics proficiency category (i.e., basic, proficient, or advanced).
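The yearly chi-square test could be run as follows, assuming each year’s student records include a school-type indicator and a proficiency category; the column names are assumptions.

```python
# Sketch of the chi-square test of independence between school type and proficiency category.
import pandas as pd
from scipy.stats import chi2_contingency

def proficiency_chi_square(year_df: pd.DataFrame):
    """Test whether proficiency (basic/proficient/advanced) is independent of school type."""
    table = pd.crosstab(year_df["school_type"], year_df["proficiency"])
    chi2, p, dof, expected = chi2_contingency(table)
    return chi2, p, dof
```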
Research Question 2
Research Question 2 was, “Do CMP students gain mathematics ability at a faster rate than students taught through standard curricula?” In order to answer this question, yearly MSA gain scores will be calculated for all participants. These gain scores will be calculated by finding the difference between the student’s previous MSA score and the student’s current MSA score: GAIN = MSA_current – MSA_previous. For each CMP and comparison pair, a hierarchical regression analysis will be conducted on GAIN scores. Predictors will be entered in the same order as previously mentioned (i.e., 5th-grade MSA score, cohort, and school type). If school type is a significant predictor of GAIN, then the analysis will provide evidence of an effect of CMP on mathematics improvement over time beyond traditional curricula.
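Under the same assumed column names as in the Research Question 1 sketch, the final step of this model for one pair might look like the following; it is an illustrative sketch, not a prescribed implementation.

```python
# Sketch of the final-step model for Research Question 2: yearly GAIN regressed on
# fifth-grade MSA score, cohort, and school type for one CMP/comparison pair.
import statsmodels.formula.api as smf

def fit_gain_model(pair_df):
    """OLS of yearly gain on baseline score, cohort, and school type."""
    model = smf.ols("gain ~ msa_grade5 + C(cohort) + school_type", data=pair_df).fit()
    # A significant, positive school_type coefficient (with the CMP school coded 1) would
    # indicate faster yearly mathematics growth in the CMP school of the pair.
    return model
```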