MARSI Results Spring 2013-Spring 2016
Focus Statement
The focus of the QEP is to enhance student reading comprehension, literacy skills, and appreciation of reading by introducing a variety of reading strategies into select credit courses and by choosing reading-focused activities on campus.
Classroom Level Initiative
To improve reading comprehension and vocabulary acquisition by integrating into select credit courses instruction designed to improve students’ ability to select and use appropriate reading and vocabulary strategies.
Institutional Initiative
To foster an environment that is conducive to reading by providing
• Support for instructional intervention
• Programming that emphasizes reading, and
• Training for faculty and staff
Student Learning Outcomes
- Students will demonstrate improvement in the comprehension of academic reading material.
- Students will demonstrate improvement in academic vocabulary knowledge.
- Students will demonstrate an increased level of awareness of reading and vocabulary acquisition strategies.
Institutional Outcomes
- The faculty will incorporate best practices for teaching reading comprehension and vocabulary acquisition into their instructional material.
- The institution will foster an atmosphere conducive to the reading of a variety of genres and types of material, regardless of medium, for both academic and personal achievement.
Assessment
Students will self-report their use of reading and vocabulary strategies using the MARSI (Metacognitive Awareness of Reading Strategies Inventory), an indirect measure, or a similar instrument.
MARSI
The MARSI (Metacognitive Awareness of Reading Strategies Inventory) provides the third element of our triangulated assessment of the QEP.
- It is a self-reported analysis of the behaviors students use to pre-read, read, and review. The strategies are broken down into component parts, and readers rate how often they use each component.
- Aggregated scores indicate which strategies students use more often and which are used less.
- The MARSI, or some form of it, will be administered on the same pre-test/post-test schedule as the Nelson-Denny.
The Meta-cognitive Awareness Reading Strategy Inventory (MARSI; Mokhtari & Reichard, 2002) was developed to assess the type and frequency of reading strategies that students perceive that they use while reading academic materials in English. The MARSI contains 30 items that measure three factors: Global Reading Strategies (13 items), Problem-Solving Strategies (8 items), and Support Reading Strategies (9 items).
The global factor reflects strategies related to the global analysis of text. The problem-solving factor includes repair strategies that are used when text becomes difficult to read. The support factor reflects practical strategies like taking notes and consulting a dictionary. The MARSI was designed for use with individuals or groups with reading ability ranging from 5th grade to college level.
The primary uses of the MARSI include the following: (a) enhancing student awareness, (b) planning instruction, and (c) clinical or classroom research. To date, the MARSI has been validated only with a sample of students enrolled in grades 6-12. More details on the psychometric adequacy of the MARSI are provided in a section to follow. (excerpt from Journal of Educational and Developmental Psychology, Vol. 1, No. 1; December 2011)
The MARSI is a self-report, online questionnaire of meta-cognitive knowledge of reading strategies. As detailed above, it contains 30 items, each of which briefly describes a reading situation and the corresponding reading strategy to be applied, such as “I have a purpose in mind when I read” and “I preview the text to see what it’s about before reading it”. Each strategy aligns with one of the three strategy subcategories: Problem-Solving, Global Reading, and Support Reading Strategies. See Appendix A for a copy of the original measure. Participants read the questions on a computer screen and clicked on the rating that best described their use of each strategy. Each participant took approximately 10 to 12 minutes to complete the test. Participants’ responses were scored for each subscale. Participants received a raw score for each item and a mean score for each factor subscale. The mean for each factor subscale was computed by dividing the raw score by the number of questions on that subscale. Mean scores were divided into three pre-determined levels of strategy use based on the criteria outlined by Mokhtari and Reichard (2002) for each factor subscale: high (3.5 and above), medium (2.5 to 3.4), and low (2.4 and below). An overall score was calculated by summing the raw scores across the subscales and dividing by 30. The overall score indicates how often participants use reading strategies when reading academic materials. The sub-score for each strategy subcategory indicates how often participants use Problem-Solving, Global Reading, or Support Reading Strategies during academic reading.
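The scoring arithmetic just described is mechanical, so it can be automated. The following Python sketch is illustrative only: it assumes responses arrive as a dictionary mapping item numbers (1-30) to ratings (1-5), and the item-to-subscale assignment shown follows the commonly published Mokhtari and Reichard (2002) scoring sheet and should be verified against the sheet that accompanies the inventory.

# Illustrative sketch of the MARSI scoring described above.
# The item-to-subscale assignment follows the commonly published scoring sheet
# and should be verified against the sheet that accompanies the inventory.
SUBSCALES = {
    "Global Reading":  [1, 3, 4, 7, 10, 14, 17, 19, 22, 23, 25, 26, 29],  # 13 items
    "Problem-Solving": [8, 11, 13, 16, 18, 21, 27, 30],                   # 8 items
    "Support Reading": [2, 5, 6, 9, 12, 15, 20, 24, 28],                  # 9 items
}

def strategy_level(mean_score):
    # Levels defined by Mokhtari and Reichard (2002).
    if mean_score >= 3.5:
        return "high"
    if mean_score >= 2.5:
        return "medium"
    return "low"

def score_marsi(responses):
    # responses: dict mapping item number (1-30) to a rating of 1-5.
    results = {}
    for name, items in SUBSCALES.items():
        raw = sum(responses[i] for i in items)       # raw subscale score
        mean = raw / len(items)                      # mean = raw score / number of items
        results[name] = {"raw": raw, "mean": round(mean, 2), "level": strategy_level(mean)}
    overall = sum(responses.values()) / 30           # overall mean across all 30 items
    results["Overall"] = {"mean": round(overall, 2), "level": strategy_level(overall)}
    return results

# Example: a student who answers 3 on every item falls in the "medium" range throughout.
print(score_marsi({item: 3 for item in range(1, 31)}))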
Scoring
Scoring the inventory is quite easy and can be done by the students themselves. Students simply transfer the scores obtained for each strategy to the scoring sheet, which accompanies the inventory. After the individual scores are recorded, they should be added up in each column to obtain a total score, then divided by the number of items to get an average response for the entire inventory as well as for each strategy subscale (i.e., Global, Problem-Solving, and Support strategies). These scores can then be interpreted using the interpretation guidelines provided.
(excerpt from Journal of Educational Psychology, 2002, Vol. 94, No. 2, 249–259)
Interpretation
As a general rule, the overall score averages indicate how often students use all the strategies in the inventory when reading academic materials. The averages for each subscale in the inventory show which group of strategies (i.e., Global, Problem-Solving, and Support Strategies) students use most or least when reading. This information enables them to tell if they score very high or very low in any of these strategy groups. A low score on any of the subscales or parts of the inventory indicates that there may be some strategies in these parts that they might want to learn about and consider using when reading. Note, however, that the best possible use of these strategies will ultimately depend, to a great extent, on the students’ age, their reading ability, text difficulty, type of material read, and other related factors.
More details on the psychometric adequacy of the MARSI are provided in the excerpt from Journal of Educational and Developmental Psychology, Vol. 1, No. 1; December 2011.
Timeline for Assessment
The following table represents the general timeline for assessment activities:
Year / Semester / Intervention and Assessment Activities
Pre-QEP (2011-2012) / Fall 2011, Spring 2012, Fall 2012 / Nelson-Denny Reading Test baseline scores; PSLO Alpha Artifact Assessment from all classes and every section
Year 1 (2013) / Spring 2013 / Nelson-Denny trials in selected classes; PSLO Alpha Artifact Assessment from all classes; MARSI in technical capstone courses and in selected academic courses
Year 1 (2013) / Fall 2013 / Nelson-Denny pre-test, intervention, post-test; PSLO Alpha Artifact Assessment from all classes
Year 2 (2014) / Spring 2014 / PSLO Alpha Artifact Assessment from all classes; MARSI in technical capstone courses and in selected academic courses
Year 2 (2014) / Fall 2014 / Nelson-Denny pre-test, intervention, post-test; PSLO Alpha Artifact Assessment from all classes
Year 3 (2015) / Spring 2015 / PSLO Alpha Artifact Assessment from all classes; MARSI in technical capstone courses and in selected academic courses
Year 3 (2015) / Fall 2015 / Nelson-Denny pre-test, intervention, post-test; PSLO Alpha Artifact Assessment from all classes
Year 4 (2016) / Spring 2016 / PSLO Alpha Artifact Assessment from all classes; MARSI in technical capstone courses and in selected academic courses
Year 4 (2016) / Fall 2016 / Nelson-Denny pre-test, intervention, post-test; PSLO Alpha Artifact Assessment from all classes
Year 5 (2017) / Spring 2017 / PSLO Alpha Artifact Assessment from all classes; MARSI in technical capstone courses and in selected academic courses
Year 5 (2017) / Fall 2017 / Nelson-Denny pre-test, intervention, post-test; PSLO Alpha Artifact Assessment from all classes
Metacognitive Awareness of Reading Strategies Inventory
MARSI QUESTIONNAIRE RESPONSE FORM
Print Your Name: / Class: / Date:
Directions: Listed below are statements about what people do when they read academic or school-related materials such as textbooks, library books, etc. *Statements in this report are edited to protect survey integrity*
Five numbers follow each statement (1, 2, 3, 4, 5), and each number means the following:
1 / "I never or almost never do this"
2 / "I do this only occasionally"
3 / "I sometimes do this" (about 50% of the time)
4 / "I usually do this"
5 / "I always or almost always do this"
After reading each statement, write the number (i.e., 1, 2, 3, 4, or 5) that applies to you.
Please note that there are no right or wrong answers to the statements in this inventory.
Item / Statement / Insert Number
1 / …I have a purpose… / (<- 1, 2, 3, 4, or 5)
2 / …I take notes… / (<- 1, 2, 3, 4, or 5)
3 / …I think… / (<- 1, 2, 3, 4, or 5)
4 / …I preview… / (<- 1, 2, 3, 4, or 5)
5 / …I read aloud… / (<- 1, 2, 3, 4, or 5)
6 / …I summarize… / (<- 1, 2, 3, 4, or 5)
7 / …I think… / (<- 1, 2, 3, 4, or 5)
8 / …I read… / (<- 1, 2, 3, 4, or 5)
9 / …I discuss… / (<- 1, 2, 3, 4, or 5)
10 / …I skim… / (<- 1, 2, 3, 4, or 5)
11 / …I try to… / (<- 1, 2, 3, 4, or 5)
12 / …I underline or circle… / (<- 1, 2, 3, 4, or 5)
13 / …I adjust… / (<- 1, 2, 3, 4, or 5)
14 / …I decide… / (<- 1, 2, 3, 4, or 5)
15 / …I use… / (<- 1, 2, 3, 4, or 5)
16 / …I pay closer attention… / (<- 1, 2, 3, 4, or 5)
17 / …I use… / (<- 1, 2, 3, 4, or 5)
18 / …I stop… / (<- 1, 2, 3, 4, or 5)
19 / …I use… / (<- 1, 2, 3, 4, or 5)
20 / …I paraphrase… / (<- 1, 2, 3, 4, or 5)
21 / …I try to… / (<- 1, 2, 3, 4, or 5)
22 / …I use… / (<- 1, 2, 3, 4, or 5)
23 / …I critically… / (<- 1, 2, 3, 4, or 5)
24 / …I go back and… / (<- 1, 2, 3, 4, or 5)
25 / …I check… / (<- 1, 2, 3, 4, or 5)
26 / …I try to… / (<- 1, 2, 3, 4, or 5)
27 / …I re-read to… / (<- 1, 2, 3, 4, or 5)
28 / …I ask myself… / (<- 1, 2, 3, 4, or 5)
29 / …I check to see… / (<- 1, 2, 3, 4, or 5)
30 / …I try to guess… / (<- 1, 2, 3, 4, or 5)
Fall 2012 -- Experimental Cohort
Group Results (All Students)
Factor / Score / Percentage
Global Factor / 49 / 75%
Support Factor / 32 / 72%
Problem-Solving Factor / 32 / 79%
Total / 113 / 75%
Count of Students / 16
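The percentage column in these group-results tables is consistent with each aggregate score being expressed as a share of the maximum possible points for the subscale (5 points per item: 65 for Global, 45 for Support, 40 for Problem-Solving, 150 overall). The short Python sketch below reproduces the Fall 2012 figures under that assumption; the maximum-score values are an inference, since the report does not state how the percentages were computed, and the remaining 1% differences suggest the scores shown are rounded group means.

# Hypothetical reconstruction of the percentage column: group score divided by
# the maximum possible subscale score (5 points per item). The report lists
# 75%, 72%, 79%, and 75%; the small differences suggest rounded group means.
MAX_SCORE = {"Global Factor": 13 * 5, "Support Factor": 9 * 5,
             "Problem-Solving Factor": 8 * 5, "Total": 30 * 5}
fall_2012 = {"Global Factor": 49, "Support Factor": 32,
             "Problem-Solving Factor": 32, "Total": 113}

for factor, score in fall_2012.items():
    print(f"{factor}: {score} -> {100 * score / MAX_SCORE[factor]:.0f}%")  # 75%, 71%, 80%, 75%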
Spring 2013 - Cohort with No Experience with Reading Strategies
Group Results (All Students)
Factor / Score / Percentage
Global Factor / 45 / 70%
Support Factor / 30 / 67%
Problem-Solving Factor / 31 / 78%
Total / 107 / 71%
Count of Students / 52
Fall 2013 - Cohort (Merged with Spring 2013)
Group Results (All Students)
Factor / Score / Percentage
Global Factor / 45 / 70%
Support Factor / 30 / 68%
Problem-Solving Factor / 31 / 77%
Total / 106 / 71%
Count of Students / 83
Spring 2013
Began with sophomores, an experimental group with faculty chosen to implement the strategies.
This group will not have experienced the reading strategies and will reflect how the College’s general program of education imparts reading comprehension and vocabulary acquisition.
Spring / Fall 2013 - 1st Cohort Selected for Implementation of Reading Strategies (BASELINE)
Group Results (All Students)
Factor / Score / Percentage
Global Factor / 45 / 70%
Support Factor / 30 / 67%
Problem-Solving Factor / 31 / 77%
Total / 106 / 71%
Count of Students / 135
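The baseline student count (135) equals the Spring 2013 count (52) plus the Fall 2013 count (83), which suggests that the baseline row is a student-count-weighted combination of those two groups. The sketch below reproduces the baseline figures under that assumption; the weighting itself is an inference and is not stated in the report.

# Hypothetical reconstruction of the merged 2013 baseline as a
# student-count-weighted mean of the Spring 2013 and Fall 2013 group scores.
spring_2013 = {"n": 52, "Global": 45, "Support": 30, "Problem-Solving": 31, "Total": 107}
fall_2013   = {"n": 83, "Global": 45, "Support": 30, "Problem-Solving": 31, "Total": 106}

n_total = spring_2013["n"] + fall_2013["n"]              # 135 students, as reported
for factor in ("Global", "Support", "Problem-Solving", "Total"):
    pooled = (spring_2013[factor] * spring_2013["n"]
              + fall_2013[factor] * fall_2013["n"]) / n_total
    print(f"{factor}: {pooled:.0f}")                     # 45, 30, 31, 106 -- matches the baseline row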
Spring 2014 - 1st Cohort for Comparison
Group Results (All Students)
Factor / Score / Percentage
Global Factor / 44 / 68%
Support Factor / 30 / 66%
Problem-Solving Factor / 31 / 78%
Total / 105 / 70%
Count of Students / 152
Spring 2014
Beginning in 2014, students might be exposed to faculty who, although not part of the experimental intervention, might have adopted and used one or more intervention strategies.
If a significant rise in reading strategy use is demonstrated between 2013 and 2014, it is assumed to be due to increased teaching of strategies by the faculty at large.
Spring 2015 – 1st Cohort for Comparison
Group Results (All Students)
Factor / Score / Percentage
Global Factor / 45 / 69%
Support Factor / 31 / 69%
Problem-Solving Factor / 32 / 79%
Total / 108 / 72%
Count of Students / 134
Spring 2015
First group to contain members who were in the 2013 experimental group
Data gathering is needed to ensure that members of the 2013 experimental group are identified
The group should be subdivided into those who were members of the 2013 cohort and those who were not
There is no strong expectation of a true progression of freshmen from one year to the next; another expectation is to find freshman students in sophomore-level courses
Spring 2016 - 1st Cohort for Comparison
Group Results (All Students)
Factor / Score / Percentage
Global Factor / 46 / 70%
Support Factor / 29 / 65%
Problem-Solving Factor / 30 / 76%
Total / 105 / 70%
Count of Students / 154
Spring 2016
The results for Spring 2016 reflect a 1% increase in the Global factor, but a 4% decrease in the Support factor and a 3% decrease in the Problem-Solving factor.
Comparing the results of Spring 2013, Spring 2014, Spring 2015, and Spring 2016, we can determine whether succeeding cohorts of students are showing a trend; an upward trend in scores is expected if the QEP is successful. As of Spring 2016, we find that the Global factor results have consistently increased each year and now match the baseline results. However, the Problem-Solving factor and Support factor results are inconsistent from year to year and have not met the baseline results. We can assume that the use of reading strategies will rise only if the teaching of these strategies is adopted and used by faculty at large.
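A simple way to check the year-over-year trend described above is to line up the spring cohort percentages and compute each factor's change from the 2013 baseline. The sketch below does this with the figures reported in the tables above; it is illustrative only and uses the merged Spring/Fall 2013 cohort as the baseline.

# Year-over-year comparison of the spring MARSI group results (percentages
# taken from the tables above; the baseline is the merged Spring/Fall 2013 cohort).
cohorts = {
    "2013 baseline": {"Global": 70, "Support": 67, "Problem-Solving": 77},
    "Spring 2014":   {"Global": 68, "Support": 66, "Problem-Solving": 78},
    "Spring 2015":   {"Global": 69, "Support": 69, "Problem-Solving": 79},
    "Spring 2016":   {"Global": 70, "Support": 65, "Problem-Solving": 76},
}

baseline = cohorts["2013 baseline"]
for year, factors in cohorts.items():
    changes = ", ".join(f"{name} {pct - baseline[name]:+d}%" for name, pct in factors.items())
    print(f"{year}: {changes}")
# Spring 2016 vs. baseline: Global +0%, Support -2%, Problem-Solving -1%.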