8480 Program Evaluation

Carla Gregory

Fall 2010

Program Evaluation of iRespond Devices


Executive Summary

Many schools are using Student Response Systems (SRS) in classrooms. “The SRS provides a pedagogical tool that furnishes on the spot information from student responses, enabling the teacher to evaluate the accuracy of the response and have the opportunity to immediately correct misunderstandings and reinforce conceptual knowledge” (Deleo, 2009). This year, funds were used to purchase SRS devices. The particular brand bought for my school is iRespond, manufactured by RM Education. My principal wanted a way to evaluate the usefulness of the systems in increasing lifelong learning, student achievement, motivation, engagement, and attendance. Therefore, I used several different methods to evaluate the use of iRespond at my school to see if there is value in using the SRS in the classroom. The data collected determined that teachers needed more training and more support from technology services to make sure the iRespond units were ready to be used.

The evaluation addressed the following questions: In what ways do iRespond devices enable students to better self-reflect and learn from mistakes? How do iRespond units keep students actively engaged in learning? Does using iRespond units increase student attendance on days when they are used? Do students’ test and quiz grades improve after using iRespond units in the classroom?

This report presents findings from five teachers who taught one or more of the following subjects in grades six through eight: social studies, science, or language arts. Five different methods were used to gather data: an open-ended survey related to evaluation question one, a Likert-scale survey providing data for evaluation questions one and two, an observation checklist providing data for evaluation questions one, two, and four, attendance information providing data for evaluation question three, and common assessment data addressing evaluation question four.

Findings suggest that iRespond units enable students to self-reflect; however, the level of self-reflection differed based on how the unit was used in the classroom. The units did keep students actively engaged, but there was not enough data to conclude they were better than other tools used in the classroom. There was also not enough data to show that iRespond improved grades, probably due to the different ability levels of the students. Finally, attendance did not increase significantly on days the units were used.

Recommendations are for further training to occur to encourage more teachers to use the iRespond units. However, due to significant technology glitches, the units should not yet be used on benchmark tests. There were many limitations to this study, including a lack of data from all subjects and grades. Limitations also included teachers not disclosing when they would use the iRespond devices, which possibly affected the outcome for attendance data. A final limitation was the ability level of the students: some classes had gifted students while others had students with limited English proficiency, which limits the reliability of test score averages on common assessments.

Program Evaluation of iRespond Devices

Introduction

Many schools are using Student Response Systems (SRS) in classrooms. These SRS units are a “combination of hardware and software designed to provide the instructor immediate feedback from students. Each student gets a handheld instrument that emits a signal to the base station. The SRS provides a pedagogical tool that furnishes on the spot information from student responses, enabling the teacher to evaluate the accuracy of the response and have the opportunity to immediately correct misunderstandings and reinforce conceptual knowledge” (Deleo, 2009). “Teachers are able to direct students’ attention to key concepts through the use of student response systems questions in class. The instant feedback also allows teachers to evaluate whether students understand material and then to tailor lectures toward concepts with which students seem to be struggling” (Liu, 2010).

“A growing body of literature on the use of SRS proclaims the benefits of greater participation and increased emotional engagement” (Stowell, 2010). “Increased engagement predicts improved achievement and is a hallmark of a productive classroom” (Blood, 2008). Studies show that students “demonstrated higher scores on weekly quizzes when the student response system was in use” (Blood, 2008). Studies further show that students who usually refrain from answering questions in a classroom environment are more willing to answer using an SRS because their answers are anonymous (Stowell, 2010). Students found that the advantages of SRS included instant feedback and that it helped them stay focused and engaged (Koenig, 2010). They also felt that it “increased their confidence,” since immediate feedback was provided during class from formative questions they answered using the SRS before taking the actual test or quiz (Blood, 2008). Teachers found that it created “a positive change in their instruction with increased focus on student-centered learning” as well as “a dramatic increase in student attendance” (Koenig, 2010). Teachers also found classes “more enjoyable” because interaction “is now possible with the entire class” and not just the outgoing students (Koenig, 2010).

The organization evaluated for my study was Smitha Middle School, which includes grades 6-8. Eighty-seven percent of the population served is African American or Hispanic, and over 85% of our students receive free or reduced-price lunch. Because of this, we are a Title I school and receive Title I funds. This is the first year that we are not on the state needs improvement list, which means that sixth grade parents in our district no longer had the choice to send their children to another school. It also meant that students in 7th and 8th grade who previously went to another school could continue going there, but would no longer have transportation provided for them. Therefore, our total population has grown from 820 students to 903.

I, Carla Gregory, was the evaluator. The evaluation client was my principal, Sharon Tucker. The stakeholders were five teachers and the 440 students in their classrooms that used iRespond units.

Purpose

Our county receives Special Purpose Local Option Sales Tax (SPLOST) funds. This year the money was used to purchase student response system units for all middle schools. The particular brand bought for my school was iRespond, manufactured by RM Education. My principal wanted a way to evaluate the usefulness of the systems in increasing lifelong learning, student achievement, motivation, engagement, and attendance. Therefore, I evaluated the use of iRespond at my school to see if there is value in using the SRS in the classroom. Because teachers had not yet received all trainings provided by RM Education, this was a formative evaluation of the current use. The evaluation determined whether the iRespond units would be used on benchmark tests this year, and how many more trainings teachers would need to fully implement the devices.

Evaluation Objectives and Questions

The objectives of this evaluation were to determine if the iRespond units were increasing lifelong learning (self-reflection), student achievement (common assessments), attendance, and engagement. This information would then be used to determine if more training on iRespond units would benefit teachers and whether the data supports teachers using them on benchmark tests. Therefore, the evaluation addressed the following evaluation questions:

  1. In what ways do iRespond devices enable students to better self-reflect and learn from mistakes?
  2. How do iRespond units keep students actively engaged in learning?
  3. Does using iRespond units increase student attendance on days when they are used?
  4. Do students’ test and quiz grades improve after using iRespond units in the classroom?

Methods

Participants

There were five participants in the evaluation, all of whom used iRespond units on a regular basis in their classrooms. One teacher taught social studies to students with limited English proficiency (ESOL) in all grades, another taught 6th grade science, one taught 8th grade science, one taught 8th grade social studies, and another taught 6th grade language arts.

Design and Procedures

Strategy to evaluate objective 1: Three types of data provided information answering this objective. First, an open-ended questionnaire with two questions was given to teachers; it served as a formative evaluation to help determine whether the iRespond units would be used on benchmark tests and how they enabled students to self-reflect. This information was analyzed using a fidelity matrix (see Appendix p. 17). The questionnaire responses were placed into appropriate categories to determine if students are learning lifelong skills that relate to self-reflection and learning from mistakes (situational fidelity), how the program was executed and how this played a role in their ability to self-reflect (executional fidelity), and finally, whether support aspects influenced the data (behavioral fidelity). Second, data was collected from the observation checklist: responses to questions four and five were coded to determine the mode, or most frequent answer. Finally, data was collected from the Likert-scale survey using responses to questions three, four, and six to determine the mode.

Strategies to evaluate objective 2: Two types of data were collected to determine the accomplishment of this objective. First, I observed teachers and students using the iRespond units in classrooms. During my observation, I found the percentage of students who were off task to provide quantitative data using question one from the checklist, and I found the mode for questions three and four. Second, teachers were given a Likert-scale survey with questions related to this objective; I found the mode for questions one, two, five, and seven to determine the most frequent response. The data collected for this objective was formative and helped determine how to better engage students in learning.

Strategies to evaluate objective 3: Averages were obtained that showed student attendance on days when iRespond was used in comparison with days when it was not used.

Strategies to evaluate objective 4: Data collected determined averages that compared common assessment test scores of students who used iRespond units versus those who did not, across various subjects and grades. Data was also taken from question seven of the observation checklist, from which I calculated the average number of students who received a score of 80 percent or higher.

Instruments (Note: all instruments are found in the appendix)

Two-question open-ended survey: This instrument asked teachers to reflect on whether students are able to complete self-reflections and learn from mistakes, as observed when using iRespond during lectures and tests. It also had them comment on ease of use, support received, and comments students made about using iRespond.

Likert survey: There are seven questions which teachers answered using a Likert scale. Questions three, four, and six of this instrument helped determine if iRespond units were enabling students to self-reflect and learn from mistakes (evaluation question one). Questions one, two, five, and seven helped determine if students are more engaged in classroom learning and activities when iRespond units are used (evaluation question two).

Observation Checklist: This instrument was used during observations in the classroom to determine if students were actively engaged (questions one, two, three, and six on the instrument) and were learning from any misconceptions they made (questions four and five on the instrument). It also focused on the scores made by students (question seven on the instrument).

Attendance Data Collection Instrument: This instrument was completed by teachers to determine if attendance was affected (positively or negatively) when iRespond units were used in class. Teachers provided the number of students present in each class, with a total for each class, identifying the days and classes in which they used iRespond.

Common Assessment Data Collection: These instruments were used to collect test data from common assessments created by teachers. Common assessments are used by all teachers in a particular subject and grade. Some teachers used the iRespond unit for students to answer questions on the assessment while others did not. Test scores were compared to determine whether using iRespond increased test scores.

Table 1: Instruments used in the iRespond Evaluation

Evaluation Question 1 (students' ability to self-reflect and learn from mistakes): Open-Ended Survey, Likert-Scale Survey, Observation Checklist
Evaluation Question 2 (ability of iRespond to keep students engaged): Likert-Scale Survey, Observation Checklist
Evaluation Question 3 (ability of iRespond to increase student attendance): Attendance Data Collection
Evaluation Question 4 (increase of student test scores): Observation Checklist, Common Assessment Data

Summary of Key Findings: (More detailed data can be found in the appendix on pages 19-22)

Evaluation Question 1: In what ways do iRespond devices enable students to better self-reflect and learn from mistakes?

Key Findings: Based on the data collected, the evaluation showed a high level of situational fidelity; unfortunately, the behavioral and executional fidelity levels were low. Teachers reported many technical glitches, such as rosters not showing up or remotes not working, that kept them from using the remotes effectively when they planned to. They also responded that they had not received enough training or support to use the devices effectively. Based on the mode for questions three, four, and six on the Likert-scale survey, the iRespond devices do allow students to self-reflect, since they provide immediate feedback. They also allow the teacher to reteach misconceptions immediately, which lets students further reflect and learn from their mistakes. However, there was not enough data to support that students self-reflect better when iRespond is used than when it is not. Based on the mode for questions four and five on the observation checklist, teachers were also providing feedback that allowed students to self-reflect, and students were able to state what they did wrong and what the correct answer should be.

Evaluation Question 2: How do iRespond units keep students actively engaged in learning?

Key Findings: Based on my observations as well as teacher responses on the Likert survey, over 80 percent of the students were actively engaged and on task. Most teachers also saw a difference in engagement between paper-and-pencil tests (less engagement) and iRespond (more engagement). When I went back to the teachers and asked why they thought this was true, they felt it was because students were getting to use technology and it was something new and interesting to them. Students were responding quickly to questions and still getting them correct, so most teachers did not have to move to the next question and leave students behind. When questions were embedded in a PowerPoint presentation, students asked more questions and added to their knowledge base.

Evaluation Question 3: Does using iRespond units increase student attendance on days when they are used?

Key Findings: Once the student attendance data was averaged, no significant difference was found between days when students used iRespond and days when it was not used. (I chose not to include the specific data in this report since no significant differences were found.)

Evaluation Question 4: Do students’ tests and quiz grades improve after using iRespond units in the classroom?

Key Findings: Three of the five classes that used iRespond had higher averages in comparison with a class that took the same common assessment. However, the overall average for all classes using iRespond versus those that did not was lower. Also, during my observations, three of the five classes using iRespond did not have all students scoring 80 percent or higher.

Recommendations and Conclusions:

Based on the data, the iRespond units do have the ability to encourage students to self-reflect by providing immediate feedback. However, there are three ways for teachers to use the iRespond units:

  1. Teacher-paced, which allows for more feedback: the teacher controls when students can go to the next question, and once a question is answered, students learn whether they answered correctly and, if not, what the answer is before they can move on.
  2. Student-paced, which provides a score to students; they have the ability to change answers without knowing which questions are wrong.
  3. Power Presenter, which allows the teacher to create questions in a PowerPoint that can be used while introducing or reviewing a concept, making the lesson more interactive since questions can appear throughout the presentation and not just at the end or beginning.

Because there are so many different ways to use the iRespond devices, this lends itself to different levels of student self-reflection.