Running Head: EXPLORING STUDENT RESPONSE TECHNOLOGY: iRESPOND

Evaluation Plan

Exploring iRespond Utilization at Hightower Trail Middle School

Carlene Bailey

MEDT 8480

University of West Georgia

Introduction

Clickers have become a fact of life in Cobb County, Georgia, classrooms. These tools offer students immediate feedback and can reduce the time teachers spend grading papers. Despite these obvious benefits, questions remain as to whether incorporating this technology enhances student learning or simply adds another gadget to an already cluttered learning environment.

The “clicker” currently being utilized is the iRespond-Lite. Information is available at http://www.irespond.com. The tool allows students to respond to the following question types: multiple choice, true/false, multiple response, yes/no, content item, numeric fill-in-the-blank (using the numeric keypad), and survey, with up to five answer choices available for multiple-choice and multiple-select questions.

The benefits of incorporating this technology that proponents often mention are numerous and promising. A review of the available literature on this topic yields several conclusions. Results in the literature varied: some studies emphasized the enjoyment students and teachers derive from using personal response systems, and Kenwright (2009) explained that students were more likely to respond and participate when this personal response tool was used. Others focused on how these systems improved attendance by making learning fun and engaging (Kolikant, Drane, & Calkins, 2010). One study differed in suggesting that users of the system should not expect instantaneous results and that change happens only over a period of time (Kolikant et al., 2010).

This evaluation will explore the implementation and effectiveness of this technology in the middle school classroom, specifically at Hightower Trail Middle School in Cobb County, Georgia.

Hightower Trail Middle School is a large suburban middle school in Marietta, Georgia, with a strong focus on excellence.

The evaluation client of this process will be Laura Montgomery. Laura is the assistant principal of Hightower Trail Middle School (HTMS). She has served in this role for five years and was an instructor at the same school for many years prior. She is a strong advocate for improving the learning environment through both student and teacher success.

The mission of this evaluation is first to explore iRespond as a tool to enhance the learning experience and, hopefully, as a result, to better understand what it takes to effectively incorporate other new technologies in the classroom.

Purpose

The purpose of this evaluation is to determine the benefits and effectiveness of the iRespond system and its implementation as it is being utilized at HTMS. The final conclusions of this research will provide insight and direction for future technology implementations and offer suggestions for teachers and administrators who do not fully understand the possible benefits or drawbacks of this student response technology.

Does the iRespond tool promise more than it delivers, or is the success of a student response tool such as this largely dependent on teacher acceptance and eager utilization? Students are often quoted as enjoying and looking forward to classes in which the system is used.

Research is vital to explore the effective use of any technology. This evaluation project will be helpful in guiding and focusing the use of the iRespond tool as we explore the link between teacher training, technology, and, ultimately, improved student outcomes. As Levin and Hansen (2008) explained, “Some training may be necessary… Instructors should discuss how the course technology is relevant, or useful, to the students. The more likely students view the technology to be useful or relevant, the better the students will perform in the course” (p. 7).

Students and teachers who are thinking about using any personal response system can benefit from the information derived from this evaluation. School systems at large engaged in making decisions about whether or not to purchase systems such as this would also benefit from this information. Technology may seem appealing and effective at first glance, but without careful analysis and evaluation, schools could stand to sacrifice valuable resources on the latest fad.

Evaluation Questions

How effective was the iRespond system implementation at providing teachers with the preparation and instruction necessary for a successful launch, and in what ways could the implementation have been improved? In what ways does the iRespond system enhance or detract from student learning? What additional uses of the iRespond system have teachers discovered beyond the initial required assessments? Are there differences in the effectiveness and utilization of the iRespond system based on grade level or subject content?

Another area of interest in this activity is the barriers to technology implementation from the point of view of the instructor. Is the instructor resistant to change? Are there fears and hesitancy toward the implementation of any new technology? The best technology in the world is worthless if the person charged with utilizing it believes it has no value.

Methods

The participants in this research study will be the academic instructors at HTMS. This year, all academic instructors received a classroom set of iRespond remotes along with the software and hardware necessary to use the system in the classroom. Some required assessments were expected of them, and all instructors were trained in the system's setup and use. The instructors will be sent a letter (Appendix B) and encouraged to complete an online survey to provide answers to the evaluation questions.

Schedule of Tasks:

Item to be Completed / Date
Evaluation Questions / February 23, 2011
Evaluation Plan (includes the following) / March 7, 2011
    Background Information on Program to be Evaluated
    Evaluation Questions (previously agreed upon)
    Sampling Plan
    Evaluation Instruments
    Data Collection Plan
Completion of Data Collection / March 18, 2011
Draft Report for Evaluation Client to Review / March 30, 2011
Final Evaluation Report / April 11, 2011
Presentation to Evaluation Client and Organization (optional according to agreement with Evaluation Client) / April 11, 2011

Follow-up procedures to ensure more complete data collection will involve sending the email a second time to non-participants, as well as conducting face-to-face interviews with select teachers should adequate representation not be achieved.

Upon final approval of the evaluation findings, the report will be posted on a wiki page, and links to this page will be sent to all participants who provide an email address on the survey instrument.

Evaluation Instruments

The evaluation instrument (Appendix A) can also be viewed online. The instrument is a Google Docs survey form, and all data are stored in an online spreadsheet. The online nature of the documents allows for easy distribution and aggregation of the data.

Questions provide for both quantitative and qualitative information and can be divided into four categories. Some questions seek to establish background information about the participant, such as grade level taught, years of experience, and subject area. A second focus is the effectiveness of the iRespond implementation, while a third is how effective the tool can be in a learning setting. Finally, the fourth focuses on the participant's attitudes toward technology implementations in general.

The structure and order of the questionnaire may seem disorganized at first, but several questions with the same focus are asked in different formats to provide for response validity. Questions requiring a more thoughtful response are scattered throughout the survey so as not to overwhelm the participant. Multiple-choice questions allowing only one response, along with rating questions, should provide hard numeric data to help answer the evaluation questions. Open-ended responses, while more difficult to analyze, should provide anecdotal evidence and serve as a source of helpful ideas.

Data Analysis

This evaluation project will use a mixed approach and collect both quantitative and qualitative data. Validity of the data collected will be supported by surveying a large population relative to the fairly limited scope of the project: all academic instructors at Hightower Trail Middle School received and have used the iRespond system, and the survey data will come from this entire population. Additional validity will be achieved through multiple data sources within the same instrument; key questions are asked in different formats, providing evidence that responses are valid and thoughtful.

Data will be collected in a spreadsheet and imported into a database format allowing for effective filtering and reporting. The overall population results for the key questions regarding the effectiveness of iRespond, teacher preparation, and the teachers' desire to utilize the tool will be tabulated and presented in bar graph format.
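As an illustration, the tabulation and bar graphs described above might be produced with a short script such as the following sketch. It assumes the Google Docs survey spreadsheet has been exported to a CSV file; the file name (irespond_responses.csv) and column names (e.g., implementation_training) are hypothetical placeholders, not part of the actual instrument.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Load the exported survey responses (hypothetical file and column names).
    responses = pd.read_csv("irespond_responses.csv")

    # Count how often each answer choice was selected for a key question.
    training_counts = responses["implementation_training"].value_counts()

    # Present the overall results in bar graph format.
    training_counts.plot(kind="bar", title="Adequacy of iRespond Training")
    plt.ylabel("Number of teachers")
    plt.tight_layout()
    plt.show()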

Relationships between years of service, subject taught, and grade level, on the one hand, and tool utilization, technology effectiveness, and preparation adequacy, on the other, will be analyzed in contingency tables.
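A minimal sketch of how one such contingency table could be built, reusing the hypothetical file and column names from the previous example, is shown below.

    import pandas as pd

    responses = pd.read_csv("irespond_responses.csv")

    # Cross-tabulate years of teaching experience against reported iRespond
    # usage (both column names are hypothetical placeholders).
    utilization_table = pd.crosstab(responses["years_teaching"],
                                    responses["irespond_usage"])
    print(utilization_table)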

Inductive categories may be used to classify anecdotal responses, depending on the quantity of such data. For example, students may respond with many varied comments, yet the ultimate opinion is that the experience was fun. Teacher suggestions for additional iRespond uses could be classified in a similar fashion.

Conclusion

The information from this evaluation will be presented directly to the evaluation client for approval. The approved final copy of the report will be posted on a wiki page with links made available to all participants for review and comment. The evaluation client may wish to share the findings with the technology department at the board office for use in future technology implementations.

Program evaluation is critical to help understand new technologies. This evaluation project will provide insights helpful in the continued use of the iRespond tool as well as in exploring the links between teacher training, technology, and ultimately, student success.

References

Berry, J. (2009). Technology Support in Nursing Education: Clickers in the Classroom. Nursing Education Research, 30(5), 295-298.

Campbell, J., & Mayer, R. E. (2009). Questioning as an Instructional Method: Does it Affect Learning from Lectures? Applied Cognitive Psychology, 23(1), 747-759.

Cole, S., & Kosc, G. (2010). Quit Surfing and Start “Clicking”: One Professor’s Effort to Combat the Problems of Teaching the U.S. Survey in a Large Lecture Hall. The History Teacher, 43(3), 397-410.

Kaufman, R., Guerra, I., & Platt, W. A. (2006). Practical Evaluation for Educators. Thousand Oaks, California: Corwin Press.

Kenwright, K. (2009). Clickers in the Classroom. TechTrends, 53(1), 74-77.

Kolikant, Y. B., Drane, D., & Calkins, S. (2010). “Clickers” as Catalysts for Transformation of Teachers. College Teaching, 58(1), 127-135.

Levin, M. A., & Hansen, J. M. (2008). Clicking to Learn or Learning to Click: A Theoretical and Empirical Investigation. College Student Journal, 42(2), 1-11.

Nagy-Shadman, E., & Desrochers, C. (2008). Student Response Technology: Empirically Grounded or Just a Gimmick? International Journal of Science Education, 30(15), 2023-2066.

Appendix A

iRespond Implementation and Effectiveness Survey

Thank you for taking the time to fill out this brief survey. You may participate in this survey and remain anonymous if you like. I am conducting this research project as part of a course in program evaluation at the University of West Georgia. This survey will be completed only by teachers at Hightower Trail Middle School and will look at the effectiveness of the iRespond student response tool. Information collected will be available for review upon completion of this project, so please take a moment to share your opinions, as your ideas may be helpful to the rest of us. Thanks again, Carlene Bailey, 7th Grade, HTMS

1. What grade level do you teach?

6th grade

7th grade

8th grade

Other

2. What subject area do you teach?

Language Arts

Mathematics

Social Studies

Science

Other

3. Choose the response below that best describes you and your relationship to new technologies.

Technology and I don't get along very well. It takes too long to implement with limited benefits.

I have found some success with incorporating technology, but it is a slow and time consuming process.

I like to use technology and the latest tools in my classroom, but other factors prevent me from utilizing it more.

My use of technology in the classroom seems to be just right at this time.

Other

4. Have you personally set up and used the new iRespond student response system?

Yes, and I have completed the required assessments.

Yes, and I have used the tool for additional learning activities beyond the required assessments.

No, but I was involved with another teacher who set up and used the iRespond.

No, I haven't used the tool at all.

5. Was the iRespond system presented to you with adequate information and training to easily implement it in your classroom?

Yes, training on the iRespond was adequate and classroom set-up was easy.

Yes, the information seemed complete, but set-up and utilization was still challenging.

No, I needed more training and practice prior to using the iRespond.

The training was adequate but the written instructions were confusing.

Other

6. How many years have you been teaching?

1-3

4-10

11-20

21+

7. What could have been done differently to make the implementation of the iRespond system easier and more effective? This is not a required question. Only answer this if you feel the iRespond implementation could have been improved.

8. Did your students enjoy using the iRespond system?

Yes, they were delighted and can't wait to use it again.

Yes, they seemed intrigued but haven't been asking to use it again.

No, they were confused and frustrated with the tool.

No, they did not seem either negative or positive about using iRespond.

Other:

9. Check as many of the following statements as apply to you and your students. You may check as many or as few of these statements as you wish.

My students were engaged and excited to use the iRespond.

I feel that the iRespond can invigorate the learning process by providing immediate feedback to the students and the teacher.

The iRespond is just another technology gadget that will not replace more traditional teaching methods.

Other than taking a quiz, I see no benefit to using iRespond.

I think this tool could be used during traditional lecture classes to engage students' attention and improve their retention of the material discussed.