Annual Report on Assessment of Degree Programs, AY 2007-08

Name of Program: Political Science / College: COAS
Prepared by: Staci Beavers, PSCI Chair (contact)*** / Date: May 30, 2008
Department Chair/Program Coordinator
Email Address: / Extension: x4194
PART A (Required by May 22, 2008 – last day of Spring semester)
1) Please describe the student learning outcomes you focused on for assessment this year, the assessment activities you used to measure student learning in these areas, and the results of your assessments. Please also comment on the significance of your results.
SLO: 6) Demonstrate working knowledge of research methods by applying said methods to critically
analyze political phenomena.
ACTIVITIES:
PSCI 301, our research methods course, is currently rotated between tenure-track faculty Shana Bass and Scott Greenwood. As the instructor for the Spring 2008 section of PSCI 301, Dr. Shana Bass shouldered the great majority of the responsibility for this project. She took the initiative in developing a pre-test/post-test assessment exercise that allowed students to demonstrate their working knowledge of research methods by developing their own research designs, first in the opening week of the semester and again on the Final Exam (instrument attached). She also developed the evaluation rubric that she and Dr. Greenwood used in examining the students' submissions (attached).
Thirty-two students completed the pre-test administration of the exercise, and 34 completed the post-test version. (Note: five students were repeating the class.) Students were alerted to the inclusion of this instrument on the Final Exam via the exam's study guide.
RESULTS:
Most students scored very low on the pre-test version of the exercise. Out of the 21 points needed to score “proficient,” the average overall score was 9.8, short of the 12 points required for “adequate.” The data indicate that some students had some idea of how to conduct research going in (perhaps because half the class were seniors), but many were in dire need of training. Students generally showed limited ability across most Embedded Learning Objectives (attached). While many pre-test students could adequately formulate a hypothesis, correctly identify an appropriate methodology, and formulate survey questions, they had difficulty identifying variables, formulating research questions, sampling, and drawing connections between data and hypothesis confirmation/disconfirmation.
On the post-test at the end of the course, most students scored at least “adequate” on the overall research design. Out of the 21 points needed to score “proficient,” the average overall score was 17.7, a marked improvement over the 9.8 pre-test average and above the midpoint between the “adequate” score of 12 and the “proficient” score of 21. Students showed improvement across all Embedded Learning Objectives, demonstrating the highest levels of proficiency in formulating research questions and hypotheses and in identifying an appropriate research methodology. Overall, the post-test demonstrated that students markedly improved their understanding and mastery of what methodology is and how it should be used in social scientific research. However, the post-test also showed that students continued to struggle with writing survey questions, clearly identifying dependent and independent variables, and drawing connections between data and hypothesis confirmation/disconfirmation.
Significance:
First, while a few students may have entered the course with some idea of how to conduct research in political science, only 5 of the 32 students who completed the pre-test scored at or above the “adequate” level (12), with the highest score being 16. The vast majority of students needed a significant amount of instruction in research methods, including the 5 students in the class who had taken PSCI 301 before. Second, as indicated by the overall difference between the pre-test and post-test averages, we are pleased that through the PSCI 301 course students are learning to design their own research projects. While students still struggle with significant areas of research design, they show a large leap in understanding the overall process of developing and executing a research project. Third, despite increased ability to formulate research questions and hypotheses, students still have trouble identifying independent and dependent variables and drawing connections between data and hypothesis confirmation/disconfirmation. It is in this area that our efforts will be focused in the PSCI 301 course next fall.
2) How did your program utilize any resources provided for assessment this year? Please attach a budget with specifics.
We were provided funds to send Dr. Scott Greenwood to the American Political
Science Association's "Teaching & Learning Conference," where one track specifically
addressed the teaching of research methods. Dr. Greenwood learned of a strategy that Dr. Bass will likely incorporate in next year’s iteration of PSCI 301: students will construct their own survey, to be compared and contrasted later with professional polls so that students can examine the strengths and weaknesses of their own creations.
3) As a result of your assessment findings, what changes at either the course- or program-level are being made and/or proposed in order to improve student learning? Please articulate how your assessment findings suggest the need for any proposed changes.
Course-level Changes:
Students appear to need more practice in identifying variables and understanding the connection between variables, research questions, and hypotheses. Professors Bass and Greenwood plan to discuss new ways to approach and teach this material so students will develop a clearer understanding of independent and dependent variables and their relationship to each other, as well as the relationship between the variables and the research question, hypothesis and data.
Further, on the premise that while students may understand the research tools in theory, they may need more practice applying and using them, new approaches may include individual and/or group in-class exercises, homework assignments, and/or quizzes focused on a 4-part simplified research design: formulating a research question, formulating a hypothesis, identifying the independent and dependent variables in the hypothesis, and then drawing connections between the data findings that would confirm and disconfirm the hypothesis. We could pursue activities for each of these 4 parts independently and then build to exercises in which students do all 4 parts together. We can further assess the utility of our new approach(es) using a quiz, homework assignment, or application question on an exam, asking students to identify independent and dependent variables and also to identify what data would confirm/disconfirm the hypothesis(es).
Further, we plan to revise the survey research project for the class to focus more directly and clearly on the relationship between independent and dependent variables, testing alternative hypotheses, and the broader relationship between variables and research questions and hypotheses. In past versions of this assignment, student teams created, implemented and analyzed their own complete survey, choosing their own topics, research questions, and dependent and independent variables. This project was very creative, labor-intensive, and rewarding for students, but a simpler, more structured assignment will allow us to focus more specifically on research questions, variables, and hypotheses.
In Fall 2008 (a presidential election year) we will attempt to implement one class-wide survey/exit poll analyzing one clear dependent variable (presidential vote choice in 2008) and have student teams each formulate, create a survey question for, and analyze a different hypothesis using a different independent variable to explain presidential vote choice. Each team will present and discuss the results of their hypothesis to the class, and as a class we can discuss the merit of the several competing explanations for the same dependent variable. Students will then write a paper analyzing the results and discussing the strength of the various hypotheses.
PART B Planning for Assessment in 2008-09
(Required by Friday, September 19, 2008;
May be submitted earlier for Expedited Funding Decision)
4) Please identify one or two student learning outcomes that your program will focus on for assessment next year.
Dr. Bass has expressed a willingness to continue her work in assessing student learning in PSCI 301.
She will cover this course again in the fall and will continue the assessment of PSCI students' learning
in this course. Relevant SLO: 6) Demonstrate working knowledge of research methods by applying
said methods to critically analyze political phenomena.
5) What specific assessment activities will you conduct next year in order to measure program student learning in these areas?
Again, we want to continue examining student learning in PSCI 301, building on the existing work of Drs. Bass and Greenwood. As noted above, this will continue with assessments of students’ demonstrations of their working knowledge of research design, likely including a version of the strategy Dr. Greenwood learned of at the Teaching & Learning Conference. Additionally, we can assess the utility of the new approaches and assignment revisions discussed above by using a rubric to evaluate a quiz, homework assignment, or application question on an exam, again asking students to identify independent and dependent variables and to identify what data would confirm/disconfirm the hypothesis(es).
6) What new or additional resources/support might your program need in order to conduct these assessment activities next year? (Please provide specific information regarding your needs and related costs.)
Book purchase: Deardorff, Hamann, and Ishiyama (eds.) (2008). Assessment in Political
Science. (To be available Summer 2008. Estimated cost: $100.)
Photocopying: Multiple "clean" copies of student work must be available to reviewers. We are
also requesting funding for the dissemination of copies of assignments, etc., to students. The current cap for this course is 30 students. (Requested: $75)
Stipend: Dr. Shana Bass will once again be carrying the bulk of the burden of this project.
We are requesting a stipend of $825 for her efforts. (NOTE: Dr. Bass has indicated an interest
in attending an assessment-related conference, should she be able to find a relevant conference
next year. In that event, we may ask that some of these funds be shifted to travel for that purpose.
Dr. Bass is currently investigating this possibility.)

***Staci Beavers served solely as departmental contact for the project. The real work on this project was done by Shana Bass (Spring 2008 instructor for PSCI 301 and primary author of this report) and Scott Greenwood (who rotates this course with Dr. Bass and assisted with instrument development, review of student work, and preparation of the final report). Drs. Elizabeth Matthews and Pamela Stricker also served on the Department's Assessment Committee and contributed to the development of our new SLOs and the assessment project.
