Oklahoma State University – Oklahoma City
Graduate Surveys
Report and Analysis 2006-2011
James B. Anderson
Coordinator of Assessment, Accreditation and Grant Compliance
Office of Institutional Grants and Research
8/8/2011

EXECUTIVE SUMMARY

·  Data collected since 2002.

·  Data collection managed by Institutional Grants and Research and its predecessor agencies, Institutional Research and Academic Effectiveness.

·  Data from the earlier years of the survey is basic; however, it has become more robust in recent years, with breakdowns by major and by subject area.

·  A change in scoring values occurred with the 2009-2010 survey.

·  Graduates are generally happy with their educational experience; however, there was some drop-off in the 2006-2007 and 2007-2008 academic years. This trend reversed substantially in 2008-2009 and was maintained in 2009-2010.

·  Ratings for the use of multi-media generally increased with the 2010-2011 graduates.

·  The wellness center, student parking, and advisement through counseling all saw noticeable drops in satisfaction, while financial aid saw a noticeable increase.

·  The addition of the post-graduate survey by Academic Effectiveness in 2006-2007 is providing additional data to help determine the effectiveness of Oklahoma State University-Oklahoma City.

·  Recommendation is to continue data collection and analysis.

·  Recommendation is to report the findings, as reflected in the 2010-2011 trending, to the appropriate offices so that they may examine potential procedural changes suggested by the data.

BACKGROUND

Upon the creation of the Office of Retention and Assessment, one of the duties assigned to the coordinator was the development of analyses and reports of data gathered via the various assessment tools in use. This role will continue, augmented by the recent merger of Retention and Assessment with Academic Effectiveness and Institutional Effectiveness into the Office of Institutional Grants and Research.

The current director, Anna Royer, held and compiled the graduate survey data but had done little with it, primarily because of time constraints and competing priorities. In conjunction with the organizational change in Academic Affairs commencing in the summer of 2008, she continues to compile the survey data and provides it to the retention and assessment coordinator annually for analysis and reporting. Effective fall 2011, the Retention and Assessment Coordinator is now the Coordinator of Assessment, Accreditation and Grant Compliance within the Office of Institutional Grants and Research. Responsibilities pertaining to the Graduate Survey Report still fall within the purview of this position.

This report pertains to the graduate survey, which is administered to each student as he or she applies for graduation. It is an instrument designed to gather data from students about their experience at OSU-Oklahoma City. Until 2009, the survey was scored on a ranking system. The rankings were as follows:

1.  Poor

2.  Needs Improvement

3.  Good

4.  Excellent

5.  Does Not Apply.

During her compilation, the Director of Academic Effectiveness factored out the ‘5’ answers. For instance, if she reported a question receiving an average score of 3.61, that average is based only on the 1-4 scores.
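For clarity, that calculation can be illustrated with a short script. This is a minimal sketch, not the office's actual procedure; the function name and the sample responses are hypothetical, and it assumes each response is stored as one of the integer codes above (with 5 meaning ‘Does Not Apply’):

    def mean_excluding_na(responses, na_code=5):
        # Drop 'Does Not Apply' (coded 5) before averaging the 1-4 ratings.
        rated = [r for r in responses if r != na_code]
        return sum(rated) / len(rated) if rated else None

    # Example: three 'Good' (3), one 'Excellent' (4), one 'Does Not Apply' (5).
    # The mean is (3 + 3 + 3 + 4) / 4 = 3.25, based only on the 1-4 scores.
    print(mean_excluding_na([3, 3, 3, 4, 5]))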

The delivered data began with the 2002-2003 academic year and progressed through the 2007-2008 academic year. Beginning with the 2006-2007 report, responses were aggregated by division. As a better determination is made over time as to which data are valuable and which are not, the survey will change accordingly. This report outlines trends in the results.

It is important to recognize a change in scoring that occurred with the 2009-2010 data. While the 5-point scale still exists, the values assigned to the scale have changed. The new values are as follows:

1.  Poor

2.  Needs Improvement

3.  Average

4.  Good

5.  Excellent

Because of this change there is what appears to be a significant increase in the mean scores. This largely reflects the rescaling: the ‘5’ value, which up until this point had been ‘Does Not Apply’ and was eliminated from consideration, is now a real top score, so identical levels of satisfaction produce higher means than under the old 1-4 ceiling. The narrative interpretation takes this effect into consideration.
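A small hypothetical example illustrates the effect. Assuming a graduate who would have answered ‘Good’ or ‘Excellent’ under the old labels answers the same way under the new ones, identical opinions now produce a noticeably higher mean:

    # Old scale: 5 = 'Does Not Apply' and was excluded, so 4 was the ceiling.
    OLD = {"Poor": 1, "Needs Improvement": 2, "Good": 3, "Excellent": 4}
    # New scale: 5 is a real top score and 'Average' fills the middle.
    NEW = {"Poor": 1, "Needs Improvement": 2, "Average": 3, "Good": 4, "Excellent": 5}

    answers = ["Good", "Good", "Excellent", "Needs Improvement"]
    old_mean = sum(OLD[a] for a in answers) / len(answers)  # (3+3+4+2)/4 = 3.00
    new_mean = sum(NEW[a] for a in answers) / len(answers)  # (4+4+5+2)/4 = 3.75
    print(old_mean, new_mean)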

RESULTS

It is important to note that, for the purposes of this report, only the mean scores were used for each area of interest. More analysis is possible: for instance, one can look at the percentage of students who rated a particular service ‘3’ or above versus those who rated it ‘2’ or below, which can also be helpful in interpreting the ratings. Later reports have the information separated out by academic division, a breakdown that can be very useful, especially for questions pertaining to a student’s major field of study; such data breakouts are relatively new and will be the subject of later appended reports. For trending purposes, however, a comparison of average results from year to year is quite telling of how Oklahoma State University-Oklahoma City is faring in the eyes of its students. This can help answer a question of high interest to both the institution and its accrediting agency: do the graduates of OSU-Oklahoma City believe that the institution is delivering the educational product it claims to deliver? The trending indicates that, for the most part, it is.
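As an illustration of the threshold approach mentioned above, the following sketch computes the share of respondents at or above ‘3’ versus at or below ‘2’. The ratings list is hypothetical sample data, not actual survey results:

    def satisfaction_split(ratings, threshold=3):
        # Share of respondents rating at or above the threshold vs. below it.
        above = sum(1 for r in ratings if r >= threshold)
        return above / len(ratings), (len(ratings) - above) / len(ratings)

    sample = [4, 3, 3, 2, 4, 1, 3]
    satisfied, unsatisfied = satisfaction_split(sample)
    print(f"{satisfied:.0%} rated 3 or above; {unsatisfied:.0%} rated 2 or below")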

The results were divided into several areas: Major Field of Study, Social Science, Science, Math, Humanities, Distance Ed, Registration Related, and Other Services. For the academic subject areas, the rating questions were the same:

o  Quality of instruction

o  Grading/testing procedures

o  Content of course

o  Use of multi-media

The information presented here will be in graphic and tabular form with some narrative explanation.

Figure 1.1

Major Field of Study

Category                     2006-2007  2007-2008  2008-2009  2009-2010  2010-2011
Quality of Instruction            3.61       3.39       3.63       4.52       4.48
Grading/Testing Procedures        3.48       3.20       3.38       4.32       4.24
Content of Course                 3.60       3.47       3.67       4.45       4.45
Use of Multi-media                3.46       3.38       3.58       4.27       4.30

Some immediate observations from the Major Field of Study ratings (Figure 1.1) include the fact that all ratings are above the 3.00 level, indicating strong general satisfaction with the educational process; the climb above the 4.00 level in 2009-2010 is consistent with this trend. Generally, the lowest track is Use of Multi-media. That rating was 3.46 in 2006-2007, settled at 3.38 in 2007-2008, and remained the lowest rated category through 2009-2010; the distinction passed to Grading/Testing Procedures in 2010-2011. Ratings in all categories dropped in 2007-2008 but increased with the 2008-2009 graduating class. The smallest increase (.18 points) was in Grading/Testing Procedures; all others showed increases ranging from .20 to .24 points.

The increases demonstrate a rising level of student satisfaction in these areas and reverse the downward trend seen with the 2007-2008 class. All of these trends continue through the 2009-2010 graduating class, with each category either improving or maintaining its 2008-2009 level. The 2010-2011 class saw things a little differently: Quality of Instruction and Grading/Testing Procedures both dropped, while the other two categories either maintained or increased their levels. One point of particular interest is Use of Multi-media; the move by Information Services over the past several years to increase technology in the classrooms may be reflected in the upward trend in that rating. Generally, Quality of Instruction rated the highest while Grading/Testing Procedures rated the lowest.

Figure 1.2

Social Science

Category                     2006-2007  2007-2008  2008-2009  2009-2010  2010-2011
Quality of Instruction            3.58       3.42       3.58       4.46       4.33
Grading/Testing Procedures        3.56       3.41       3.59       4.35       4.34
Content of Course                 3.63       3.43       3.64       4.41       4.38
Use of Multi-media                3.57       3.35       3.57       4.21       4.23

Figure 1.2 above examines the ratings for the Social Science courses. The level of satisfaction, as in the other categories, increased in 2008-2009 after a drop in 2007-2008. The smallest increase was .16 points (Quality of Instruction) and the largest was .22 points (Use of Multi-media). The 2008-2009 levels ranged from 3.64 (Content of Course) down to 3.57 (Use of Multi-media), and the 2009-2010 results maintained these relative levels. Use of Multi-media was again the lowest ranked of the four areas measured; however, it needs to be noted that all areas sat solidly in the satisfied category, with Quality of Instruction and Content of Course rated highest. The 2010-2011 graduating class saw things a little differently, while still rating each area solidly in the ‘good’ category. A 2.9% drop was recorded in Quality of Instruction (4.46 to 4.33), and less substantial drops were recorded in Grading/Testing Procedures and Content of Course. Use of Multi-media, however, recorded a rise similar to that seen in the Major Field of Study area; the multi-media rating has enjoyed a steady rise over the years.
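The percentage changes quoted in this narrative appear, for the most part, to be relative changes against the prior year's mean. A minimal sketch of that calculation, using the Quality of Instruction figures above:

    def percent_change(old, new):
        # Relative year-over-year change in a mean rating.
        return (new - old) / old * 100

    # Quality of Instruction, 2009-2010 (4.46) to 2010-2011 (4.33): about -2.9%.
    print(round(percent_change(4.46, 4.33), 1))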

Figure 1.3

Science

Category                     2006-2007  2007-2008  2008-2009  2009-2010  2010-2011
Quality of Instruction            3.25       3.20       3.39       4.04       4.19
Grading/Testing Procedures        3.24       3.21       3.32       4.12       4.14
Content of Course                 3.34       3.33       3.46       4.26       4.34
Use of Multi-media                3.21       3.26       3.40       3.96       4.12

The science ratings (Figure 1.3) present some interesting and, to this point, unique trending. The trend of increasing satisfaction over previous years continued through the 2008-2009 graduating class; even though the increases for Science were not quite as substantial as in the other subject areas, they still bear mentioning and acknowledgment. The 2009-2010 academic year represents a relative drop in ratings across the board: while all areas except Use of Multi-media remained at or above the ‘good’ (4.00) level, the degree of improvement seen in 2008-2009 was not evident. At best, each area maintained its relative 2008-2009 level, Content of Course coming closest to that point; some dropped in relative terms, Use of Multi-media being the most evident example. The 2010-2011 class again brought unique trending compared with the previous categories: there was a rise in all four areas for Science. The largest gain, as with Social Science and Major Field of Study, was in Use of Multi-media; here, however, it was a 4.0% increase over the previous year.

Figure 1.4

Math

Category                     2006-2007  2007-2008  2008-2009  2009-2010  2010-2011
Quality of Instruction            3.33       3.26       3.40       4.15       4.10
Grading/Testing Procedures        3.41       3.23       3.51       4.11       4.09
Content of Course                 3.47       3.28       3.54       4.14       4.09
Use of Multi-media                3.41       3.18       3.39       4.13       3.95

Math trending (Figure 1.4 above) reverts more closely to the Social Science pattern. As with the other subject areas, Math enjoyed a strong rebound in satisfaction with the 2008-2009 graduates; in fact, Grading/Testing Procedures and Content of Course received their highest ratings (3.51 and 3.54 respectively) since 2002. Use of Multi-media increased to 3.39 (very close to its highest rating of 3.41 in 2006-2007) and Quality of Instruction increased to 3.40. The 2009-2010 results indicate perhaps a small relative drop-off; however, the ratings for all areas remain good, in a tight range from 4.11 (Grading/Testing Procedures) to 4.15 (Quality of Instruction). The 2010-2011 graduating class continued the slight downward trend, with drops ranging from 3.6% (Use of Multi-media) to .4% (Grading/Testing Procedures). Math represents the first downward trend in Use of Multi-media, and it is the first area in the comparisons so far to show a drop in all four measured categories.

Figure 1.5

Humanities

Category                     2006-2007  2007-2008  2008-2009  2009-2010  2010-2011
Quality of Instruction            3.51       3.45       3.60       4.35       4.29
Grading/Testing Procedures        3.50       3.43       3.62       4.40       4.19
Content of Course                 3.50       3.44       3.61       4.43       4.34
Use of Multi-media                3.40       3.39       3.62       4.29       4.24

The Humanities (Figure 1.5 above) continued the trend seen in other areas of bottoming out in 2007-2008. As in the other subject areas, an increase in graduate satisfaction appears in 2008-2009; all of the areas increased, with Use of Multi-media leading the way at 3.62 (a 6.7% improvement). The range among the four measured areas has remained relatively tight for the entire period of measurement, and this continued in 2009-2010, with all areas held in high regard by the graduates. However, relatively substantial drops occurred with the 2010-2011 cohort. All four areas recorded drops, with Grading/Testing Procedures falling the farthest, from 4.40 to 4.19 (-4.7%). Use of Multi-media again continued its relatively positive trend, recording the smallest drop (4.29 to 4.24, or -1.1%).

It remains to be seen whether the nearly across-the-board drop in satisfaction among Math, Humanities, and Social Science is a trend or an aberration. History suggests that aberrational results are possible, as seen in 2007-2008; however, only the 2011-2012 results can confirm or refute this line of thinking. If the downward trend continues, a deeper examination of potential causes is warranted.

Figure 2.1

Distance Ed