University of South Carolina Columbia

Act 629 - Summary Reports on Institutional Effectiveness

Fiscal Year 2001 - 2002

Introduction

This report includes: Assessment of General Education, Majors/Concentrations, and Policies and Procedures to Ensure that Academic Programs Support the Economic Development Needs in the State by Providing a Technologically Skilled Workforce.

A reporting schedule for USC Columbia and Regional Campuses institutional effectiveness components can be found at

Assessment of General Education

Philosophy

Two basic principles are at the heart of student assessment at USC:

*As much assessment as possible should be performed at the program level (a BA or BS in Psychology is one program, a master's in Psychology is another program, etc.). At a large university, changes are more likely to occur at the program level than institution-wide. Student outcomes can be tailored to the program, rather than developing a "one size fits all" system of student outcomes. However, acceptable levels of quality for general education must be agreed upon at the institutional level, to ensure that all students receive a well-rounded education.

*The Nine Principles of Good Practice for Assessing Student Learning, as developed by the American Association for Higher Education, should be followed in all assessment endeavors. These principles are included in the Assessment Plan Reality Checklist that is distributed during assessment workshops.

Institution-level Assessment of Student Outcomes for General Education

In 1993, the Assessment Advisory Committee developed criteria for assessing general education. Eleven General Education Goals provide the foundation for USC's Criteria for the Assessment of General Education. Three approaches are used in assessing general education at USC: course-embedded assessment, self-report assessment, and general education examinations. Our experience with each of these approaches is described below.

Course-embedded Assessment of General Education

Thanks in part to a FIPSE grant in the early 1990s, 60 faculty members (FIPSE Scholars) attended three four-week summer workshops. This was a system-wide effort to re-conceptualize general education abilities. The Scholars developed modules that embedded assessment, making it an integral part of teaching general education. FIPSE Scholars then presented general education assessment workshops to other faculty members on all eight campuses. The course-embedded project was selected as an exemplary program and was published as Case 36 in Assessment in Practice: Putting Principle to Work on College Campuses (Banta, Lund, Black, and Oblander, 1996). Many of the FIPSE Scholars still use the course-embedded assessment practices learned in the workshops.

Despite continued use of course-embedded assessment, interest in promoting the general education assessment modules diminished when the FIPSE project ended. As a result, little has been done to sustain this extraordinary initiative. In spring 2000, the Director of Assessment recommended to the Provost that a half-time faculty member of high status be assigned to chair the Assessment Advisory Committee and to champion the cause of assessment, especially course-embedded assessment of general education. The Provost approved the request in the summer of 2000. The new Chair of the Assessment Advisory Committee will be chosen in the summer of 2002. The general education assessment modules created during the FIPSE project have been placed on the Institutional Planning and Assessment web server and will be promoted by the new Chair of the Assessment Advisory Committee and, it is hoped, by the FIPSE Scholars. The final report of the FIPSE project is also available online. Currently, 63% of all undergraduate programs use course-embedded assessment to assess writing and/or oral communication. At the institutional level, course-embedded assessment is used to assess writing, oral communication, and proficiency in a foreign language.

General Education Self-Report Assessment

USC administers four institution-wide surveys on a routine basis:

*Cooperative Institutional Research Program (CIRP) - every year to freshmen;

*College Student Experience Questionnaire (CSEQ) - every three years to upperclassmen;

*Senior Survey - a locally developed survey administered every three years to upperclassmen;

*Alumni survey - administered every other year to students who graduated three years before the administration year.

Many questions on the surveys deal with general education experiences. Starting in 1995, supplemental questions were added that specifically addressed general education. Most of these questions were asked every three years to determine trends over time. A committee representing Student Affairs, the Career Center, Housing, and Institutional Planning and Assessment was formed to organize student self-report data. The committee chose which surveys to include, selected keywords, and designed search functions. Institutional Planning and Assessment then created the dynamic web-based system; cataloged responses to every survey question by keyword, general education goal, and faculty committee; and hand-entered all data.

Anyone with a web browser has easy access to all survey data. When a general education goal is selected, the most recent results for all questions regarding that goal are displayed. If a question has been asked more than once, the data can be displayed graphically to show trends over time. Unfortunately, only a select few browsers support the graphics used to display trend data. A staff member has taken a five-day training session to learn how to convert the graphics to a more mainstream format. Only quasi-longitudinal studies can be performed at present; value-added assessment, tracking changes in student responses over time, has been recommended by the committee.
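
A minimal sketch of how such a catalog and lookup might be organized is shown below. The field names, goals, and sample numbers are hypothetical; they only illustrate the keyword/goal indexing and trend retrieval described above, not the actual web system built by Institutional Planning and Assessment.

```python
# Hypothetical sketch of the survey-response catalog described above.
# All surveys, questions, goals, and numbers are invented for illustration.

# Each catalog entry ties one survey question to keywords, a general
# education goal, and the result from each administration year.
catalog = [
    {
        "survey": "Senior Survey",
        "question": "Rate your ability to write effectively.",
        "keywords": {"writing", "communication"},
        "goal": "Effective written communication",
        "results": {1995: 3.1, 1998: 3.3, 2001: 3.4},  # mean on a 4-point scale
    },
    {
        "survey": "Alumni Survey",
        "question": "Rate your preparation for further learning.",
        "keywords": {"lifelong learning"},
        "goal": "Foundation for lifelong learning",
        "results": {1998: 3.0, 2001: 3.2},
    },
]

def questions_for_goal(goal):
    """Return the most recent result for every question tied to a goal."""
    latest = []
    for entry in catalog:
        if entry["goal"] == goal:
            year = max(entry["results"])  # most recent administration
            latest.append((entry["survey"], entry["question"], year, entry["results"][year]))
    return latest

def trend_for_question(question):
    """Return (year, result) pairs so repeated questions can be graphed over time."""
    for entry in catalog:
        if entry["question"] == question:
            return sorted(entry["results"].items())
    return []

if __name__ == "__main__":
    print(questions_for_goal("Effective written communication"))
    print(trend_for_question("Rate your ability to write effectively."))
```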

Last spring the CIRP Follow-up survey was administered. This spring we are in the process of pilot-testing the Your First College Year (YFCY) survey and the National Survey of Student Engagement (NSSE). The committee will decide which one of the three follow-up surveys best suits the needs of the University. The selected survey will be added to the survey schedule for future years, and will be integrated into the Assessment Warehouse.

The current general education reporting system is artificially limited to the Columbia campus. Over 30% of our baccalaureate graduates started at another campus, and therefore received most of their general education elsewhere. Starting in 1999, a question was added to all surveys, asking where the student received most of his or her general education. A meeting with the committee and interested regional campus representatives will be held to determine the best manner to provide this rich source of general education data to each campus.

General Education Examinations

Previous Assessment Advisory Committee members recommended that the ACT College Outcomes Measures Program (COMP) be used to measure seven of the eleven general education goals. A sampling chart devised by the Committee was used to create the pool of examinees. It was difficult to get students to volunteer to take a three-hour exam.

Capstone courses were targeted for COMP administration, since it appears pedagogically sound to include a test of general education in such a course. While the results are not representative of all USC students, they are representative of students in the targeted program area. Students in the program are compared to a national norm, to other students in the program, and to other USC students who took the COMP. HRTA (Hospitality, Retail, and Sport Management), Journalism, Business Administration, and Engineering agreed to have the COMP administered to their students. Nursing has considered using the COMP because their accrediting agency endorses it for assessing problem-solving skills. Engineering was willing to make the COMP a permanent part of its assessment. A 23-question self-report instrument that closely matched the COMP in subject matter was developed and pilot-tested; had its results correlated highly with COMP results, the short survey could have been given in place of the three-hour COMP. In fact, the self-report instrument correlated with the COMP at only -0.03, essentially no relationship.
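
The comparison behind that figure is a standard Pearson correlation between paired student scores on the two instruments. The sketch below shows the calculation with invented scores (the actual student data are not reproduced here); only the formula itself reflects the analysis described above.

```python
# Illustrative only: Pearson correlation between paired scores on the
# 23-question self-report instrument and the COMP. The scores below are
# invented; the reported campus result was r = -0.03.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical paired scores: self-report totals vs. COMP totals.
self_report = [61, 74, 68, 80, 72, 65, 77, 70]
comp_scores = [178, 165, 190, 172, 181, 185, 169, 176]

print(round(pearson_r(self_report, comp_scores), 2))
```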

When a sufficient number of students had taken the test, problems with the COMP started to appear. The test appeared to be culturally biased: foreign engineering students with SAT scores of 1300 were performing quite poorly on the COMP, while American engineering students with SAT scores of 1300 were scoring very high. The COMP seemed to be more a general intelligence test than a test of general education; for example, engineering students were outscoring art students on Using the Arts. The COMP was also difficult to administer. Finally, USC consistently scored below the average on Functioning in Social Institutions. While this is unfortunate, this is not one of our general education goals, and students do not appear to be exposed to this construct in the general education curriculum.

While progress was being made in getting programs with capstone courses to adopt the COMP, there were still only about 25 to 50 students a year taking the COMP. In 1999, ACT announced that it was going to discontinue the COMP.

The ETS Academic Profile, a 50-minute test, was pilot-tested in the spring of 2000. The ETS analysis of results is quite thorough and prescriptive. While students complained about the difficulty of the Academic Profile, they indicated in an exit interview that it was a reasonable test of what should be learned in college.

In Fall 2000, University 401 was added to the curriculum as a general capstone course. The Director of Assessment endorsed the course due to the great opportunity it presented for assessment of general education. The call for proposals for teaching the course specifically states that an assessment component provided by the Director of Assessment is expected to be part of the course. A general education exam will be administered to students in the class. The written report and oral presentation requirements of the course will be used for course-embedded assessment of these two general education goals.

While faculty appear to support the idea of University 401, students do not. The expected number of sections has been offered, but enrollment in each section is smaller than expected. Spring 2001 was the first semester with a sufficient number (50) of University 401 students for effective assessment. The director responsible for University 401 guarantees that four to six classes a semester will soon be standard, with at least 100 students per year enrolling in the course. Although little progress has been made in obtaining general education exam results, the foundation now exists for obtaining approximately 100 to 150 exam scores from graduating seniors each year, assuming the success of University 401. Preliminary results of two general education exams over the last few years indicate that USC is average in all areas of general education compared to similar institutions.

Conclusion

The University of South Carolina is committed to performing sound assessment of student learning and to using assessment results for improvement. Assessment at USC has evolved from episodic, isolated efforts to a system that adheres to the Nine Principles of Good Practice for Assessing Student Learning. As is evident from this report, the assessment system itself is routinely evaluated, and changes are made as the results dictate. Computer adaptive tests, online surveys, student-level electronic portfolios, and support for course-embedded assessment through automatic archival of student work in the Blackboard system may all become part of the USC assessment system in the coming years.

Majors and Concentrations

Majors and concentrations provide students with specialized knowledge and skills. Primary responsibility for assessing the majors falls to academic departments and programs and external accrediting agencies, where applicable.

In 2001-2002, program reviews in anthropology (BA, MA), exercise science (BS, MS, PhD), geography (BA, BS, MA, MS, PhD), history (BA, MA, PhD), international studies (BA, MA, PhD), philosophy (BA, MA, PhD), political science (BA, MA, PhD), psychology (BA, BS, MA, PhD), public administration (MPA), religious studies (BA, MA), sociology (BA, BS, MA, PhD), and sport and entertainment management (BS) were scheduled. Unfortunately, the South Carolina Commission on Higher Education (SCCHE) did not fund program review at the state level.

Anthropology, BA, MA

In the past two years the introductory course in cultural anthropology (Anth. 102) has been taught successfully as a combination of lectures and discussion sections run by graduate assistants. The archeology and biocultural faculty identified the need to develop a similar structure for large sections of Anth. 101. The goal is to add more "hands-on" work with data on human evolution and prehistory, and to expand time for discussion in smaller formats.

The faculty is generally happy with the quality of training provided to our MA students, and with their success in the program and the profession. MA students are active in attending and presenting papers at regional and national meetings. Students have won awards at the Graduate Student Day competition and regularly receive Graduate School travel awards; this past year one MA student received a Fulbright Fellowship to carry out research in Africa. However, the faculty identified several problem areas that can be addressed in student training and mentoring.

The graduate director has completed administering exit interviews. Students mentioned several areas where they thought their education and their experience could be improved. They want skill building (specific hands-on experiences), including training in specific methods, analytical techniques, and IT skills and capacity. They would like more experience with grant and proposal writing, building a resume, and other aspects of professional development. They find the facilities and graduate stipends inadequate, and would like to see more opportunities to attend regional and national meetings. These students (and many over the years) would like the opportunity to study toward a PhD in the Department.

The Department of Anthropology is developing measures to address the needs identified in the assessment results. The department is committed to teaching a Research Design and Methods class (Anth. 519) for students in our curriculum. The course will cover a range of methods and analytical techniques in social and cultural anthropology. One feature of the class is that it provides the context for completing a research proposal that can form the basis of a thesis project. The proposal will replace a sit-down comprehensive exam for cultural students.

The graduate program has restructured comprehensive exams to come at the end of the first year of study. Archeologists (and some biocultural students) will continue to take a sit-down exam, although this has been streamlined into a three-part exam. Cultural (and some biocultural) students will write an extensive research proposal in lieu of the sit-down exam. The proposal will be evaluated by the instructor of record for Anth. 519, the student's advisor, and other designated faculty.

Over the past year the department has regularized a number of workshops provided to students. These include a "grant" workshop (covering how to identify sources of external funding as well as proposal writing), a resume/vita development workshop, a workshop on preparing a literature review, and other "on demand" workshops on the use of particular methods or analytical techniques (e.g., Atlas TI text analysis software).

Over the past year faculty have been able to provide somewhat more support for getting students to regional and national meetings. There is a minimal travel budget of $100 per student. This amount is increased to $150 for students delivering papers, and to $250 for students attending national meetings. Faculty have also been successful in helping students secure Graduate Student Travel Awards. Graduate stipends are embarrassingly small ($3,000 per academic year for one-quarter-time assistantships and $4,000 for one-half-time assistantships). However, faculty have worked hard to place between one-third and one-half of our students in better-paying assistantships across campus, and the current departmental stipends do represent an increase over previous years.

Exercise Science, BS, MS, PhD

Assessment methods/instruments include transcript analysis, student course evaluations, exit questionnaires, and an annual alumni survey. Because very few classes are unique to a single program, student course evaluations cannot be separated by individual academic program. Similarly, most questions on the exit questionnaire are common across the programs within the department; in addition, the number of respondents in a single year would typically be too small for meaningful interpretation.

Across the three alumni survey questions concerning preparation for employment, the average rating was 3.04 on a 4-point scale for all exercise science graduates. There has not been a recent formal survey or interview of employer satisfaction, but anecdotally, employers are well pleased with the program's graduates. Among those accepted for the exercise science undergraduate program in fall 2001, the average SAT score was 1102. Among U.S. residents accepted into the program, 30% were minorities.