Columbia University

The Fu Foundation School of Engineering and Applied Science

Outcome Assessment Program

Introduction

The purpose of our outcome assessment program is to ensure that the educational process is fulfilling its promise to students -- to engage them in a stimulating, experiential learning process that prepares them fully to take their place in the job market and develop successful professional careers. The focus of the assessment program is on student learning and on how the program can help students learn more effectively. Although assessment may be focused on classroom activities, it can be implemented at different levels (course, department, school-wide), and it reaches its full potential when it is fully institutionalized around a set of clearly defined institutional, program, and course objectives and outcomes. When assessment serves the goal of institutional strategic planning, it becomes an effective Continuous Quality Improvement strategy that contributes to the achievement of the institutional vision and mission.

Prior to 1997, as part of our strategic plan, we implemented a course evaluation process for the entire School, involving the evaluation of all undergraduate and graduate courses and faculty. Over the past three years, we have continuously improved this course evaluation process and added several other assessment processes. NSF Gateway support and expertise have been critical in the development and institutionalization of an increasingly comprehensive Assessment Program at Columbia’s School of Engineering and Applied Science. Gateway’s vision and implementation strategies established the impetus and provided continued support for a process that developed gradually at SEAS in several steps:

· Collaboration with Teachers College to evaluate new, innovative education programs

· Evolution of our course evaluation system to the Web (WCES)

· Implementation of department planning

· Institution of multiple assessment processes to solicit feedback from students and graduates regarding program objectives and outcomes.

Collaboration with Teachers College

During the early phases of our involvement with Gateway, we established a working relationship with faculty and graduate students at Teachers College, Columbia University. Early collaborations included the assessment of new freshman courses developed by senior faculty. Several engineering courses were evaluated using a new evaluation process designed in association with the Institute for Learning Technologies (ILT) at Teachers College. Both students and faculty completed surveys at the end of each course. The surveys focused on course objectives, student learning outcomes, and use of various instructional strategies and technologies. An extensive report from the ILT staff, The Gateway Engineering Education Coalition: 1997-98 Evaluation Report: Phase 2, describes and discusses the results obtained. While the faculty made several changes to their courses based on the results, the School decided that it was important to enhance the existing course evaluation system.


Implementation of Department Assessment Planning (Three Strategies)

We identified an integrated set of strategies to support our assessment objectives. The three strategies are to: 1) initiate a structured process to involve faculty and staff in the ongoing planning, development, and monitoring of the program; 2) offer "just-in-time" educational sessions to develop faculty knowledge and skills in assessment; and 3) create an assessment toolbox providing administrators and faculty with templates that can be used in and outside the classroom. These strategies work together as a system to support the development and integration of assessment in the university environment; each strategy supports and complements the others.

The first strategy involves the implementation of a structured, facilitated process to help department chairs and faculty develop program objectives and learning outcomes. This strategy comprises some of the most important and challenging activities in the overall process. We have proactively championed a formal planning process in each of our departments. The first step of this process is the definition of objectives, strategies, and measurable outcomes at the departmental and course levels. In each department, a small team was formed to work with administrators, faculty, and external constituents to define these departmental and course-level objectives, strategies, and measurable outcomes. The departmental level focuses on the learning objectives, strategies, and outcomes of academic programs and on the effect the curriculum has on graduates. Course-level objectives, strategies, and outcomes define the learning outcomes expected as a result of a specific course. To support these formal planning efforts, we developed workbooks for departmental and course-level planning; all participating staff and faculty received copies. A sample copy is provided in the Appendix.

Our second strategy involves providing just-in-time workshops for faculty and staff to support assessment activities. We have found that, in order to inculcate the structured process, frequent seminars and workshops must be delivered to staff and faculty. These developmental initiatives help to engage everyone in the process through understanding and constructive debate. Later on, professional development activities focus on the skills required to operate a comprehensive assessment program. These skills range from developing valid assessment methods to providing outcome performance feedback to students in the classroom. During the past year, we have designed and conducted several workshops at the department chair level, as well as a number of small, facilitated sessions with faculty, to support learning about assessment and continuous improvement. The purpose of these sessions is to give faculty and administrators an understanding of outcome-driven assessment and of what is involved in establishing an assessment program in an educational environment. Special attention is given to ABET Engineering Criteria 2000 and the role that assessment plays in meeting the new requirements.

Our third strategy was initiated concurrently with the first two. As each of our departments establishes educational objectives and outcomes for its programs, there is a need to create a series of assessment tools to measure the stated outcomes. We have found that, in the early stages, this can be perceived as a daunting task: there are usually many outcomes identified, seemingly requiring as many, if not more, methods by which to measure the anticipated changes. One of our key assessment strategies is to look for common themes and needs across our departments and create what we call an assessment “toolbox”. This toolbox contains a number of templates that faculty, students, and staff can select from to support the measurement of identified outcomes. Additionally, to support departments, we have focused on a common set of assessment methods, most of which are designed to measure student learning outcomes. We have designed assessment templates that allow our departments to measure general learning outcomes across the curricula while giving faculty the flexibility to customize these instruments for their specific courses.
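To make the toolbox idea concrete, a minimal sketch follows of one way such a template could be organized: a shared core of outcome items used across curricula, plus course-specific items added by the instructor. The sketch is written in Python purely for illustration; the class names, fields, and sample items are assumptions, not the actual SEAS instruments.

# Illustrative sketch only: hypothetical names, not the actual SEAS instruments.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SurveyItem:
    outcome: str                    # learning outcome the item measures
    prompt: str                     # text shown to the respondent
    scale: str = "1 (low) to 5 (high)"

@dataclass
class AssessmentTemplate:
    name: str
    core_items: List[SurveyItem]    # common core, shared across curricula
    custom_items: List[SurveyItem] = field(default_factory=list)

    def customize(self, extra: List[SurveyItem]) -> "AssessmentTemplate":
        """Return a course-specific copy that keeps the common core intact."""
        return AssessmentTemplate(self.name, list(self.core_items),
                                  self.custom_items + extra)

# A department selects the shared template; an instructor then adds items
# tied to the outcomes of one particular course.
core = [SurveyItem("design", "This course improved my ability to design a system or process.")]
template = AssessmentTemplate("Standard course evaluation", core)
course_version = template.customize(
    [SurveyItem("laboratory skills", "The laboratory work strengthened my measurement technique.")])

In such an arrangement, the common core keeps results comparable across departments, while the customization step provides the course-level flexibility described above.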

Evolution of the SEAS Course Evaluation System

As stated earlier, we began our early assessment efforts by implementing a formal course evaluation process throughout the School. The evaluation system was initiated around 1995. The initial purpose of the program was to solicit feedback from students as to the quality of courses being taught as well as the effectiveness of teaching throughout the School. The Dean and Vice Dean, from the start, have effectively used this feedback to monitor and continuously improve the quality of teaching. Today, course evaluation results are an integral part of faculty promotion decisions as well as the re-hiring of adjunct professors. Moreover, most instructors are using the assessment results to change the format and content of their courses. Some instructors have experimented with experiential approaches to teaching and learning, moving away from the traditional lecture format; others have become interested in knowing more about their students’ learning habits and styles; still others have become interested in assessment itself as a way to measure the impact of their new pedagogical techniques.

Over the past two years, Columbia has worked with student groups and faculty to develop a comprehensive electronic course evaluation system. The system began as a student project, funded by NSF Gateway, that allowed faculty evaluation data to be uploaded to a website for student review. The website allowed students to review evaluation data by course or by professor; this information helps guide students’ course selection. The Oracle Website has been in operation for two years. As a second step, another student group worked on a class project to develop a prototype web-based course evaluation system that would seamlessly link to the Oracle Website. The course evaluation system is designed to allow faculty to customize course evaluation surveys to reflect the learning outcomes relevant to a specific course. Administratively, the system is linked to the Registrar’s Office to ensure that all course information matches university records.
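As a rough illustration of this administrative linkage, the sketch below shows one way evaluation responses could be keyed to course records mirrored from the Registrar, so that survey data always matches university records and can be browsed by course or by professor. The schema, table names, and columns are hypothetical assumptions for illustration; the actual WCES and Oracle Website implementations are not reproduced here.

# Hypothetical schema for illustration only; not the actual WCES/Oracle design.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE registrar_course (          -- mirrored from Registrar records
    course_id   TEXT PRIMARY KEY,        -- official course number and section
    title       TEXT,
    department  TEXT,
    instructor  TEXT,
    term        TEXT
);
CREATE TABLE evaluation_response (       -- one row per student rating
    course_id   TEXT REFERENCES registrar_course(course_id),
    item        TEXT,                    -- survey question identifier
    rating      INTEGER                  -- e.g. 1 (poor) to 5 (excellent)
);
""")

# Students can then browse average ratings by professor, as on the Oracle Website.
by_professor = conn.execute("""
    SELECT r.instructor, e.item, AVG(e.rating) AS avg_rating
    FROM evaluation_response e
    JOIN registrar_course r USING (course_id)
    GROUP BY r.instructor, e.item
""").fetchall()

Because every response carries the Registrar’s course identifier, evaluation results remain consistent with official course records.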

Looking ahead, we are in the early stages of creating a comprehensive outcome-based assessment system, deliverable over the web. System applications to be developed include course evaluation, longitudinal tracking, and surveys for students, faculty, alumni, and employers. The completed assessment system will be fully functional, documented, and ready for distribution to the engineering education community and beyond. The majority of next year’s activity will focus on the development of the system’s infrastructure and overall design. The course evaluation system will be revised based on pilot tests conducted in the previous year. The outcome of this phase of work will be a fully flexible course evaluation system that can readily interface with the University Registrar’s legacy system. The course evaluation system will include basic survey development, data collection and database management capabilities, and multi-level report generation. Additionally, a course syllabus module will be integrated into the system to allow course objectives and learning outcomes to be translated directly into survey protocols.
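The syllabus module can be pictured with a small sketch: each learning outcome listed on a syllabus is turned into one survey item for that course’s evaluation. The function and wording below are illustrative assumptions about how such a translation might look, not the delivered system.

# Illustrative only: translating syllabus learning outcomes into survey items.
from typing import List

def outcomes_to_survey_items(outcomes: List[str]) -> List[str]:
    """Turn each syllabus learning outcome into one rating-scale prompt."""
    return [
        f"As a result of this course, rate how well you can {outcome} "
        f"(1 = not at all, 5 = very well)."
        for outcome in outcomes
    ]

# Example: an instructor's stated outcomes become course-specific survey items.
items = outcomes_to_survey_items([
    "analyze a linear circuit",
    "work effectively on a design team",
])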

Other major activities will involve the design and development of prototype modules for web-based student, faculty, alumni, and employer surveys. Each of these modules will be integrated at multiple levels, including the student, department, and university levels. All data will be collected in a relational database, allowing for longitudinal tracking and various types of institutional analyses. Each module will work similarly to the course evaluation system: system users will be able to design basic surveys, deliver them to potential respondents over the web, and electronically collect, analyze, and report on the results.
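To indicate the kind of longitudinal, multi-level analysis a shared relational database makes possible, the sketch below reuses the hypothetical tables and connection from the earlier sketch to track one outcome item across terms, rolled up by department. The query and column names are assumptions for illustration only.

# Longitudinal tracking sketch, using the same hypothetical schema as above.
LONGITUDINAL_QUERY = """
    SELECT r.term,
           r.department,
           AVG(e.rating) AS avg_rating,
           COUNT(*)      AS n_responses
    FROM evaluation_response e
    JOIN registrar_course r USING (course_id)
    WHERE e.item = :item                 -- e.g. one EC2000 learning outcome
    GROUP BY r.term, r.department
    ORDER BY r.term
"""

# Run each term against the shared database to build the trend data used in
# department reviews and institutional analyses.
rows = conn.execute(LONGITUDINAL_QUERY, {"item": "design"}).fetchall()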

Institution of Multiple Assessment Processes

Over the past several years, we have developed and implemented a number of formal, systemic assessment processes to examine our educational programs, as illustrated in Figure 1. These assessment processes include alumni surveys, senior exit surveys, a web-enabled course evaluation system, and two external processes: the Dean’s Engineering Council and the Academic Review Process. These various assessment processes provide the departments and the Dean’s office with input on the effectiveness of our educational programs. Results from all of these processes are provided to department chairs, faculty, and Dean’s office staff. A brief summary of each process follows; related assessment materials are attached.

Figure 1.

Alumni Surveys. Beginning in 1998, we instituted an ongoing process to gather feedback from our alumni. Our 1998 Alumni Survey was distributed to over 8,000 alumni worldwide. The survey was designed to elicit information about their undergraduate experience, ongoing education, and employment and career path. Of particular importance, the survey provided our departments and faculty with preliminary information regarding specific student learning outcomes.

As part of our ongoing assessment process, we administered our second Alumni Survey in 1999. For this administration, we sent short-form surveys to two graduating classes – 1989 and 1994. In this case, the survey was designed to focus on the EC2000 student learning outcomes. Alumni from these graduating classes were asked to rate the quality of their preparation in both professional and technical learning outcome areas.

Senior Exit Survey. While Columbia University has always conducted University-wide senior exit surveys, this year SEAS instituted its own survey process to gather more detailed information about our graduates. For 2000, we administered the EBI Engineering Student Survey. This survey, developed by an outside organization, is highly consistent with ABET Engineering Criteria 2000. This summer we received the results of the survey, which included data benchmarked against several peer engineering schools. Our departments are currently reviewing the data to identify required actions.

WCES Student Course Evaluation System and Oracle Web Site. For several years, SEAS has had a formal course evaluation program; this year, we piloted a web-enabled version. Among the reasons for doing so is that faculty can customize selected portions of the survey to focus on the major student learning outcomes relevant to the course in question. The Spring 2000 pilot test was successful, with a student response rate of close to 75%. During the next year, we will make major modifications to the system to better support faculty and student use. Another unique feature of the program is the Oracle Web Site, which provides timely course evaluation results to both faculty and students. Results are made public for faculty, students, and the administration to review. Course evaluation results are taken very seriously in the School: faculty use them to improve course content and the quality of their instruction, students consult course data as part of their course selection decisions, and the administration uses the results to support pay and promotion decisions.

SEAS Academic Review. The Academic Review is an ongoing process in which, each year, one or two programs bring in a small group of external evaluators to review the quality of the program offered. In the 1999-2000 academic year, two of our departments, Mechanical Engineering and Electrical Engineering, conducted these reviews. The primary objectives of the SEAS Academic Review are to assess the quality and effectiveness of all academic and administrative departments, foster planning and improvement, and provide guidance for future decisions. The process is designed to provide an opportunity for critical self-review and for re-evaluating long-range plans, assumptions, and goals. Additionally, the process allows inter- and intra-departmental iteration and dialogue to help identify strengths, challenges, and future opportunities. The SEAS Academic Review is designed to help maintain academic and administrative excellence and to formulate strategies for continuous improvement.

Dean’s Engineering Council. Chartered in 1955, the Engineering Council is made up of influential alumni and donors of the School. Nominees are selected for their leadership in industry or education, elected by a majority of the Council, and then ratified by the University Trustees. Current members include high-ranking executives from AT&T, American Express, Citibank, and Lucent Technologies, the former president of Telcordia, and the former provost of MIT. Each spring, the Council meets for a full-day plenary session. After a report on the state of the School from the Dean and the election of new members, the Council divides into small groups that meet with undergraduates, graduate students, and faculty members of each department. At the end of the day, the Council reconvenes to report its findings to the Dean. Comments and suggestions from these meetings have led to increased efforts to improve student life, upgrades in laboratory facilities throughout the School, and the establishment of new graduate fellowships and awards for teaching assistants.