FINAL REPORT – 1/13/11

SHORELINE COMMUNITY COLLEGE

MUSIC TECHNOLOGY PROGRAM REVIEW

Fall 2010

Prepared by Karen Demetre, Consultant

TABLE OF CONTENTS

PURPOSE …………………………………………. 3

METHODOLOGY ………………………………… 4

CONSULTANT REPORT

Findings on Program Review Elements

Assessment ……………………………………… 5

Program Information …………………………… 7

Student Data Trends …………………………… 9

Curriculum ………………………………………. 27

Faculty …………………………………………… 30

Resources ………………………………………. 32

Schedule of Classes …………………………… 34

Partnerships ……………………………………. 35

Support Services ………………………………. 36

Revenue Potential ……………………………… 37

The Virtual College ……………………………. 38

Competition ……………………………………. 39

Program Access ………………………………… 40

Labor Market Opportunities …………………. 41

Analysis of Findings

Institutional Issues ……………………………. 42

Program Strengths ……………………………. 43

Recommendations ……………………………… 44

APPENDIX

Faculty Report …………………………………… 1

Student Survey Results ………………………… 20

Advisory Committee Survey Results ………… 45

PURPOSE

The purpose of the program review process at Shoreline Community College is continuous quality improvement. This process is scheduled on a five-year cycle across all instructional areas at the college.

This process serves to meet standards established by the State Board for Community and Technical College Education and the Northwest Commission on Colleges and Universities. Relevant accreditation standards are listed below:

4.A Assessment

4.A.1 The institution engages in ongoing systematic collection and analysis of meaningful, assessable, and verifiable data – quantitative and/or qualitative, as appropriate to its indicators of achievement – as the basis for evaluating the accomplishment of its core theme objectives.

4.A.2 The institution engages in an effective system of evaluation of its programs and services, wherever offered and however delivered, to evaluate achievement of clearly-identified program goals or intended outcomes. Faculty have a primary role in the evaluation of educational programs and services.

4.A.3 The institution documents, through an effective, regular, and comprehensive system of assessment of student achievement, that students who complete its educational courses, programs, and degrees, wherever offered and however delivered, achieve identified course, program, and degree learning outcomes. Faculty with teaching responsibilities are responsible for evaluating student achievement of clearly-identified learning outcomes.

4.A.4 The institution evaluates holistically the alignment, correlation, and integration of programs and services with respect to accomplishment of core theme objectives.

METHODOLOGY

First Committee Meeting

(orientation to process with full-time faculty, division dean, workforce dean, institutional researcher, and consultant)

Qualitative Information Collected

·  College website, planning guides, brochures

·  Master Course Outlines

·  Schedule of Classes

·  Class Cancellations and Wait Lists

·  Full-Time Faculty Input (written assignment)

·  Student Surveys (currently in program)

·  Advisory Committee Surveys

·  Full-Time Faculty Interview

·  Division Dean Interview

·  Advisory Committee Roster + Meeting Minutes

·  Program Review Reports (2000 + 2005)

·  2006-07 Music Tech Instructional Goals + Assessment Plan

Quantitative Information Collected

·  Faculty teaching loads (full-time and part-time)

·  Division budget figures

·  Annualized FTES, Headcount, and % of Enrollment

(by program and by certificate + degree)

·  Student demographics (age, gender, ethnicity, academic +

economic disadvantage)

·  Completion of degrees and certificates

·  Student grade distributions

·  State and college comparative data on S:F ratios

·  State employment data on former students

Final Committee Meeting

(discussion of preliminary report and faculty feedback)

Completion + Distribution of Final Program Review Report

CONSULTANT REPORT

Music Technology – Fall 2010

ELEMENTS REVIEWED, FINDINGS, + ANALYSIS

1. ASSESSMENT (FAC. REPORT Pg 1)

(faculty feedback, student survey, and advisory committee survey)

TOOLS TO ASSESS PROGRAM OUTCOMES

1.1  Program outcomes have been established for each degree and certificate option and are clearly stated on the program website. At the present time there is no formal system for tracking aggregate data on indicators/measures that demonstrate achievement of program outcomes. This is unfinished business from the 2005 program review report, but the student portfolio project has been a positive development since that time. Faculty monitors student performance in their classes, reviews capstone projects, and receives feedback from advisory committee members as well as current and former students to assess program outcomes. Follow-up with graduates consists of occasional conversations or contacts.

Faculty indicates that class learning outcomes support program outcomes, and therefore considers students passing classes and completing portfolios or capstone projects as indicators that program outcomes have been achieved. The advisory committee reviews proposed curriculum changes, but a survey of members revealed a lack of clarity about published program outcomes. Faculty receives helpful student data from the new institutional researcher, which also supports ongoing assessment of program outcomes. The faculty interview revealed interest in pursuing a systematic approach to assessment of program outcomes by identifying measures/indicators and tracking aggregate data.

1.2  Published outcomes for the various degree and certificate options do not mention skills in entrepreneurship and self-promotion. The advisory committee has emphasized that these skills are critical for the future success of graduates working on a freelance or contract basis. Faculty shares this awareness and will need to revise program outcomes to reflect it.

1.3  A large sample of student respondents (73) gave a range of reactions about how well their individual learning needs were met. Almost two-thirds (62%) provided above-average ratings and only a few (5) gave below-average ratings. Student perceptions about preparation for employment were similar (60% gave above-average ratings for the knowledge and skills they gained in the program and only two gave lower ratings). Although self-perception does not equate to actual measurement of student learning, it is an indication that the majority of current students are satisfied with the education provided by this program and are confident it gives them adequate preparation for working in the field. More feedback from former students would further validate these perceptions. Advisory committee members state that emphasis on music theory in the curriculum (along with technology and performance) better prepares graduates for employment success. This view is supported by anecdotal responses from employers and alumni.

1.4  Post-graduation surveys of students in the digital audio engineering and electronic music/MIDI options show about 25% to 30% are employed one year after leaving the college. Faculty reports numerous types of positions and impressive examples of clients who employ graduates doing freelance work. Although the reported percentage of graduates working in the field seems modest, it is likely under-reported due to the difficulty of tracking the many individuals who are self-employed in a variety of settings. Follow-up is particularly problematic for graduates who work in performance and merchandising fields, although anecdotal evidence is positive.

TOOLS TO ASSESS GENERAL EDUCATION OUTCOMES

1.5  Master course outlines identify general education outcomes addressed in each course; however, specific guidelines/criteria or performance levels for assessing achievement of general education outcomes have not been defined by the college. Music Technology faculty are skilled at assessing student learning in their discipline, and they utilize a variety of assessment methods, including many hands-on, authentic assessments such as repeat demonstrations, simulations, performances, portfolios, and capstone projects. Since many courses and assignments or projects include multiple learning outcomes, it is sometimes difficult to isolate and collect assessment data on individual general education outcomes. Although it is assumed that passing grades demonstrate satisfactory achievement of general education outcomes, this area of assessment could be further refined as shown in the following chart:

General Education Outcomes

Columns: Learning Outcome / Assessment Measure / Data Collected / Evaluation of Data / Actions Taken

Assessment Measure – the measures the program uses to assess progress toward the outcome (GPAs, portfolios, student surveys, placement data, retention statistics, alumni surveys, etc.)
Data Collected – the specific data collected
Evaluation of Data – what the data mean
Actions Taken – the actions taken, based on the evaluation of the data

Outcomes to be tracked:
Quantitative Reasoning
Communication
Multicultural Understanding
Information Literacy
General Intellectual Abilities
Global Awareness

EVIDENCE OF ACTION BASED ON ASSESSMENT FINDINGS

1.6  Faculty continually evaluates student learning in their classes, reviews student feedback, and makes changes as appropriate. Authentic assessment of student abilities and job-related performance is prevalent throughout the curriculum, and many opportunities are provided to apply knowledge through experiential, “real life” learning.

1.7  Student success is monitored by faculty and identified problems sometimes lead to recommendations for curriculum development. One example of curriculum change is the approval of MUSTC 106 (The Acoustics of Music) for the quantitative reasoning/math requirement. In the past CIS 105 (Computer Applications) was the designated course, but it was problematic for students. This would be a prime area for tracking student success in the future.

2. PROGRAM INFORMATION (FAC. REPORT Pg 3)

(Website, catalog, planning guides, program descriptions, brochures)

ACCURACY

2.1 Academic planning guides on the website are generally accurate and complete. One point of clarification is needed on the website description for Digital Audio Engineering, where Degree Prerequisites states “students without secure knowledge of music fundamental and keyboard ability should take Music 100 & 110 or Music 200 & 127 before taking Music 101.” Since Music 101 has been changed to MUSC& 141, this requires updating. It is also confusing that the planning guide states Music 100 & 120 (rather than 110) should be taken before MUSC& 141.

2.2 The website states book costs for every degree and certificate are “variable and approximately $200 per quarter”. Since curricula vary greatly among degrees and certificates, individualized estimates would be more helpful to students.

RELEVANCY

2.3 Current students gave a range of ratings on the helpfulness of program information (website and printed materials). More than half of the current students surveyed (57%) rated it as good or excellent, while 33% rated it as fair and 4% indicated “not so good”.

2.4 Program descriptions on the website and brochure provide helpful information about employment opportunities associated with each degree and certificate.

Due to the changing nature of the field, greater emphasis may be needed to point out that employee positions are competitive and limited, while graduates with strong entrepreneurial and business skills have many opportunities for self-employment or freelance work.

2.5 CIS 105 is still listed as the quantitative reasoning requirement on academic planning guides and this will need updating to reflect approval of MUSTC 106 for the requirement (along with its prerequisite of Math 080 or an acceptable score on the Algebra COMPASS test). The program coordinator is aware of this and anticipates curriculum committee approval for revised planning guides.

2.6 Since the majority of music technology courses are offered only once per year (and many are part of a three-course sequence), students’ academic planning would be improved by highlighting this fact. A consistent symbol on each planning guide could identify courses that are offered once per year.

CURRENCY

2.7  Program information on the website, brochure, and planning guides has been recently updated (summer 2010). Periodic updating is managed by the public information office as well as the music technology faculty and technician.

2.8  Student surveys indicate some updating may be required for the SCC Recording Studio website (e.g., equipment available at different workstations, upcoming events and concert information).

CONGRUENCE

2.9 The website provides a consistent presentation of headings for each program option (e.g., quarterly costs, program description). Each academic planning guide presents sample class schedules, which help students with academic planning. Courses are tagged as general education or related instruction for communication, computation, and human relations in all degree options. This approach differentiates technical courses from transfer courses (or courses that apply general education concepts to the field), and it clearly documents that accreditation standards are met.

2.10 Content is generally consistent between the website, program brochure, and hard copies of academic planning guides.

ACCESSIBILITY

2.11  Most program information is accessed through the internet, which attracts potential and current students from diverse populations and many locations.

2.12  The Music Technology Program is not identified separately in the website A-Z index. It is necessary to go through the Music Department listing or the link for Professional-Technical Programs to find the Music Technology Program and the SCC Recording Studio website. Student surveys reveal that the college website is difficult to navigate.

2.13  A list of music electives offered is not easy to locate without searching through the online catalog. This is especially true since elective courses for MIDI and Merchandising are not explicit in the academic planning guides.

2.14  Limited copies of the printed college catalog are available, but the website provides access to the same information. Other printed materials include program brochures and academic planning guides available in the Division office and Advising Center.

3. STUDENT DATA TRENDS (FAC. REPORT Pg 4)

NOTE: Issues affecting accuracy of students’ program intent codes may impact some institutional data used in this review.

THREE YEAR ENROLLMENT – ANNUALIZED STATE FTES

3.1  Annualized state-funded FTES for the Music Technology Program has declined somewhat over the last three academic years, to 190 AnFTES for 2009-10. (Note: one annualized full-time-equivalent student = 45 credits/year; for example, a student completing 15 credits in each of three quarters generates 1.0 AnFTES.) Modest declines in all degree options are apparent for the three-year period; however, the program remains one of the largest prof-tech programs at the college.

3.2  Factors negatively impacting enrollment include increased tuition and fees, limited space and equipment availability, and budget constraints that make it difficult to add class sections.

Year          / Dig/Audio / Merchandising / MIDI  / Performance / Dig/Perf / Grand Total
2007-08 (A78) / 148.08    / 9.78          / 41.27 / 28.44       / –        / 227.57
2008-09 (A89) / 141.72    / 7.93          / 36.11 / 23.20       / 0.31     / 209.28
2009-10 (A90) / 131.44    / 7.89          / 28.53 / 21.87       / 0.31     / 190.04

THREE YEAR ENROLLMENT: STUDENT HEADCOUNT and PERCENTAGE OF PROGRAM ENROLLMENT

3.3  A three-year comparison of annual student headcount reveals fewer students in all areas during 2008-09, followed by increased numbers of students in 2009-10 for the Digital Audio, Merchandising, and Performance options. The only area experiencing a three-year decline in student headcount is the MIDI option.

3.4  Although annualized FTES for the last three years have declined for all degree options, student headcount has increased in all areas except MIDI. It appears that more students are enrolling but completing fewer credits. Comparing fall quarters, the percentage of part-time students increased between 2008-09 and 2009-10 (from 27% to 30%). This trend may relate to increased costs, more economically disadvantaged students, and financial aid issues.