Program Assessment Committee

Minutes

Tuesday, September 18, 2007
11:00 a.m. – 12:30 p.m.
Markstein Hall 421

MEMBERS PRESENT: G. Sonntag, T. Shore, D. Formo, M. Fitzpatrick, C. Schuster, L. Stowell, D. Barsky

STAFF PRESENT: S. O’Connor (came in late)

  I. CALL TO ORDER

a) Approve Minutes
Minutes for the 9/4/07 meeting were approved.

  II. DISCUSS PROGRAM REVIEW GUIDELINES (included in member packets)

a) PEP Document and Interim Program Review Guidelines

  • Old PEP and new PEP: Gabriela looked at how the two documents might be meshed together.
  • In the old PEP, faculty were asked to look at SLOs (Tab 8, Appendix D), but they were not the focus of the old reports.
  • So how do we make the SLOs the major part of the self-study?
  • The new PEP should focus on SLOs and only talk about one or two other items (support) as needed. In the new PEP, support items (student readiness, support, advising, design of the degree, etc.) are chosen every five years.
  • Suggestion: revise the a–e questions under student learning outcomes. We could also ask departments to write a summary of their five annual reports and look at what the Chancellor’s Office wants (Tab 5). Our annual assessment report basically asks the same thing.
  • Will the same people review it if, say, the library is not one of the goals?
    David and Gabriela’s response: Even if there is no need for a direct comment on a department’s goals related to library resources, the review allows the library to see the direction a department or program is going and how well the library is supporting that program. Different constituents on campus (like Russ Decker) do like to see the documents to learn what programs are doing, and a program may not have thought about or mentioned those aspects that matter to all readers of the document.
  • Suggestion: Do a-e under student learning outcomes and pick two support goals (advising, extracurricular, resources, pedagogy, etc.) to write about.
  • Suggestion: New document should ask them to specifically address recommendations from previous review.
  • Data notebook: There is no guidance to the department or the reviewer about what it is for; it needs to be explained better. You need the numbers to do your analysis, so the reviewer can appreciate your analysis. We should include hints like “you might want to refer to these in your introduction” or “refer to this data to support your needs.” There also need to be hints for the reviewer.
  • Calendar of events has been replaced.

b) The 5-year cycle and the annual reports

  • We say we have 2-year reviews and 5-year reviews:
    5-year plans (to the Chancellor’s Office)
    Annual assessment plan
    Actual program review
  • What is happening in the intervening 3 years? When does the department have the time to think about and implement changes?
  • We looked at the visual in Tab 10 – the 5-year annual reports are happening at the same time as this process. Depending on the scope of a project, it may run longer than 5 years.
  • David: Every year, departments should be getting feedback on these annual reports, so they don’t wait until year 5, when the external reviewer comes in and sends them off in a different direction.
  • The annual assessment covers only SLOs.
  • Three departments now offer more than one degree, and more will follow; this presents a workload challenge.
  • Another challenge: the implementation of a research project.
  III. WASC PROPOSAL LANGUAGE
    Did not discuss
  IV. PROCESS FOR REVIEWING AND PROVIDING FEEDBACK ON ANNUAL ASSESSMENT REPORTS
  • Need to resolve the relationship between the annual review plan and report and the 5-year review.
  • We also have to address the outcomes for Theme 2 in the WASC Proposal.
  • David gave us a little history of these outcomes – old Educational Effectiveness committee and the analytic studies office.
  • There is still faculty confusion about student learning outcomes and what the driving force is (aside from WASC). Many faculty feel it is being dropped on them.
  • It is in the old PEP process. What is the level of clarity about this on campus – how do you write them, etc.?
  • In 1999, Victor Rocha sent notice to all departments that we all need SLOs.
  • How can we work to get the campus to talk through and embrace this idea as well as mentor them through it?
  V. ASSESSMENT WEBSITES
  • PAC review of annual reports – develop a common set of questions:
    - Does the program have SLOs?
    - Does the program define 3–5 outcomes appropriate for assessment?
    - Are the outcomes measurable?
    - Do the outcomes speak to what students will be able to think, know, or do?
    - Do the assessments measure the intended outcomes?
    - Are there multiple means of assessing the outcomes?
    - Is the program using assessment results?

  • The authors of the reports should have these questions.
    Gabriela will send us this list.
  • David talked to Vicki Golich – she has received some reports. What does PAC want to see? Do we want to see Vicki’s comments? (Yes.)
  • David has a catalogue copy deadline – do we want to ask David to ask the Deans to send us what they have, along with any comments they are making back to the departments?
  • PAC will look at the reports in terms of:
    - Funding
    - Where the campus is in terms of ability to write SLOs and assessments – what do their plans look like?
    - Using the short list of questions (above)

  VI. TEMPLATE FROM JFK UNIVERSITY (emailed)

The materials Gabriela sent us from John F. Kennedy University are very clear guidelines; we could use them as a model. They are very user-friendly.

  VII. PROGRAM REVIEWS

a) Status of programs
Econ – nearly complete; waiting for additional reader comments and department response
Psych – PAC response
VPA – contacted several candidates for external reviewer; site visit will hopefully occur in October

b) Psychology
Will review at next meeting.

VIII. TASKS FOR NEXT MEETING

  1. Susan O’Connor will look at job descriptions for a faculty fellow for assessment on other campuses so she can submit a draft of one to the provost. The committee feels that we cannot ask departments and programs to do this work without support and resources.
  2. Camille will start a draft letter for Psychology.