ANNUAL PROGRAM ASSESSMENT REPORT – 2009-2010

1. Please summarize your department or program(s)’ assessment activities during the 2009‐2010 academic year.

Biology and Chemistry: Our ad hoc Biology Major Assessment Committee analyzed the results of an online survey conducted late in AY2009 that sought feedback from our Department faculty colleagues in biology (N=9) regarding their preferences for concepts and skills in biology to be assessed for outcomes in our core courses. For the survey, we listed 4 outcomes for each of 30 conceptual topic areas broadly representing modern biology (total = 120 concepts) and as many outcomes as we could identify in 9 areas of problem solving and practical skills (total = 41 skills). We then asked faculty to indicate on a Likert scale (1 = strongly disagree, 5 = strongly agree) whether the concept or skill in question was “essential for biology faculty to assess in our students”. In analyzing the results, we decided to focus on outcomes that met either of 2 criteria: that (i) 100% of the faculty strongly agreed (= 5) or agreed (= 4) that the concept or skill was essential to assess, or (ii) at least 50% of the faculty strongly agreed (= 5) that it was essential to assess. This selection process yielded 15 concepts, 8 problem solving skills, and 5 practical skills that were judged assessable with general agreement by our faculty. These were then linked to the 6 specific courses in our core curriculum, to guide implementation of outcomes assessment for the selected concepts and skills in these courses. We next conducted prospective assessments of a subset of these concepts and skills in General Biology II in the Spring semester, and retrospective analyses of assessments of a largely different subset of concepts and skills conducted in General Biology I in the Fall semester. We also began to revisit some choices of assessable outcomes, e.g., to wonder whether photosynthesis should be assessed in General Biology I, even though it was not a preferred assessable outcome by our criteria for the faculty survey.
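The committee's two selection criteria amount to a simple filter over each outcome's Likert ratings. As a sketch only (hypothetical data and function name; the actual analysis was not necessarily done in code):

```python
def is_essential(ratings):
    """Apply the committee's two criteria to one outcome's ratings (1-5):
    (i) 100% of faculty rated it 4 (agree) or 5 (strongly agree), OR
    (ii) at least 50% of faculty rated it 5 (strongly agree)."""
    all_agree = all(r >= 4 for r in ratings)
    half_strongly_agree = sum(1 for r in ratings if r == 5) / len(ratings) >= 0.5
    return all_agree or half_strongly_agree

# Hypothetical responses from the nine faculty surveyed:
print(is_essential([5, 5, 4, 4, 5, 4, 5, 4, 4]))  # criterion (i) met -> True
print(is_essential([5, 5, 5, 5, 5, 2, 3, 1, 2]))  # criterion (ii) met (5/9 gave a 5) -> True
print(is_essential([4, 4, 4, 3, 4, 4, 5, 4, 4]))  # neither met -> False
```

Note that the two criteria overlap: any outcome rated 5 by every respondent satisfies both; the second criterion catches outcomes a strong majority champions even when a minority disagrees.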

Business Administration: Department assessment activities during the 2009-2010 academic year included:

  • Evaluating and updating core business course learning outcomes to ensure agreement on outcomes and that they clearly stated what students are expected to do once they complete these courses. The core course learning outcomes were also reformatted to make sure they are assessable. They clearly state the skills, concepts, and capabilities that students will develop in the courses.
  • Finalizing development of overall Business Administration Department learning outcomes for the following areas: management, marketing, financial reporting, managerial accounting. Added new category and created learning outcomes for computer information systems.
  • For all business department learning outcomes, determining (1) the specific business courses where skills, concepts, and capabilities are taught, (2) the data (e.g., projects, presentations, papers, case studies, exams) within each of these courses that could demonstrate the extent to which outcomes were met, and (3) the threshold at which students would be considered successful in meeting outcomes.
  • Publishing departmental learning outcomes, along with the business courses where skills, concepts, and capabilities are taught, the data that demonstrate the extent to which outcomes are met, and success thresholds, on the Fitchburg State College Business Department Web site under the accreditation link.
  • At the start of the Fall 2009 semester, beginning to assess Business Department students’ computer literacy from the time students begin the department’s introductory computer information systems course through course completion.
  • Developing a process to test and assess pre-/post-class computer literacy, which includes the following steps: (1) define computer literacy, (2) determine the components to be included in pre-/post-tests, (3) map the components to course learning outcomes, (4) determine how literacy and learning will be assessed, tracked, scored, and analyzed, (5) develop assessment tracking tools, (6) implement (testing, instruction, learning), (7) analyze results, and (8) modify the course and fine-tune the process.
  • Presenting results of computer literacy assessment collected during the Fall 2009 semester to Fitchburg State College faculty during “Looking for the Body of Evidence: A Data Source Colloquium,” the Third Annual Winter Assessment Day, January 19, 2010. This process is ongoing, and business student computer literacy pre-/post-testing results are currently being collected for the Spring 2010 semester.
  • Conducting the equivalent of an ETS “major field test,” an effort to be expanded upon next year.
  • Sending Business faculty representatives to the NEean (New England Educational Assessment Network) Fall Forum 2009. The Forum focus was Integrating Classroom, Program and Institutional Assessment.

Communications Media: This year we continued the practice of conducting portfolio defenses of all students who participated in the internship program. The internship is the last course the student takes in the curriculum, and it is a requirement for graduation. While at the internship site, students work 37.5 hours per week at a job that is related to their concentration of study. At the completion of the internship, the internship site supervisor completes an assessment of the student’s contribution and preparedness as a professional. In addition to the internship, students can submit work to Visions, a juried exhibition and presentation of the best student work, which includes papers, graphic design, interactive websites, videos and films, and photography. We also studied course loading and sequencing in response to information we received from the Graduating Student Survey.

Computer Science: Over the last year, we have collected data about our CSSO and CISSO outcomes through component grades taken from course work. We have improved our coverage of outcomes by assigning assessments in a variety of formats (including tests, essays, projects, and surveys) to those outcomes. We developed a standard reporting form for faculty to use, which includes reflection on assessment results and actions taken as a result of the assessments.

Criminal Justice: In the fall we used Tk20 to conduct assessment. We used our rubric to assess papers from the Senior Colloquium and from Data Analysis. After spring grades are completed, we will do the same with spring papers. We generated and tabulated a large amount of data about program completers in order to meet the requirements of the state Board of Higher Education.

Economics: The central focus of the economics program in the 2009-2010 academic year was to smooth and evaluate the transition in the international business and economics concentration, whereby new entrants into the program would be required to major in economics and minor in business administration for the first time. While the revamping of the concentration held out the possibility of a major, even dramatic, increase in the number of economics majors, there was also the concern that many students might be dissuaded from the concentration because of the greater emphasis upon economics. Hence, the three faculty members teaching economics courses engaged in a continuing discussion of the efficacy of the transition. This entailed keeping track of the number of students in the concentration; making it a priority in the advising of students, both those ‘grandfathered’ in the business major and those new to the concentration; and encouraging more active engagement of students in considering the merits of majoring in economics.

Education: (A) We conducted all our assessments as prescribed in the Education Unit Assessment System. (B) We analyzed the data from the 2008-2009 instruments, discussed the results, and developed strategies to address areas of concern. (C) We met with three different constituent groups (teacher candidates, supervising practitioners, and principals and superintendents) to obtain feedback about aspects of our programs.

English: During the 2009-2010 Academic Year, the English Department’s Assessment Committee completed its department-wide assessment of all senior portfolios received in the 2008-2009 AY (Winter and Spring graduates). We assessed all 27 of the portfolios with particular attention to one of our department-wide objectives—the ability to distinguish between and analyze literary genres—and continued to use our three-point scale to grade students’ abilities.

Exercise and Sports Science: The majority of the EXSS department’s assessment efforts over the past academic year have gone into reorganizing our approach. While we had made great strides with assessment in the first couple of years, we were running into various issues of practicality and were also concerned that we were making the assessments unnecessarily cumbersome. The NEean conference in November offered sessions on curriculum mapping and the opportunity to speak with the presenters and obtain some additional guidance and materials. Curriculum mapping seemed an important step in the process that we had been missing. The idea is that we check all of our courses to see which cover our various program goals, and also the level of knowledge/competency we would expect at each course. As part of the curriculum map, we defined three levels of knowledge/competency: basic knowledge and skills (B), working knowledge and skills (W), and demonstrated competence (C). This way we can track whether the curriculum is doing a good job introducing the program goals in the introductory level courses and later emphasizing and reinforcing them in the upper level courses.

Geo/Physical Sciences: Our assessment activity this year was a ground-up review of student learning outcomes, producing outcomes that reflect both the most current thinking about content and methods and the unique configuration of our department. We not only completed this task but also had the outcomes approved and published on the departmental web site.

History: Our program has had exit assessments in place for several years. We draw data from using the HIST 4500 Research Paper Rubric (which corresponds to program objectives) and the HIST 4500 Exit Survey. Our intended student benchmark is that at least 85% of students completing the HIST 4500 Senior Seminar research paper will perform at an acceptable (3) or exemplary (4) level on each of the nine outcomes included in the research paper rubric and that at least 85% of respondents on the HIST 4500 Exit Survey will respond with “Strongly Agree” (4) or “Agree” (3) to General Impression About the History Major and The Goals of the History Major.
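The History benchmark is a per-outcome percentage check: the share of students scoring at the passing levels must reach 85%. A minimal sketch, using hypothetical scores and a hypothetical function name rather than the department's actual tooling:

```python
def meets_benchmark(scores, passing=(3, 4), threshold=0.85):
    """Return True if at least `threshold` of students scored at an
    acceptable (3) or exemplary (4) level on a given rubric outcome.
    The same check works for the exit survey, where Agree = 3 and
    Strongly Agree = 4."""
    return sum(1 for s in scores if s in passing) / len(scores) >= threshold

# Hypothetical rubric scores on one of the nine outcomes (20 students):
scores = [4, 3, 3, 4, 2, 3, 4, 3, 3, 4, 3, 3, 4, 4, 3, 3, 2, 4, 3, 3]
print(meets_benchmark(scores))  # 18/20 = 90% at level 3 or 4 -> True
```

In practice the check would be run once per rubric outcome, with the program meeting its benchmark only if all nine outcomes pass.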

Human Services: Human Services collects assessment data according to the requirements of our program accreditor, the Council for Standards in Human Service Education.

Industrial Technology: This year has been somewhat chaotic, and unfortunately assessment did not receive the attention we had planned. We had to adjust to an unplanned retirement that took place in January. The department has also been investigating accreditation agencies for future application. The differences between the agencies being considered have caused some delay in determining the standards we wish to adopt for the department and the individual concentrations. However, the department has made some progress this year. We developed and implemented an advisory committee structure that includes individual advisory committees for each concentration and an advisory board for the department.

Leadership Academy Honors Program: The Leadership Academy conducted an assessment of our students’ Thesis projects. These projects were assessed on the following standards: quality of sources, quality of research, quality of written communication, quality of oral communication, initiative, and creativity.

Mathematics: We set ourselves two goals for the 09/10 academic year: collecting and assessing student work that is aligned with the department's technology goal, and continuing the process of developing new rubrics and skills lists (assessment tools needed for the implementation of the Department's assessment plan). (1) Assessing the Technology Goal: Over the year we have collected and assessed almost 40 pieces of student work using the rubric we developed in AY08/09. Most of this work was associated with lower division courses (2000 level) and establishes a good baseline on how our students are using technology early in their mathematical careers. Furthermore, some members of the committee were awarded mini-grants to present to the campus as a whole the results of this year's assessment at the Spring (and now Fall 2010) Assessment Day. Toward the goal of introducing students to the technology we will use to collect, assess, and store data, we were successful in creating handouts that walked students through the process of submitting assignments in TK20, so that students could work with the software even if the faculty member teaching the course was not using it. Our response rate from students was 76% (the complementary non-response rate of 24% includes those students who did not submit as well as those who did not complete the assignment). (2) Creating New Assessment Tools: Over the year the committee developed a rubric for student presentations, covering oral presentations as well as poster and PowerPoint presentations (see attached). This rubric has been approved by the committee and is now available in Tk20. We also began development of skills checklists for Calculus 1, Calculus 2, and Abstract Algebra; however, we are still considering an effective way to implement these checklists as assessment tools.

Nursing: Nursing continues to collect a great deal of assessment data via surveys administered in Tk20. Other information about student performance is collected in accordance with the guidance of the external accreditor. The Tk20 program coordinator, Carla McGrath, attended the Tk20 training in Austin and has provided information about additional capabilities of the program which may be implemented in the future. We ran pilot tests of an e-portfolio for RN-to-BSN students and of a writing portfolio.

Political Science: The main assessment vehicle used by the Political Science program is a reflective portfolio completed as part of our senior seminar course. As all majors must complete this requirement prior to graduation, and as a senior seminar it comes at the end of the student’s undergraduate career, the portfolio allows students to reflect on the whole of their undergraduate work. The Political Science faculty have set four criteria that all graduates should be able to demonstrate mastery of. In the portfolio, students must reflect on the work they have done and provide evidence of meeting these criteria, which are as follows: (1) Political Reasoning, (2) Political Science Knowledge, (3) Political Science Methodology, and (4) Political Theory. In September of each year, we send a letter to all majors reminding them of the senior seminar requirements so they may plan ahead for what is to come. Ann Hogan, our TK20 contact on campus, visited the senior seminar group to provide detailed instruction on how to make use of this platform for their portfolios. Following the completion of the senior seminar course this year, the four Political Scientists each individually reviewed the portfolios and scored each student’s portfolio according to the rubrics we have established. We then met three times to discuss what we learned from the process and to plan to address issues that arose.

Psychology: The psychology faculty met in the summer of 2009 for a day-long meeting on assessment. At this meeting we identified our assessment objectives as well as several methods we could use to assess these objectives. These assessment methods included standardized tests and evaluations of work products.

Sociology: Assessment activities by members of the Sociology program during the 2009-2010 academic year include: (1) Professor Augustine Aryee presented on the topic of assessment during FSC’s Fall Assessment Day. (2) Professor Mazard Wallace travelled to Austin, TX to attend a TK20 Assessment Conference; and (3) members of the Sociology program continue to utilize TK20 assessment software to generate data on student performance.

2. What is the most important thing you learned from assessment in 2009‐10, and how does knowing it benefit your program(s)?

Biology and Chemistry: Specific results and summary scores for AY 2009‐2010 courses will not be available until after the Spring 2010 semester. However, in general terms, we came to recognize that our standard approach to evaluating student learning with lecture and laboratory exams and quizzes in General Biology I and II is a rich source of data on outcomes. This includes demonstrating conceptual understanding, critical thinking, and practical knowledge. We therefore are encouraged to believe that the same will be true of more advanced courses in our core curriculum.

Business Administration: Benefits of assessment activities included:

  • Taking great opportunities to evaluate both core course and departmental learning outcomes. During the analysis we learned that a number of the learning outcomes needed refinement in order to state student expectations more clearly and to make the learning outcomes easier to assess. Clearly defined, actionable learning outcomes allow for the communication of clear expectations to students taking core courses and completing their business degrees.
  • Taking opportunities to review each business core course to determine where business department learning outcomes can be assessed, as well as the data that can be used to demonstrate the extent to which outcomes are met, and success thresholds. During the review we learned it was necessary to make some modifications to student deliverables in order to fully assess the appropriate course learning outcomes to benefit student learning. We also reached general agreement on success thresholds.
  • After analysis of Fall 2009 computer literacy pre-/post-testing results, making modifications to teaching methods for a few specific concepts that were not well comprehended by students. By conducting this type of assessment and making necessary adjustments per the results, the Business Administration Department can ultimately improve the manner in which undergraduates are taught the skills, concepts, and capabilities that comprise the defined learning outcomes.

Communications Media: Portfolio presentations were generally of a high caliber. Intern evaluations indicate that for the most part, students in all concentrations can transfer their skills to the workplace. As far as loading and sequencing goes, we have taken some steps to change advising.

Computer Science: Students do not do as well on assignments that rely on materials which they must read as they do on assignments based on lecture notes.

Criminal Justice: The most important thing we learned is that assessment with rubrics (once the rubric has been created and the process has been set up) is fairly quick and painless to implement.