Journalism and Media Studies

Assessment Report for Spring Semester 2008

Submitted February 2009

Semester Assessment Report Form: Spring 2008 Data

DUE October 31, 2008

Directions: Please complete a form for each of the programs within your department. This form was designed to provide a format for assessment reporting and should not be used to limit the amount of information provided. Each box that is attached to each of the sections is designed to adjust to varying lengths. If you have any questions, please contact Dr. Bea Babbitt at x51506 or via email at: .

***Please submit the report electronically to

1. Program Information:

Program / BA, MA
Department / Journalism and Media Studies
College / Urban Affairs
Program Assessment Coordinator
Semester Data Collected / Spring 2008
Report Submitted by
Phone/email
Date Submitted / February 17, 2009

As a relatively new department that has been reviewing its assessment activities with an eye to external accreditation through our major professional accrediting body (ACEJMC), as well as to UNLV and regional accreditation requirements, the School of Journalism and Media Studies has designed a multi-pronged approach that, when fully implemented, will include: periodic knowledge surveys of undergraduate and graduate cohorts at key points in the curriculum (illustrated by pilot results presented below for the graduate program); qualitative portfolio and internship evaluations designed to assess professional accomplishments near the point of graduation; a survey of graduating seniors using questions consistent with our stated goals for the undergraduate program (to be conducted by the College of Urban Affairs); and an alumni survey.

During spring semester 2008 our assessment activities took three forms: (1) design and implementation of an initial portfolio evaluation process in the internship class (required of all undergraduate students prior to graduation); (2) redesign of our assessment survey for graduating seniors to bring it into line with our stated and previously adopted curricular goals at the undergraduate level; and (3) an intensive review of knowledge gains in the area of methods by first year MA students to complement the previous semester’s study of knowledge gains in the area of theory.

Due in part to our fall 2008 move into the new Greenspun Hall, the portfolio data have yet to be assembled in summary form; however, two semesters’ worth of individual portfolio results have been furnished to the internship coordinator for her use in improving the program. We have not yet received new exit survey data on graduating seniors from the College, although we understand this system is now in place. Therefore, the report for this period concentrates on the graduate program results.

Here, we present the data from our first-year cohort of MA students enrolled in one or more of the required graduate research methods classes at the beginning and end of the semester. We also present data obtained for a small second-year cohort enrolled in an elective special topics research course for comparison. Averaging across 44 terms for which familiarity was measured on a five-point scale, where 1 is "not familiar, do not know meaning," 3 is "recognize term, not certain of meaning/use," and 5 is "very familiar term, understand how it is used/applied," we obtained the following results:

Mean Familiarity (Self-Reports on 44 Items)

First-year students in methods courses, Feb 2008 (N=7): 3.62
First-year students in methods courses, May 2008 (N=8): 4.13
Second-year students in special topics course, Feb 2008 (comparison; N=5): 4.02

While this is only a pilot test of a partial instrument, we are pleased to note the 14% gain in overall familiarity among our first-year students, especially considering the preliminary version of the instrument used, from which some items will undoubtedly be eliminated in the future. The comparison group of second-year students enrolled in a different course also helps establish that our first-year cohort generally ended up around the same point as, or just slightly ahead of, their peers from the year before. This is especially important information given the relative newness of our MA program.
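For transparency, the reported gain can be checked directly from the cohort means above; the sketch below assumes the gain is computed as a simple percent change between the February and May means (the report does not state the formula explicitly):

```python
# Check the reported ~14% gain in mean familiarity (pilot data above).
feb_mean = 3.62  # first-year cohort, Feb 2008 (N=7)
may_mean = 4.13  # first-year cohort, May 2008 (N=8)

# Percent change relative to the starting (February) mean
gain_pct = (may_mean - feb_mean) / feb_mean * 100
print(f"Gain: {gain_pct:.1f}%")  # ≈ 14.1%, consistent with the 14% reported
```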

The greatest gains for first-year students over the semester were shown for items 5 (cluster analysis), 6 (dependent variable), 7 (confidence interval), 12 (descriptive statistics), 28 (snowball sample), and 39 (nonparametric statistics). The lowest absolute scores at the end of the semester for these same students (below 3.25) were on items 8 (interaction effect), 24 (key informant), and 32 (regression analysis), terms which may be stressed in more advanced courses. We present these results primarily to illustrate the value of the approach for purposes of assessment and improvement as we continue to refine it; however, they are also being shared with the faculty who may use them to adjust course content.

The second-year students (in Feb) were ahead of the first-year students at semester end (in May) by an average of 0.5 points or more on only two items: 17 (convenience sample) and 37 (depth interview). The first-year students were ahead by an average of 0.5 points or more on four items: 18 (sampling frame), 23 (discourse analysis), 26 (inductive), and 39 (nonparametric statistics). These results are intriguing, although based on extremely small n’s and a non-random group of second-year students enrolled in a single class.

We also continue to pilot assessment instruments in this same format for additional areas of study (at present, during 2008-2009, for history, law and ethics at the undergraduate level). We are in the process (as of spring 2009) of identifying appropriate points in the undergraduate curriculum at which a combined instrument can be administered that will assess overall progress along multiple dimensions as students move through our program, and of considering a similar process for routine use in the graduate program, which has different goals.

Finally, we also plan to implement a post-graduation alumni survey as part of our overall assessment effort; however, at this time we do not have access to contact information for an appropriate cohort of alumni due to the recent division of communication studies into two departments. (In other words, at the present time we cannot readily identify journalism and media studies alumni from among other communication students from prior years.) We continue to work on this problem.

Once all of these elements are in place, our assessment plan will be fully implemented and generating data on a routine basis.