Johns Hopkins University School of Education

Assessment Retreat Part I

May 20, 2014, 10 a.m. – 4 p.m.

Education Building Room 114

Attendees: Linda Adamson, Mary Ellen Beaty-O’Ferrall, Ira Blatstein, Anne Cash, Laurie DeBettencourt, Joan DeSimone, Rhodri Evans, Ileana Gonzalez, Annette Henderson, Cheryl Holcomb-McCoy, Phyllis McDonald, Tamara Marder, Sharon Morris, Amy Nicholas, Laura Owen, Carolyn Parker, Eric Rice, Margaret Shamer, Bill Sowders, Julianne Taylor, Toni Ungaretti

Welcome and introductions – Plan for meeting

1.  Review School’s Assessment System

2.  Overview of Tk20 Data Management System

3.  Plan for CAEP preparation and program data requirements

Also, Assessment Retreats will be held in fall rather than spring, starting October 2014, to provide an opportunity to review spring data and plan.

I.  Identify Unit-Wide Assessments – Review of handout “Assessment Transition Points and Evidence”

o  Illustrates assessments at Admissions, Mid-program, Program Completion, Post-graduation (2 years out), and Post-graduation (5 years out).

o  Program impact is measured post-graduation to respond to accreditation requirements.

o  Task is to finalize assessments that will be common to all programs across the School of Education.

o  Each program will use the school-wide assessments in addition to the assessments that are specific to their own programs. These will be included in the CAEP report.

A.  Admissions Assessments:

Personal Essay – Should there be a set of criteria for evaluating the essays that all programs use?

Student Interview and Disposition Survey

o  The significance is to provide a measure that ensures dispositions are aligned with the School of Education’s Conceptual Framework and that serves as a common assessment.

o  Expectations for applicant responses differ in each program:

-  Does the survey reflect an applicant at the entry level of their program, or something else?

-  Some students respond in ways that they understand to be “acceptable.”

-  International students may not understand the cultural context needed to respond.

-  In some programs, the interview process has little bearing on whether to admit and does not serve as a “baseline” measure.

-  Baseline measures are sometimes established through initial coursework and demonstrated progress toward competencies and dispositions.

o  The survey will be interpreted and scored differently across programs, but having a common assessment for which data are collected is important to provide evidence that programs across the school align with the Conceptual Framework.

o  There needs to be consistency in scoring and in the labeling for each level of scoring.

o  It is important to treat the disposition survey questions as applicable to the school as a whole and to collect data across programs, while being able to disaggregate the data to evaluate individual programs more closely.

Next Steps – Admissions

1)  Touch base with admissions office to build in the 5-question disposition survey as a requirement for admission, in addition to essay.

2)  Responses to questions can be reviewed and validated during the admissions process/interview.

3)  Scoring of survey included in scoring of application.

4)  Disposition questions would be followed up if there were concerns in the interview.

5)  Revise school-wide admission assessments to include an “audition” for Education Preparation programs.

B.  End of Program Assessments

1)  The common end-of-program assessment includes an exit survey that captures conceptual framework data. A survey capturing conceptual framework data is also administered 2 years and 5 years post-graduation.

2)  The second common assessment is the culminating project (e.g., e-portfolio, comprehensive exams, capstones, internships, dissertations).

3)  Licensure exam: Praxis, CPEC exam, Adm. Supervision, MLLC exam

4)  Each program needs two assessments, and the exit survey would be the third that would enable triangulation of data.

C.  Mid-Point Assessments

1)  Faculty leads suggested advisor “check-in” at mid-point to assess students’ progress/performance toward degree completion.

2)  The process could be a form letter, or a more formal process depending on program’s specific needs or requirements.

Next Steps - Assessment Transition Points

Each program will create a program-specific map that identifies the points where assessments are taken, using the transition assessment chart as a guide.

1)  All assessments must be relevant, specific, and align to standards and/or program goals.

2)  Assessments cannot be so general that they measure for multiple standards or program goals.

3)  Identify changes made in your program this past academic year or planned for the summer.

4)  Demonstrate that changes are deliberate, reasoned, and based on indicators, e.g., data, stakeholder feedback, student performance, industry trends, policy changes, etc.

5)  These data are needed from CAEP programs by June in order to prepare the CAEP report. Forward to Toni Ungaretti via e-mail.

II.  Tk20 Workshop

The SOE has utilized only a small portion of Tk20’s data management capabilities. In the next few months, the SOE will implement a more comprehensive package that will streamline data collection across all academic programs. The advantages of full implementation include:

1)  uniform assessments and measures across programs that ensure alignment with the SOE’s Comprehensive Assessment Plan;

2)  greater access to and reliability of data;

3)  timely production of reports that respond to specific accreditation standards;

4)  support can be made available by granting access to support staff or other faculty for specific tasks;

5)  convenience of centralized data collection that makes compilation of data from varied sources unnecessary.

·  Features of expanded Tk20

o  Although Tk20 is not a learning management system (LMS), all assignments, artifacts, quizzes, etc. associated with courses can be loaded into Tk20 Course Binders.

o  Standards associated with courses can be linked and visible as specific assignments are evaluated.

o  Student assignments can be graded and commented on.

o  Access to various aspects of courses, surveys, portfolios, etc. can be granted to a wide range of supporting individuals by granting a “role” to complete an assigned task.

o  Students will “own” their Tk20 account and assume a more active role.

·  Getting started with Tk20

o  The log-in portal for faculty/instructor contains specific instructions and helpful links.

o  Each instructor has a dashboard that will alert them to pending tasks and communications received.

o  Tk20 terminology is somewhat different from the terminology used at JHU SOE. A glossary of terms was provided but is also available in the Tk20 Users Manual.

o  The Help Link on the Tk20 log-in page is comprehensive. It provides how-to videos and “cheat-sheets” to guide new users.

o  Several Tk20 training sessions, conducted by Tk20 training professionals, are being planned for SOE faculty and staff in late summer.

o  In addition, there will be weekly training/help sessions for faculty, students and staff beginning in the Fall.

o  Juli/Maggie will work with instructors to assist with course set-up.

·  Training Site: https://training5.tk20.com

o  The training site will be available until June 3, 2014. All are encouraged to use the provided IDs and passwords to log in as instructors and as students, to view how a course site is organized and how it will be viewed by others with differing Tk20 roles.

o  Tk20 is tab oriented – use the tab at the top of the page to navigate.

o  Important to remember that it is virtually impossible to crash the system.

Final Comments:

-  Instructors were asked to provide a final version in a form layout showing how the assessment should appear when opened in Tk20. An example was provided as a handout. This is an important and time-saving requirement.

-  Tk20 offers the best integration of data, though it sometimes sacrifices some better components of homegrown systems. However, those systems required data compilation from various sources, which was time-consuming and error-prone.

-  Already, Tk20 has enabled faster assembly of data. Previously, data compilation took several months, but for today’s meeting data was assembled within one week.

-  Student ownership of personal accounts makes data management in Tk20 very cost-effective for the School of Education. Subscriptions follow students even when they leave, but SOE owns the specific data.

-  Transition to Tk20 will require ongoing data loads and training over the next few years. However, the availability of complete data to work with will make the process much less laborious, more interesting, and even fun.

Alumni survey – Will be sent to all participants electronically.

-  Response was poor due to the survey’s length and due to required responses that did not direct survey-takers back to questions they had missed.

-  Results indicate:

o  Satisfaction overall with program, program delivery, content and advising

o  Degrees are relevant to current employment

o  Responses to the knowledge base (content expert/reflective practitioner) question are very skewed. The reason may relate to specific requirements that are not related to program content but are required in the employment situation.

o  Contribution to professional disposition – may need to be captured in a different way

Preliminary Data Sets – Send e-mail with Program Improvement Plan and key questions to be answered.

-  Fall 2013 and Spring 2014 course and portfolio data were sent to program leads, advising them to review the data in preparation for the Assessment Retreat.

-  Review the data and determine how they inform decisions for your program improvement plan, i.e., strengths, needs, and recommendations for change.

-  Needs to be included in the CAEP report – requested by July 1, 2014.