GUIDING QUESTIONS:

ANNUAL SLO ASSESSMENT REPORT

Office of University Assessment

University of Kentucky

* Please note the University is moving to a new reporting system as of April 2017. Only one student learning outcome and method type can be submitted per report. Please consider this as you complete your annual reports.

  1. Student Learning Outcome (SLO)

State the Student Learning Outcome (SLO). It should be clear, measurable, and directly related to student learning. It should relate to students' demonstration of knowledge, skills, and abilities through work such as papers, projects, or presentations. It should not be related to operational objectives, such as graduation/retention rates or GPAs.

  2. Method Type: (select only one)

This document is approved for One Cycle (three reporting years), August 2015 through October 2018

Direct Artifact

Direct Exam

Direct Portfolio

Direct Other

Indirect Survey

Indirect Focus Group

Indirect Interviews

Indirect Other


  3. Rationale for use of assessment tool and how tool aligns to the Student Learning Outcome

Provide a clear description of the assessment tool/activity/method that was used for this assessment cycle. If there is more than one tool/activity/method, describe each one. If a licensing or certification exam is used, provide a rationale for why this exam was selected and how it aligns to the student learning outcome.

Explain why the assessment tool/activity/method is appropriate for measuring student learning for the stated outcome.

Did you use any other methods to ensure the validity and reliability of your results and/or findings (e.g., multiple data sources, validation of the tool)?

  4. Target/Benchmark/Goal

Provide the benchmark/target/goal for the assessed student learning outcome. Be specific and explain how the benchmark/target/goal was determined.

  5. Data Collection (includes time/semester and place, sampling process, population description, and data review process)

Provide a complete explanation of each data collection process and protocol so the reviewer fully understands the data collection methodology.

Did you use any processes to ensure the quality of the data (e.g., two or more reviewers, a different secondary validation method, or Cronbach's alpha)?

If you used a rubric or scoring guide, is it appended to this report?

  6. Results

Please present your assessment results below in a summary format only. We encourage charts and graphs; however, they will need to be submitted as an attachment.

Results should be specific and disaggregated in a visual representation (charts and graphs) that is easily understood by an external reviewer. For example, if a rubric was used to assess the student work, break down the results by each achievement category and performance criterion. If a licensing or certification exam is used, disaggregate the results — for example, by demographics, content areas, or sections. Pass rates should not be the only results provided.

  7. Interpretation and Reflection of Results

Interpretation of Results

Which people/committees/groups participated in the interpretation of the results? How were these results communicated to faculty and/or stakeholders?

Please explain the results, including:

a. Your program’s level of satisfaction with the results

b. An explanation of how the past/current curriculum/co-curriculum might have impacted the results

What are the limitations of this assessment research and/or findings?

Were multiple years of data used to help interpret the results? If so, are there any trends, consistencies, or inconsistencies? If so, please report them.

How do the reported results and/or findings affect your improvement actions?

Reflection of Results

Reflect on your assessment process and results. Do you think these results are valid and/or reliable?

Are the results sufficient to make informed decisions to improve student learning? Why or why not?

Do you plan to make changes to your assessment or data collection process(es)?

  8. Actions Intended for the Improvement of Student Learning

Provide a discussion of your intended improvement actions that focus specifically on student learning. Explain why or how the improvement action is expected to positively affect the learning outcome.

Discuss any causation or associated details identified in your assessment activities (e.g., approximate dates of and person(s) responsible for implementation, and where in curriculum/activities and department/program they will occur; how results and intended improvement actions will impact SLO).

If applicable, provide a discussion of any empirical or research-based evidence that supports your intended improvement actions.

  9. Target/Benchmark/Goal Achievement

Did you meet your anticipated target/benchmark/goal? (select only one)

Exceeded

Met

Not Met

  10. Additional Insights or Reflection [This section is not scored]

Are there any insights you would share regarding your assessment efforts?

If you have additional notes regarding your assessment efforts that should be considered in future reflections of this work, please include them below.

Is there any other work being done in the program that may not be directly related to the learning outcome that you would like to share? If so, please provide that information below.
