Spring 2016-Summer 2016-Fall 2016

Student Learning Outcomes (SLO) Assessment Plan and Report

(Document student learning outcomes assessment plans and assessment data for each undergraduate and graduate degree program, certificate program, stand-alone minor, and distance education program offered online only.)

College: ___College of Education______

Department: _Reading and Elementary Education______

Name of Degree or Certificate Program/Stand-Alone Minor/Online Distance Education Program: M.Ed. in Elementary Education

Reflection on the Continuous Improvement of Student Learning
1. List the changes and improvements your program planned to implement as a result of last year’s student learning
outcomes assessment data.
2. Were all of the changes implemented? If not, please explain.
3. What impact did the changes have on student learning?
Each program should refer back to the previous year’s SLO report. For each SLO, there is a box that asks “Changes to be implemented in 2016.” Report on those changes. Note: the changes do not necessarily have to be complete; they may still be in progress or ongoing. But some progress should be noted.
OUR GOAL FOR ALL PROGRAMS IS TO HAVE SOMETHING TO REPORT ON HERE EACH YEAR. WE MUST BEGIN TO DO A BETTER JOB OF SHARING WITH ACADEMIC AFFAIRS ALL THE IMPROVEMENTS WE ARE MAKING AT THE PROGRAM LEVEL. WE ARE DOING THIS GOOD WORK! LET’S GIVE OURSELVES CREDIT FOR IT. 
Some suggestions for advanced programs (not everyone will have this – many will have something different):
  • Many programs completed the content validity review for all rubrics, making the data collected more meaningful.
  • Many programs added a new data source for SLO 4.
  • Some programs made changes to courses or procedures based on SLO data sources.
[Last year’s (2015-16) reports are available on the S drive at S: > coed shared > Assessment > SACS Reports and Program Review Spring 2016 (reporting on 2015 data) > (find your dept/program).]
Student Learning Outcome 1
(knowledge, skill or ability to be assessed)
SLO 1 (Advanced): Advanced program candidates are able to demonstrate and apply content knowledge and skills specific to their content area or discipline.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
[If changes have been made, please briefly describe them and inform the COED Assessment Office immediately. Otherwise, use the sentence below.]
No changes were made to the SLO Assessment plan for SLO 1 since last year’s report.
[Wording Suggestion if you have a new/revised rubric WITH DATA COLLECTED FROM FALL 2016: Revisions were made to the data source used for SLO 1. The rubric for this project was completely revised based on faculty input and content validity results. As this revised rubric was implemented in fall 2016, both the old and new data sources will be provided in this report.]
If you have a new rubric but no data were collected in Fall 2016, just report on your “old” data. You will report the “new” data next year.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc., that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
[Each program should complete this box, describing how you are collecting the data for SLO 1. Be sure to answer all pieces of the prompt above. Feel free to begin with last year’s report and edit as needed.]
ELED MEd only--To assess SLO 1, the Teacher Inquiry and Data Analysis Project (TIP) will be used. The TIP is a two-part project that collectively is designed to be the capstone experience of this program. The project will be completed in ELED 6203 Instructional Differentiation for 21st Century Learners and ELED 6303 Teacher Inquiry and Data Analysis in the Elementary Classroom, but it will require candidates to synthesize and apply knowledge from other courses (e.g., ELED 6202 Classroom Management and Leadership for Diverse Learners) taken earlier in the program. Through the Teacher Inquiry and Data Analysis Project, the candidate will show advanced content area knowledge appropriate for master’s-level teacher candidates by developing content-pedagogy strategies specific to the instructional needs of the diverse learners they teach, based on the contextual factors of the classroom and the school. Instruction will be designed to reflect cultural needs, learning differences, behavioral differences, and diversity in learning styles. Instruction will reflect expertise in understanding and instructing using higher order thinking skills (revised Bloom’s taxonomy), effective intervention strategies, and teaching to multiple intelligences. For these reasons, the specific indicators on the TIP align with the revised SLO 1.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
[Each program should complete this box, describing how you are assessing SLO 1. Be sure to answer all pieces of the prompt above.]
ELED MEd only--The TIP Part A (EE 2A) is evaluated in [list course # and title] using a rubric developed by program faculty. There are eight major areas assessed on the TIP Part A (also known as EE2A). Six of these areas are used to measure candidate content and curriculum expertise: 1) 3A: Using Content Knowledge; 2) 3C: Identifying Key Concepts; 3) 3D: Identifying Essential Understandings; 4) 3E: Developing Higher Order Lesson Plans; 5) 3F: Using Appropriate Differentiations; and 6) 3H: Reflecting on Practice. These areas collectively represent the advanced content knowledge and pedagogical skills that candidates are required to demonstrate before being recommended for master’s-level licensure at the end of their programs. The rubric is on a 3-point scale: 1=Not Met, 2=Proficient, 3=Accomplished. The rubric is completed by the course instructor.
For the TIP Project, scores are collected using the College’s electronic data management system, Taskstream. Scores are provided to program faculty biannually by the COED Office of Assessment and Accreditation. Simple descriptive statistics are used to analyze the scores, and disaggregated findings are reported by term at the college and program levels. All data reports created by the College of Education are housed on a secure website accessible to faculty within the College of Education. The data are discussed during REEL Leadership Council meetings as well as in the department’s faculty meeting at least once per year. In these meetings, next steps are determined to address any identified needs. All strategies determined during this closing-the-loop discussion are implemented during the next academic year. These meetings are documented by program directors and department chairs and revisited at each subsequent meeting to monitor implementation progress.
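To make the descriptive-statistics step concrete, below is a minimal sketch of how the score distributions and percent-proficient figures could be computed in Python with pandas. It assumes a hypothetical CSV export from Taskstream with one row per candidate per rubric indicator; the file name and the term, delivery, indicator, and score column names are illustrative placeholders, not the actual Taskstream export format.

```python
# Minimal sketch: summarize TIP rubric scores, assuming a hypothetical
# Taskstream export with columns: term, delivery (TRAD/DE), indicator, score (1-3).
import pandas as pd

scores = pd.read_csv("tip_scores.csv")  # hypothetical export file

# Score distribution (% of candidates at each rubric level),
# disaggregated by term and delivery mode for each indicator.
distribution = (
    scores.groupby(["term", "delivery", "indicator"])["score"]
    .value_counts(normalize=True)
    .mul(100)
    .round(2)
)

# Percent proficient: share of candidates scoring 2 (Proficient) or 3 (Accomplished).
proficient = (
    scores.assign(proficient=scores["score"] >= 2)
    .groupby(["term", "delivery", "indicator"])["proficient"]
    .mean()
    .mul(100)
    .round(2)
)

print(distribution)
print(proficient)
```

Grouping by term and delivery mode mirrors the disaggregated, by-term reporting described above.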
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects [give percentage – recommend 80% or higher] of its candidates to obtain a score of 2 (“Proficient”) or better on the Instructional Differentiation rubric indicators listed above.
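For illustration with a hypothetical cohort: if six candidates are assessed in a term and five of them score 2 or better on an indicator, the proficiency rate is 5/6 ≈ 83.3%, which would meet an 80% performance outcome; four of six (≈ 66.7%) would not.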

Assessment Data:

Spring 2016-Summer 2016-Fall 2016 Assessment Data

[Directions: READ ALL DIRECTIONS CAREFULLY! – then delete these directions from final report]

  1. Some programs may have a different data source in the fall versus the spring, or from 2015 to 2016. That’s okay! Just be sure to have a data chart for the “old” data and a data chart for any “new” data. Not a problem.

FOR EXAMPLE: you MAY have

Spring 2016 data – “old” data

[insert spring 2016 data chart]

Fall 2016 data – “new” / revised rubric source

[insert fall 2016 data chart]

If all your data is from the same source, just use a single chart for the entire year (as usual).

  2. PLEASE BE CAREFUL how you copy/paste data. The SLO template does not always align with our data template. The SLO report asks for data by term; we report it by assessment first, then by term. Also, I have tried to keep the reports shorter by combining multiple data sources into a single chart (i.e., Traditional and DE results, all on one chart). Note the Fall 2016 data is reported first in the data chart. I have added extra columns and then moved last year’s data over to the right in the chart. Pay close attention when you copy/reenter data from one spreadsheet to the next.
  3. Not all our data is reported in the SLO charts. Copy and paste carefully!!!!
  4. Data must be reported by “traditional” and “distance education” (if available).
  5. For some data sources, we have used averages in addition to percent proficient. This is to demonstrate some variability in the data and to identify potential areas of improvement.

See example for the ELED MEd below.]

Program: Elementary Education M.Ed.
Assessment: ELED TIP Part A (EE 2A)

| Indicator | Statistic | Fall 2016 (DE only) | Spring 2016 (DE only) | Fall 2015 (DE only) | Spring 2015 (TRAD) | Spring 2015 (DE) |
|---|---|---|---|---|---|---|
| Count | N | 6 | 8 | 6 | 1 | 4 |
| 3A: Using Content Knowledge | # of 1 score | 1 | 0 | 0 | 0 | 0 |
| | % of 1 score | 16.67% | 0.00% | 0.00% | 0.00% | 0.00% |
| | # of 2 score | 1 | 1 | 2 | 0 | 1 |
| | % of 2 score | 16.67% | 12.50% | 33.33% | 0.00% | 25.00% |
| | # of 3 score | 4 | 7 | 4 | 1 | 3 |
| | % of 3 score | 66.67% | 87.50% | 66.67% | 100.00% | 75.00% |
| 3C: Identifying Key Concepts | # of 1 score | 0 | 0 | 0 | 0 | 0 |
| | % of 1 score | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% |
| | # of 2 score | 1 | 0 | 0 | 0 | 1 |
| | % of 2 score | 16.67% | 0.00% | 0.00% | 0.00% | 25.00% |
| | # of 3 score | 5 | 8 | 6 | 1 | 3 |
| | % of 3 score | 83.33% | 100.00% | 100.00% | 100.00% | 75.00% |
| 3D: Identifying Essential Understandings | # of 1 score | 0 | 0 | 0 | 0 | 0 |
| | % of 1 score | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% |
| | # of 2 score | 1 | 0 | 3 | 0 | 0 |
| | % of 2 score | 16.67% | 0.00% | 50.00% | 0.00% | 0.00% |
| | # of 3 score | 5 | 8 | 3 | 1 | 4 |
| | % of 3 score | 83.33% | 100.00% | 50.00% | 100.00% | 100.00% |
| 3E: Developing Higher Order Lesson Plans | # of 1 score | 0 | 0 | 0 | 0 | 0 |
| | % of 1 score | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% |
| | # of 2 score | 2 | 2 | 1 | 0 | 1 |
| | % of 2 score | 33.33% | 25.00% | 16.67% | 0.00% | 25.00% |
| | # of 3 score | 4 | 6 | 5 | 1 | 3 |
| | % of 3 score | 66.67% | 75.00% | 83.33% | 100.00% | 75.00% |
| 3F: Using Appropriate Differentiations | # of 1 score | 2.83 | 2.88 | 0 | 0 | 0 |
| | % of 1 score | 94.44% | 95.83% | 0.00% | 0.00% | 0.00% |
| | # of 2 score | 0 | 0 | 1 | 0 | 0 |
| | % of 2 score | 0.00% | 0.00% | 16.67% | 0.00% | 0.00% |
| | # of 3 score | 1 | 1 | 5 | 1 | 4 |
| | % of 3 score | 16.67% | 12.50% | 83.33% | 100.00% | 100.00% |
| 3H: Reflecting on Practice | # of 1 score | 3.00 | 2.88 | 0 | 0 | 0 |
| | % of 1 score | 100.00% | 95.83% | 0.00% | 0.00% | 0.00% |
| | # of 2 score | 0 | 0 | 3 | 0 | 1 |
| | % of 2 score | 0.00% | 0.00% | 50.00% | 0.00% | 25.00% |
| | # of 3 score | 0 | 1 | 3 | 1 | 3 |
| | % of 3 score | 0.00% | 12.50% | 50.00% | 100.00% | 75.00% |
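Reading the chart: on indicator 3A in Fall 2016, for example, one of the six candidates scored 2 and four scored 3, so five of six (83.33%) were Proficient or better, which would satisfy an 80% performance outcome on that indicator.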

Note: if you have two data sets because of the rubric change (i.e., one for “old” data and one for “new” from fall 2016), add the second chart here. This will only occur if you collected Fall 2016 data on the new rubric.

Changes to be implemented Fall 2017: Based upon the 2016 assessment data included in this annual report, what changes/improvements will the program implement during the next academic year to improve performance on this student learning outcome?
[This will be specific to each program. We should all have some discussions from our data meetings in 2016-17. Please complete this section as appropriate. Refer to the goals developed in your data meetings if related to these data.]
ELED MEd needs to finish this section for this report.
Student Learning Outcome 2
(knowledge, skill or ability to be assessed)
SLO 2 (Advanced): Advanced program candidates use domain-specific research and evidence to demonstrate leadership in developing high-quality learning environments.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
[If changes have been made, please briefly describe them and inform the COED Assessment Office immediately. Otherwise, use the sentence below.]
No changes were made to the SLO Assessment plan for SLO 2 since last year’s report.
[Wording Suggestion if you have a new/revised rubric WITH DATA COLLECTED FROM FALL 2016: Revisions were made to the data source used for SLO 2. The rubric for this project was completely revised based on faculty input and content validity results. As this revised rubric was implemented in fall 2016, both the old and new data sources will be provided in this report.]
If you have a new rubric but no data were collected in Fall 2016, just report on your “old” data. You will report the “new” data next year.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc., that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
[Each program should complete this box, describing how you are collecting the data for SLO 2. Be sure to answer all pieces of the prompt above. Feel free to begin with last year’s report and edit as needed.]
For ELED MEd only--To assess SLO 2, the Teacher Inquiry and Data Analysis Project (TIP) will be used. The TIP is a two-part project that collectively is designed to be the capstone experience of this program. The project will be completed in ELED 6203 Instructional Differentiation for 21st Century Learners and ELED 6303 Teacher Inquiry and Data Analysis in the Elementary Classroom, but it will require candidates to synthesize and apply knowledge from other courses (e.g., ELED 6202 Classroom Management and Leadership for Diverse Learners) taken earlier in the program. In Part B of the project, candidates will demonstrate their impact on student learning through pre/post-test and formative assessment data. Candidates collect these data within their own classrooms and then share these findings with colleagues. For these reasons, specific indicators on the TIP align with the revised SLO 2.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
[Each program should complete this box, describing how you are assessing SLO 2. Be sure to answer all pieces of the prompt above.]
ELED MEd only--The TIP Part A (EE 2A) is evaluated in [list course # and title] using a rubric developed by faculty of the program. The TIP Part B (EE 2B) is evaluated in [list course # and title]. There are eight major areas assessed on the TIP Part A (also known as EE2A). Two of these are used to assess the candidate’s ability to facilitate learning through evidence-based practice informed by research: 1) 3B: Using Research Based Evidence; and 2) 3G: Effective Use of Diagnostic Data. There are three areas assessed on the TIP Part B, and all three are used to assess SLO 2: 1) 4A: Using Formative Data to Modify Instruction; 2) 4B: Using Summative Data to Analyze Student Growth; and 3) 4C: Making Recommendations Based on Data. These areas collectively represent the evidence-based research and leadership skills that advanced program candidates are required to demonstrate before being recommended for master’s-level licensure at the end of their programs. The rubrics are on a 3-point scale: 1=Not Met, 2=Proficient, 3=Accomplished. The rubrics are completed by the course instructors.
For the TIP, scores are collected using the College’s electronic data management system, Taskstream. Scores are provided to program faculty biannually by the COED Office of Assessment and Accreditation. Simple descriptive statistics are used to analyze the scores, and disaggregated findings are reported by term at the college and program levels. All data reports created by the College of Education are housed on a secure website accessible to faculty within the College of Education. The data are discussed during REEL Leadership Council meetings as well as in the department’s faculty meeting at least once per year. In these meetings, next steps are determined to address any identified needs. All strategies determined during this closing-the-loop discussion are implemented during the next academic year. These meetings are documented by program directors and department chairs and revisited at each subsequent meeting to monitor implementation progress.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects [give percentage – recommend 80% or higher] of its candidates to obtain a score of 2 (“Proficient”) or better on the TIP rubric indicators listed above.

Assessment Data:

Spring 2016-Summer 2016-Fall 2016 Assessment Data

[Directions: READ ALL DIRECTIONS CAREFULLY! – then delete these directions from final report]

  1. Some programs may have a different data source in the fall versus the spring, or from 2015 to 2016. That’s okay! Just be sure to have a data chart for the “old” data and a data chart for any “new” data. Not a problem.

FOR EXAMPLE: you MAY have

Spring 2016 data – “old” data

[insert spring 2016 data chart]

Fall 2016 data – “new” / revised rubric source

[insert fall 2016 data chart]

If all your data is from the same source, just use a single chart for the entire year (as usual).

  2. PLEASE BE CAREFUL how you copy/paste data. The SLO template does not always align with our data template. The SLO report asks for data by term; we report it by assessment first, then by term. Also, I have tried to keep the reports shorter by combining multiple data sources into a single chart (i.e., Traditional and DE results, all on one chart). Note the Fall 2016 data is reported first in the data chart. I have added extra columns and then moved last year’s data over to the right in the chart. Pay close attention when you copy/reenter data from one spreadsheet to the next.
  3. Not all our data is reported in the SLO charts. Copy and paste carefully!!!!
  4. Data must be reported by “traditional” and “distance education” (if available).
  5. For some data sources, we have used averages in addition to percent proficient. This is to demonstrate some variability in the data and to identify potential areas of improvement.

See example for the ELED MEd below.]