Spring 2015 and Fall 2015
Student Learning Outcomes Assessment Plan and Report
College: College of Education
Department: Special Education and Child Development
Name of Program: Online Graduate Certificate in Special Education—Academically/Intellectually Gifted (AIG)
Reflection on the Continuous Improvement of Student Learning
1. List the changes and improvements your program planned to implement as a result of last year's student learning outcomes assessment data.
2. Were all of the changes implemented? If not, please explain.
3. What impact did the changes have on student learning?
Although all student learning outcomes were met in 2014 and 2015, changes were initiated in late 2015 to be phased in during 2016, as described in the sections below.
Student Learning Outcome 1
(knowledge, skill or ability to be assessed)
Revised SLO 1: Advanced program candidates are able to demonstrate and apply content knowledge and skills specific to their content area or discipline.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
In 2013, the College of Education's accrediting body, the Council for the Accreditation of Educator Preparation (CAEP), released new standards for educator preparation programs. To better align with these standards, AIG program faculty have worked collaboratively this year to revise our Student Learning Outcomes (SLOs). In addition, the UNC Charlotte Office of Assessment recommends that programs revisit SLOs every 3-5 years to ensure that they accurately assess student learning. As a result, SLO 1 is being changed.
Reporting for 2015 used existing data to address this student learning outcome. To assess the revised SLO 1 during 2015, one data source was used: the Topical/Case Study paper. Data for 2015 and later are reported disaggregated by response category of the Taskstream rubric.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
In the Topical/Case Study paper, the candidate demonstrates and applies the ability to select and explain a relevant topic in the content area of gifted education. The project rubric also addresses the ability to express content knowledge and skills in written form and the ability to reflect on the application of content-area and disciplinary knowledge. Together, these areas demonstrate the candidate's in-depth knowledge of AIG learners and pedagogy and their ability to effectively apply the knowledge and skills specific to their content area or discipline in addressing the learning needs of students identified as AIG.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
The Topical/Case Study project is the major course project and is completed by the add-on licensure candidate in SPED 5211 Nature and Needs of Gifted Students during the first year of the four-course licensure program. The Topical/Case Study paper is evaluated by the instructor using a set of rubric criteria that results in a grade. Grades are converted from a point total to a percentage score; scores of 80% or higher earn a grade of B, and scores of 90% or higher earn a grade of A. For the purpose of this report, data reflect points earned at 80% or better (grades of 'A' or 'B'). An overall score of B or higher indicates a student product whose item-level scores for Professional Project, Nature of the Intervention, and Connections to Course Content are at or above 80 percent. These are recorded on the data management system's rubric as Proficient or better.
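For illustration, the conversion from rubric points to a percentage score and letter grade described above can be sketched as follows; the function name and example point totals are hypothetical, and only the 80%/90% thresholds come from this report:

```python
def grade_from_points(points_earned: float, points_possible: float):
    """Convert a rubric point total to a letter grade, per the report's
    thresholds: 90% or higher earns an A, 80% or higher earns a B."""
    pct = 100 * points_earned / points_possible
    if pct >= 90:
        grade = "A"
    elif pct >= 80:
        grade = "B"
    else:
        grade = "below B"  # the report does not specify grades under 80%
    # For this report, proficiency means a grade of 'A' or 'B' (80%+).
    return grade, pct >= 80

# Hypothetical example: 45 of 50 points -> 90%, grade of A, proficient.
grade, proficient = grade_from_points(45, 50)
print(grade, proficient)  # A True
```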
Scores are collected using the College's electronic data management system and are analyzed at the college and program levels. Simple descriptive statistics are used to analyze the scores, and disaggregated findings are reported by semester at three levels (College, Program, and Licensure Area). Once a year, results from all assessments administered by the programs are disseminated to the faculty in the College of Education. The data are discussed during a final faculty meeting, and next steps are determined to address any needs identified. All strategies determined during this closing-the-loop discussion are implemented during the next academic year. All data reports created by the College of Education are housed on a secure website accessible to faculty members within the College of Education.
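As a minimal sketch of this analysis, assuming scores can be exported as (semester, rubric item, score) records (the record layout and semester labels are illustrative, not the data management system's actual export format):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical export: one (semester, rubric_item, score) record per
# candidate per rubric item, with scores on the 0-3 Taskstream scale.
records = [
    ("Fall 2015 DE", "Written Expression", 3),
    ("Fall 2015 DE", "Written Expression", 2),
    ("Summer I 2015 Traditional", "Written Expression", 3),
    # ... one record per candidate per rubric item
]

by_group = defaultdict(list)
for semester, item, score in records:
    by_group[(semester, item)].append(score)

# Per semester/item group: average score and percent meeting
# expectations (a score of 2 or 3, per the report's rubric).
for (semester, item), scores in sorted(by_group.items()):
    pct_met = 100 * sum(s >= 2 for s in scores) / len(scores)
    print(f"{semester} | {item}: avg={mean(scores):.2f}, met={pct_met:.2f}%")
```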
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects 80% of its add-on licensure candidates to score "B" or better on the instructional rubric for their work to be considered proficient on the Topical/Case Study project. This is confirmed by scores of 2 or above on each item of the Taskstream rubric, which scores each item on a 0-3 scale.
Assessment Data: NOTE: Our data sources and reporting procedures have changed; therefore, there are two tables here – one for Spring 2014-Fall 2014 data and one for Spring 2015-Fall 2015 data.
SLO 1: SPED 5211 Topical/Case Study Paper/Content Knowledge Rubric
Program / AIG GC / AIG GC / AIG GC
Semester / Spring 2014 / Summer 2014 / Fall 2014
Total Count / Not offered / 25 / 32
Met Standard / N/A / 23 / 31
Percentage Met / -- / 92% / 97%
Spring 2015-Fall 2015 Assessment Data (Spring 2015: course not offered)
EE2 Academically & Intellectually Gifted Add-on
Semester / Fall 2015 (Distance Ed only) / Summer I 2015 (Traditional) / Summer I 2015 (Distance Ed)
Count / 19 / 4 / 32
3.b.1.: Written Expression / Average / 2.16 / 2.75 / 2.19
% Met Expectations (2 or 3) / 84.21% / 100% / 90.63%
# of Candidates Scoring a 1 / 3 / 0 / 3
% of Candidates Scoring a 1 / 15.79% / 0.00% / 9.38%
# of Candidates Scoring a 2 / 10 / 1 / 20
% of Candidates Scoring a 2 / 52.63% / 25.00% / 62.50%
# of Candidates Scoring a 3 / 6 / 3 / 9
% of Candidates Scoring a 3 / 31.58% / 75.00% / 28.13%
Topic / Average / 2.42 / 2.75 / 2.56
% Met Expectations (2 or 3) / 100% / 100% / 96.88%
# of Candidates Scoring a 1 / 0 / 0 / 1
% of Candidates Scoring a 1 / 0.00% / 0.00% / 3.13%
# of Candidates Scoring a 2 / 11 / 1 / 12
% of Candidates Scoring a 2 / 57.89% / 25.00% / 37.50%
# of Candidates Scoring a 3 / 8 / 3 / 19
% of Candidates Scoring a 3 / 42.11% / 75.00% / 59.38%
Content / Average / 2.58 / 2.25 / 2.44
% Met Expectations (2 or 3) / 100% / 100% / 93.75%
# of Candidates Scoring a 1 / 0 / 0 / 2
% of Candidates Scoring a 1 / 0.00% / 0.00% / 6.25%
# of Candidates Scoring a 2 / 8 / 3 / 14
% of Candidates Scoring a 2 / 42.11% / 75.00% / 43.75%
# of Candidates Scoring a 3 / 11 / 1 / 16
% of Candidates Scoring a 3 / 57.89% / 25.00% / 50.00%
Reflection / Average / 2.89 / 2.50 / 2.31
% Met Expectations (2 or 3) / 100% / 100% / 90.63%
# of Candidates Scoring a 1 / 0 / 0 / 3
% of Candidates Scoring a 1 / 0.00% / 0.00% / 9.38%
# of Candidates Scoring a 2 / 2 / 2 / 16
% of Candidates Scoring a 2 / 10.53% / 50.00% / 50.00%
# of Candidates Scoring a 3 / 17 / 2 / 13
% of Candidates Scoring a 3 / 89.47% / 50.00% / 40.63%
3.b.1.: Summative Rating / Average / 2.58 / 2.00 / 2.06
% Met Expectations (2 or 3) / 100% / 100% / 96.88%
# of Candidates Scoring a 1 / 0 / 0 / 1
% of Candidates Scoring a 1 / 0.00% / 0.00% / 3.13%
# of Candidates Scoring a 2 / 8 / 4 / 28
% of Candidates Scoring a 2 / 42.11% / 100.00% / 87.50%
# of Candidates Scoring a 3 / 11 / 0 / 3
% of Candidates Scoring a 3 / 57.89% / 0.00% / 9.38%
Other Program Expectations / Average / 2.21 / 2.50 / 2.25
% Met Expectations (2 or 3) / 100% / 100% / 90.63%
# of Candidates Scoring a 1 / 0 / 0 / 3
% of Candidates Scoring a 1 / 0.00% / 0.00% / 9.38%
# of Candidates Scoring a 2 / 15 / 2 / 18
% of Candidates Scoring a 2 / 78.95% / 50.00% / 56.25%
# of Candidates Scoring a 3 / 4 / 2 / 11
% of Candidates Scoring a 3 / 21.05% / 50.00% / 34.38%
Changes to be implemented Fall 2016: Based upon the 2015 assessment data included in this annual report, what changes/improvements will the program implement during the next academic year to improve performance on this student learning outcome?
Data indicated that candidates in the AIG Graduate Certificate program met the targeted performance outcomes for revised SLO 1. However, the College of Education is focused on continuous improvement through data-based decision-making. Based on the data presented here, faculty will review any rubric category in which the rate of meeting expectations is less than 100% in two or more groups. For the 2015 data, this applies only to 3.b.1.: Written Expression (84.21% and 90.63% meeting expectations for the Fall 2015 Distance Ed and Summer I 2015 Distance Ed sections, respectively). However, because this rate is above the 80% target in both groups, no changes will be made at this time.
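The review rule applied above can be expressed concretely. The rates below are the percent-met values from the 2015 table; the data layout itself is illustrative:

```python
# Percent meeting expectations by rubric category, from the 2015 table
# (columns: Fall 2015 DE, Summer I 2015 Traditional, Summer I 2015 DE).
rates = {
    "3.b.1.: Written Expression": [84.21, 100.00, 90.63],
    "Topic": [100.00, 100.00, 96.88],
    "Content": [100.00, 100.00, 93.75],
    "Reflection": [100.00, 100.00, 90.63],
    "3.b.1.: Summative Rating": [100.00, 100.00, 96.88],
    "Other Program Expectations": [100.00, 100.00, 90.63],
}

# Flag a category for faculty review when two or more groups fall below
# 100%; no change is required while every group meets the 80% target.
for category, group_rates in rates.items():
    if sum(r < 100 for r in group_rates) >= 2:
        status = "above" if all(r >= 80 for r in group_rates) else "below"
        print(f"Review {category}: all groups {status} the 80% target")
# Only "3.b.1.: Written Expression" is flagged, and both of its
# sub-100% groups remain above the 80% target.
```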
Beginning in 2016, this SLO 1 evidence will be replaced by the AIG Models Original Lesson Plan, which will be completed by candidates in SPED 6124: Methods and Materials for the Gifted.
Student Learning Outcome 2
(knowledge, skill or ability to be assessed)
Revised SLO 2: Advanced program candidates use domain-specific research and evidence to demonstrate leadership in developing high quality learning environments.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
In 2013, the College of Education's accrediting body, the Council for the Accreditation of Educator Preparation (CAEP), released new standards for educator preparation programs. To better align with these standards, the College of Education faculty have worked collaboratively this year to revise our Student Learning Outcomes (SLOs). In addition, the UNC Charlotte Office of Assessment recommends that programs revisit SLOs every 3-5 years to ensure that they accurately assess student learning. As a result, SLO 2 has been changed as indicated above.
To assess the revised SLO 2, an existing data source was identified: the Gifted Workshop Project completed by students in SPED 5211 Nature and Needs of Gifted Students. This replaced the SPED 6124 lesson plan used for SLO 2 during 2014. The specific indicators aligned with SLO 2 are identified below; these differ from the indicators in the 2014 data report.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
In the SPED 5211 Workshop Project, candidates develop materials for a one-hour workshop for teachers at their school that addresses definitions of giftedness, characteristics of gifted students, North Carolina's gifted legislation (Article 9B), issues gifted students face, and what programming for the gifted entails in the candidate's school district. Candidates' performance on this task is assessed by a rubric addressing four areas: their selection of data-based research supporting the topic of the workshop; their understanding of the learning and other needs of AIG learners; the inclusion of current policies, standards, and issues affecting the education of AIG learners; and their professional leadership as demonstrated in the design of collaborative learning activities and professional reflection experiences for the workshop's intended audience. Candidates design the workshop to address needs they observe within their own school and then share it with colleagues. For these reasons, specific indicators on the Workshop Project Rubric align with the revised SLO 2.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
The Workshop Project in SPED 5211: Nature and Needs of Gifted Students is evaluated using a rubric developed by faculty of the AIG program. The rubric items address the competencies described above under the heading Effectiveness Measure. The rubric uses a 0-5 scale, with anchors at 0 = Not Observed, 1 = Emergent/Developing, 4 = Proficient, and 5 = Accomplished.
Scores of Proficient or Accomplished (4 or 5 points) are considered to meet the program's standard, while scores of 0 or 1 are considered Not Met. Students are provided in advance with a copy of the rubric and specific instructions for completing the Workshop Project. The rubric is completed by the course instructor. Point values on the rubric are also designed to be used for grading this assignment, which is why program faculty adopted the 0-5 scale.
Data for 2015 were collected manually based on students' grades on this assignment, and overall performance was discussed by program faculty following each semester in which the course was offered. After 2015, rubric scores will be collected using the College's electronic data management system, Taskstream, using a revised rubric being prepared for this purpose. Scores will be provided to program faculty twice a year by the COED Office of Assessment and Accreditation. Simple descriptive statistics will be used to analyze the scores, and disaggregated findings will be reported by term at the college and program levels. All data reports created by the College of Education will be housed on a secure website accessible to faculty within the College of Education. The data collected in Taskstream will be discussed during AIG Program meetings and at the department's faculty meeting at least once per semester. In these meetings, next steps will be determined to address any identified needs. Strategies determined during this closing-the-loop discussion will be implemented during the next academic year. These meetings are documented by program directors and department chairs, and candidate performance will be revisited at each subsequent meeting to monitor implementation progress.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects at least 80% of its candidates to obtain an average score of 4 ("Proficient") or higher overall and across the Workshop Project rubric indicators described above.
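A brief sketch of this check, assuming each candidate's indicator scores on the 0-5 rubric are available as a list; the candidate identifiers and scores are hypothetical:

```python
from statistics import mean

# Hypothetical data: one list of Workshop Project indicator scores
# (0-5 rubric scale) per candidate.
candidates = {
    "candidate_01": [4, 5, 4, 4],
    "candidate_02": [5, 4, 3, 4],
}

# A candidate meets the outcome when their average across the rubric
# indicators is 4 ("Proficient") or higher.
proficient = [c for c, scores in candidates.items() if mean(scores) >= 4]

# Program target: at least 80% of candidates at Proficient or higher.
pct = 100 * len(proficient) / len(candidates)
print(f"{pct:.1f}% proficient; target met: {pct >= 80}")
```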
Assessment Data: NOTE: Because our data sources have changed, there are two tables here – one for Spring 2014-Fall 2014 data and one for Spring 2015-Fall 2015 data. Rubrics were not yet finalized when the 2015 SLO 2 assignments were collected, so overall scores are reported in lieu of performance on individual rubric items.