Doctoral Programs – 2012-13 Annual Report
2012-13 Student Learning Outcomes Assessment Plan and Report
College: College of Education
Department: Educational Leadership
Name of Degree or Certificate Program/Stand Alone Minor/Online Distance Education Program: EdD in Educational Leadership
Reflection on the Continuous Improvement of Student Learning
1. List the changes and improvements your program planned to implement as a result of last year’s student learning outcomes assessment data.
2. Were all of the changes implemented? If not, please explain.
3. What impact did the changes have on student learning?
All of the changes were made and have been in effect since the start of the Spring 2012 semester.
1. All performance outcomes were met.
2. No changes were required.
3. Learning outcomes continue to be strong.
Student Learning Outcome 1
(knowledge, skill or ability to be assessed)
Candidates for other professional school roles demonstrate an understanding of the professional and contextual knowledge expected in their fields, and use data, current research, and technology to inform their practices.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
Implemented new scoring rubrics for student learning outcomes: the Qualifying Exam, Proposal Defense, and Dissertation Defense rubrics. No changes were needed in the effectiveness measures, methodology, or performance outcomes.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
The Department of Educational Leadership (EDLD) program uses the following measures to assess this SLO:
- Qualifying Examination. The Qualifying Exam may be taken after the candidate completes a minimum of 24 credit hours and before completing 36 hours. The exam has two parts: a written portion, completed over approximately 12 hours, followed by an oral defense of the written work. The six components of the rubric are applied to the candidate’s combined performance on the written and oral exams. Specific areas assessed are:
- Ability to recognize and articulate the problems at hand.
- Ability to express the problems’ background, employing critical analysis and relevant literature.
- Reasoning skills.
- Understanding of and ability to apply appropriate research methods to the problems posed during the exam.
- Ability to apply critical reflection to knowledge gained from the academic program.
- Ability to effectively respond to scholarly questions.
- Proposal Defense. (The proposal is a draft of the first three chapters of the candidate’s dissertation.) Results of the proposal defense are used to estimate the candidate’s ability to design a research project that answers important questions in the candidate’s content area. Specific areas assessed are:
- A research problem which is clear, articulated and significant.
- Research methods which provide detailed description of (if applicable): subjects, design/approach, methods/procedures and analyses.
- Research methods and analyses that are appropriate to the research questions.
- A relationship between the research problem and the student’s role as an educational leader.
- A preliminary literature review that describes prior conceptual and research investigations of the research problem.
- Dissertation Defense. (The completed dissertation reports the finished research and includes five chapters.) Results from the dissertation defense are used to determine whether the candidate has the knowledge and skills to conduct a research project. Specific areas assessed are:
- Develops clear and appropriate research questions or hypotheses that guide the study.
- Demonstrates how research questions or hypotheses have been examined in previous studies.
- Analysis is comprehensive, complete, sophisticated, and convincing.
- All pertinent results are reported in a clear and concise manner. Tables/figures are labeled appropriately.
- Draws clear conclusions based on collected data that answer the research questions or test the hypotheses.
- Makes recommendations for further research that can build on this project.
- Provides a reflection on problems or errors in the study and discusses how they could be avoided in subsequent studies.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
Qualifying Examination:
Written component. Candidates should take the written comprehensive examination as soon as possible after completing 24 credit hours of foundations and research coursework, and no later than enrollment in ADMN 8699 (Dissertation Proposal Seminar). The examination may occur at any time during the year and normally includes questions from six different doctoral courses, to be completed within twelve hours (six hours on each of two consecutive days). The questions require candidates to connect basic concepts from completed coursework and to apply what they have learned to different situations and educational contexts. A committee consisting of the candidate’s advisor and the faculty members who have instructed the candidate prepares and evaluates the written examinations. Steps in the written examination:
1) The candidate and advisor determine a date for the examination, which will be at least 60 days from the day of the decision.
2) The advisor notifies committee members that the student will take the examination and requests any materials/information (if appropriate) to guide the candidate’s preparation.
3) The candidate takes the examination in the department area on a department laptop computer; unless otherwise indicated by a faculty member, no materials or resources may be used during the examination.
4) The advisor gives the candidate’s responses to the examination questions to the appropriate faculty members for evaluation.
5) If the candidate’s performance on the written examination is unsatisfactory (Not Acceptable), in whole or in part, the candidate is allowed to re-take the failed portion(s) of the examination. A second failure results in termination from the program.
The written examination is scored on a 3-point scale (Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2) across multiple dimensions.
Oral component. The oral examination normally occurs within 30 days of successful completion of the written examination. During the oral examination, the candidate’s advisor and committee engage in dialogue with the candidate about the written examination. The discussion has two purposes. First, it provides an opportunity for the candidate to address in more detail or to clarify responses to questions on the written examination. Second, it allows the committee to engage the student in a discussion of issues not addressed in the written examination but pertinent to the content. If the candidate’s performance on the oral examination is unsatisfactory, an additional oral examination may be scheduled and/or the candidate may be required to take additional coursework. Subsequent failure on the oral examination results in termination from the program. Upon successful completion of the written and oral examinations, the student’s advisor and committee must sign and submit the Qualifying Examination/Comprehensive Examination Report for Doctoral Candidates. The oral exam is scored on a 3-point scale (Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2) across multiple dimensions. A rubric containing the six assessment dimensions listed earlier is used to evaluate the combined written and oral portions of the qualifying exam.
Proposal Defense: The development and defense of a dissertation proposal is an important aspect of dissertation research. The proposal is a draft of the first three chapters of the candidate’s dissertation. The proposal defense is scored on a 3-point scale (Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2) across multiple dimensions. After the student/candidate “meets” or “exceeds” expectations on all dimensions, they are allowed to begin their research. A rubric is used to evaluate each of the five domains in the proposal defense.
Dissertation Defense: When the candidate’s dissertation committee believes that the dissertation is in satisfactory form, a final defense is scheduled. The dissertation defense is scored on a 3-point scale (Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2), across multiple dimensions. Students/candidates who do not meet expectations are provided feedback and another defense is scheduled. A rubric is used to evaluate each of the seven domains in the dissertation defense.
Assessments are administered at identified points during the program. Work samples are scored using the designated method and scores are collected and analyzed at the program level. Simple descriptive statistics are used to report the scores. Disaggregated findings are disseminated to faculty, reported at the program and College level, and discussed at monthly Doctoral Advisory Committee meetings and yearly during faculty meetings. Recommendations for changes and improvements are examined and adopted as deemed appropriate. All data reports created by the College of Education are housed on a secure website which is accessible to all faculty members within the College of Education.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
The program expects at least 80% of the students to score “1” or “2” (meet or exceed expectations) on the 3-point scale on each of the elements of the Qualifying Exam, Proposal Defense, and Dissertation Defense. Results for most elements met or exceeded this target, although several 2012 Qualifying Exam elements fell below the 80% threshold (see tables below).
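As an illustration of the tabulation described above, the following sketch shows how per-element percentages and the 80% performance target could be computed. The dimension names and ratings here are hypothetical examples, not actual program records.

```python
# Hypothetical sketch: tabulating met/exceeded rates per rubric dimension
# against the program's 80% performance target. Data below is illustrative only.

# Ratings on the 3-point scale:
# 0 = Expectations Not Met, 1 = Meeting Expectations, 2 = Exceeding Expectations
scores = {
    "Ability to recognize and articulate the problems at hand": [1, 1, 2, 0, 1],
    "Ability to effectively respond to scholarly questions": [1, 2, 1, 1, 1],
}

TARGET = 0.80  # at least 80% must score "1" or "2" on each element

for dimension, ratings in scores.items():
    n = len(ratings)
    met_or_exceeded = sum(1 for r in ratings if r >= 1)  # count of "1" or "2"
    rate = met_or_exceeded / n
    status = "met" if rate >= TARGET else "NOT met"
    print(f"{dimension}: {met_or_exceeded}/{n} ({rate:.0%}) -- target {status}")
```

The same simple count/percentage computation underlies each row of the tables that follow.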
2011 Assessment Data / 2012 Assessment Data
Written Comprehensive Examination: The percentages of candidates who “failed to meet,” “met,” or “exceeded” expectations are reported in the table below. Candidates scored at the “met” or “exceeded” level on most dimensions; one candidate fell below expectations on each of two dimensions.
Program Year 2011, N=10
- Demonstrated ability to use research methods to answer important questions in their professional field. Not Met: 0% (0/10); Met: 90% (9/10); Exceeded: 10% (1/10)
- Demonstrates an ability to analyze data for making decisions in professional field. Not Met: 0% (0/10); Met: 100% (10/10); Exceeded: 0% (0/10)
- Demonstrates knowledge of leadership theory, principles, and practices. Not Met: 0% (0/10); Met: 80% (8/10); Exceeded: 20% (2/10)
- Demonstrates the ability to apply leadership theory and principles to the leadership role expectations of leaders in formal organizations. Not Met: 10% (1/10); Met: 70% (7/10); Exceeded: 20% (2/10)
- Demonstrates the ability to understand an executive’s role in bringing about needed change in an organization. Not Met: 0% (0/10); Met: 70% (7/10); Exceeded: 30% (3/10)
- Demonstrates the ability to assess subordinate performance in terms of organizational goal attainment. Not Met: 0% (0/10); Met: 100% (10/10); Exceeded: 0% (0/10)
- The student demonstrates knowledge of adult learning theories. Not Met: 0% (0/10); Met: 90% (9/10); Exceeded: 10% (1/10)
- The student demonstrates the ability to compare adult learning theories and explain how they can apply in instructional settings with adult learners. Not Met: 10% (1/10); Met: 80% (8/10); Exceeded: 10% (1/10)
Oral Comprehensive Examination: The percentages of candidates who “failed to meet,” “met,” or “exceeded” expectations are reported in the table below. Candidates scored at the “met” or “exceeded” level on all but one dimension.
Program Year 2011, N=10
- Verbally able to examine how research methods can be used for addressing important questions in professional field. Not Met: 0% (0/10); Met: 90% (9/10); Exceeded: 10% (1/10)
- Verbally able to discuss the ability to analyze data for making decisions in professional field. Not Met: 0% (0/10); Met: 90% (9/10); Exceeded: 10% (1/10)
- The student has the ability to articulate a sound, coherent, research-based leadership philosophy. Not Met: 0% (0/10); Met: 60% (6/10); Exceeded: 40% (4/10)
- The student is able to verbally respond to various leadership scenarios that leaders in education and related settings are likely to encounter. Not Met: 0% (0/10); Met: 70% (7/10); Exceeded: 30% (3/10)
- The student can verbally identify curriculum constructs that are appropriate in different organizations. Not Met: 0% (0/10); Met: 100% (10/10); Exceeded: 0% (0/10)
- The student can verbally discuss leadership practices that facilitate curriculum change within professional organizations. Not Met: 0% (0/10); Met: 90% (9/10); Exceeded: 10% (1/10)
- The student is able to articulate an executive’s role in bringing about needed change in an organization. Not Met: 0% (0/10); Met: 60% (6/10); Exceeded: 40% (4/10)
- The student is able to articulate various means of performance evaluation which can be used to improve subordinate behavior. Not Met: 0% (0/10); Met: 90% (9/10); Exceeded: 10% (1/10)
- The student is able to articulate the differences in several adult learning theories from a practical application perspective. Not Met: 10% (1/10); Met: 80% (8/10); Exceeded: 10% (1/10)
- The student is able to respond to various possible situations that they may encounter in their leadership roles and articulate how adult learning theories can be applied to various instructional environments. Not Met: 0% (0/10); Met: 70% (7/10); Exceeded: 30% (3/10)
Qualifying Examination 2012:
Program Year 2012, N=13
- Ability to recognize and articulate the problems at hand. Not Met: 15% (2/13); Met: 62% (8/13); Exceeded: 23% (3/13)
- Ability to express the problems’ background, employing critical analysis and relevant literature. Not Met: 23% (3/13); Met: 62% (8/13); Exceeded: 15% (2/13)
- Reasoning skills. Not Met: 23% (3/13); Met: 54% (7/13); Exceeded: 23% (3/13)
- Understanding of and ability to apply appropriate research methods to the problems posed during the exam. Not Met: 25% (3/12); Met: 67% (8/12); Exceeded: 8% (1/12)
- Ability to apply critical reflection to knowledge gained from the academic program. Not Met: 23% (3/13); Met: 54% (7/13); Exceeded: 23% (3/13)
- Ability to effectively respond to scholarly questions. Not Met: 15% (2/13); Met: 77% (10/13); Exceeded: 8% (1/13)
Proposal Defense: The percentages of candidates who “failed to meet,” “met,” or “exceeded” expectations are reported in the table below. One hundred percent of the candidates scored at the “met” or “exceeded” level for all dimensions.
Program Year 2011, N=5
- Research methods provide a detailed description of (if applicable): subjects, design/approach, methods/procedures, and analyses. Not Met: 0% (0/5); Met: 100% (5/5); Exceeded: 0% (0/5)
- Research methods and analyses are appropriate to the research questions. Not Met: 0% (0/5); Met: 100% (5/5); Exceeded: 0% (0/5)
- Selects, articulates, and justifies the significance of a research problem. Not Met: 0% (0/5); Met: 100% (5/5); Exceeded: 0% (0/5)
- Describes the relationship between the research problem and the student’s role as an educational leader. Not Met: 0% (0/5); Met: 100% (5/5); Exceeded: 0% (0/5)
- Provides a preliminary literature review that describes prior conceptual and research investigations of the research problem. Not Met: 0% (0/5); Met: 100% (5/5); Exceeded: 0% (0/5)
Proposal Defense 2012:
Program Year 2012, N=5
- A research problem which is clear, articulated, and significant. Not Met: 0% (0/5); Met: 80% (4/5); Exceeded: 20% (1/5)
- Research methods which provide a detailed description of (if applicable): subjects, design/approach, methods/procedures, and analyses. Not Met: 20% (1/5); Met: 60% (3/5); Exceeded: 20% (1/5)
- Research methods and analyses that are appropriate to the research questions. Not Met: 0% (0/5); Met: 60% (3/5); Exceeded: 40% (2/5)
- A relationship between the research problem and the student’s role as an educational leader. Not Met: 0% (0/5); Met: 40% (2/5); Exceeded: 60% (3/5)
- A preliminary literature review that describes prior conceptual and research investigations of the research problem. Not Met: 0% (0/5); Met: 40% (2/5); Exceeded: 60% (3/5)
Dissertation Defense: The percentages of students who “failed to meet,” “met,” or “exceeded” expectations are reported in the table below. One hundred percent of the students scored at the “met” or “exceeded” level for all dimensions.
Program Year 2011, N=8
- All pertinent results are reported in a clear and concise manner. Tables/figures are labeled appropriately and include a legend. Not Met: 0% (0/8); Met: 100% (8/8); Exceeded: 0% (0/8)
- Analysis is comprehensive, complete, sophisticated, and convincing. Not Met: 0% (0/8); Met: 100% (8/8); Exceeded: 0% (0/8)
- Develops clear and appropriate research questions or hypotheses that guide the study. Not Met: 0% (0/8); Met: 100% (8/8); Exceeded: 0% (0/8)
- Demonstrates how those research questions or hypotheses have been examined in previous studies. Not Met: 0% (0/8); Met: 100% (8/8); Exceeded: 0% (0/8)
- Draws clear conclusions based on the collected data that answer the research questions or test the hypotheses. Not Met: 0% (0/8); Met: 100% (8/8); Exceeded: 0% (0/8)
- Makes recommendations for further research that can build on this project. Not Met: 0% (0/8); Met: 100% (8/8); Exceeded: 0% (0/8)
- Provides a reflection on the problems or errors in the study and discusses how they could be avoided in subsequent studies. Not Met: 0% (0/8); Met: 100% (8/8); Exceeded: 0% (0/8)
Dissertation Defense 2012:
Program Year 2012, N=5
- Develops clear and appropriate research questions or hypotheses that guide the study. Not Met: 0% (0/5); Met: 80% (4/5); Exceeded: 20% (1/5)
- Demonstrates how research questions or hypotheses have been examined in previous studies. Not Met: 20% (1/5); Met: 40% (2/5); Exceeded: 40% (2/5)
- Analysis is comprehensive, complete, sophisticated, and convincing. Not Met: 20% (1/5); Met: 40% (2/5); Exceeded: 40% (2/5)
- All pertinent results are reported in a clear and concise manner. Tables/figures are labeled appropriately. Not Met: 20% (1/5); Met: 40% (2/5); Exceeded: 40% (2/5)
- Draws clear conclusions based on collected data that answer the research questions or test the hypotheses. Not Met: 20% (1/5); Met: 60% (3/5); Exceeded: 20% (1/5)
- Makes recommendations for further research that can build on this project. Not Met: 0% (0/5); Met: 80% (4/5); Exceeded: 20% (1/5)
- Provides a reflection on problems or errors in the study and discusses how they could be avoided in subsequent studies. Not Met: 0% (0/5); Met: 80% (4/5); Exceeded: 20% (1/5)
Plans for 2013-14: Based upon the 2012 assessment data included in this annual report, what changes/improvements will the program implement during the next academic year to improve performance on this student learning outcome?
As studied and discussed by the Doctoral Advisory Committee, the new scoring rubrics received a generally positive assessment. We believed we needed to change, we made the changes, and now we need to assess whether the changes were helpful to our students. Early indications are positive.
There was considerable attention given to the logistics and mechanics of rubric usage in those cases wherein student remediation is necessary. It was decided that the rubrics would record student performance at the conclusion of the Qualifying Exam, Proposal Defense, and Dissertation Defense and that the overall pass/fail committee decision would occur after any needed remediation took place.
Assessment Lead’s Comments on Student Learning Outcome 1:
This will be completed summer of 2013 by the Assessment Coordinator.
Student Learning Outcome 2
(knowledge, skill or ability to be assessed)
Candidates for other school professions demonstrate professional behaviors consistent with fairness and the belief that all students can learn, including creating caring, supportive learning environments, encouraging student-directed learning, and making adjustments to their own professional dispositions when necessary.
Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.
Implemented a new scoring rubric for student learning outcomes: the Intern Summary Evaluation. No changes were needed in the effectiveness measures, methodology, or performance outcomes.
Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.
The Intern Summary Evaluation instrument is used during the candidates’ internship. Students/candidates are scored on seven domains, with professional behaviors rated on a 3-point scale (Expectations Not Demonstrated-1; Developing-2; and Proficient-3). The domains are Strategic Leadership, Instructional Leadership, Cultural Leadership, Human Resource Leadership, Managerial Leadership, External Development Leadership, and Micro-political Leadership.
Additionally, all students/candidates must take the Collaborative Institutional Training Initiative course in the protection of human research subjects and must participate in a tutorial that assesses their knowledge of the procedures for protecting human research subjects and conducting research in the social, educational, and behavioral sciences. All students/candidates must correctly answer at least 80% of the items on the tutorial assessment instrument, which is embedded within the tutorial itself, before they can conduct research at UNC Charlotte.
Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.
During Internship: Students/Candidates are scored by two supervisors (i.e., the University Supervisor and Intern Site Mentor) on the Intern Summary Evaluation instrument during the internship experiences (Parts I and II). At the end of the internship, the two raters provide a summative evaluation based on the ratings.
Assessments (i.e., the Intern Summary Evaluation and the Collaborative Institutional Training Initiative exam) are administered at identified points during the program. The tutorial assessment is embedded in the training session and the user is not permitted to proceed until mastery is attained.
Work samples are scored using the designated method and scores are collected and analyzed at the program level. Simple descriptive statistics are used to report the scores. Disaggregated findings are disseminated to faculty, reported at the program and College level, and discussed at monthly Doctoral Advisory Committee meetings and yearly during faculty meetings. Recommendations for changes and improvements are examined and adopted as deemed appropriate. All data reports created by the College of Education are housed on a secure website which is accessible to all faculty members within the College of Education.
Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)
At least 80% of students/candidates will score “3” (Proficient) across all domains on the Intern Summary Evaluation instrument, and 100% must pass (score 80% or higher) the Collaborative Institutional Training Initiative tutorial in order to conduct research at UNC Charlotte.
2011 Assessment Data / 2012 Assessment Data
Professional Domains – Internship Parts 1 & 2: The percentages of students who “did not meet,” “met,” or “exceeded” expectations are reported in the table below. One hundred percent of the students scored at the “met” or “exceeded” level for all dimensions.