January 06 rev. July 10

Radford University

Professional Education Preparation Programs

CANDIDATE AND PROGRAM ASSESSMENT REPORTS

  1. Title page (name of program, date the report is submitted, and name and contact information for the individual submitting the report).
  2. Candidate Performance Assessment Framework: the chart we have been using showing key assessment points and listing the assessments for Content Knowledge, Content Pedagogical Knowledge, Professional Knowledge and Skills, Dispositions/Characteristics, and Impact on Student Learning.
  3. Section II Chart: List of assessments, giving the name of each assessment, the type of assessment (e.g., standardized test, performance assessment), and when it is administered (see attachment).
  4. Section III Chart: Relationship of assessments to standards (see attachment).
  5. Section IV (see attachments): a description of each assessment, 1 through 8, covering:
     a. a description of the assessment and its use in the program;
     b. how it is aligned with the standards;
     c. the assessment tool/scoring guide;
     d. candidate data derived from the assessment for at least two successive administrations of the assessment; when possible, report the number and percentage of students evaluated as performing at the different performance levels; and
     e. an analysis and interpretation of the assessment results (see attachment).
  6. Section V: Description of how the program has used the results of all assessments to improve candidate and program performance related to the following areas (see attachments):
     a. content knowledge;
     b. professional and pedagogical content knowledge, skills, and dispositions;
     c. impact on student learning; and
     d. other changes or plans for changes (e.g., curriculum changes, changes to policies, changes in procedures for recruiting and advising students).

ACCESSING SOURCES OF EVIDENCE

FOR THE SIX TO EIGHT COMMON ASSESSMENTS

Each program has identified six to eight assessments (ECE/ECSE has more) which address candidates’ content knowledge, professional knowledge and skills, impact on student learning, and professional dispositions. Faculty members can access the data needed for writing the Annual Program and Candidate Assessment Report through their program materials on rGrade or through the College’s Assessment site under “View Reports.” Faculty will report the data for at least two administrations of each of the 6-8 key assessments used in their program, and will analyze and discuss the data in both Sections IV and V (see attachments).

  1. Content Knowledge: Licensure subject matter assessments (Praxis II). Program completer pass rates on Praxis II are 100%, but we need to monitor applicants’ success in taking and passing the exams. For most programs, this means evaluating how many applicants are not admitted to student teaching or to the Teacher Education Program because they cannot pass Praxis II. You can find this information in Dr. Smolova’s report, Summary of Attrition Behavior by Program Area (2710 through 2920), and in other reports available on the College’s Assessment site under “View Reports.” The number and percent of applicants to the Teacher Education Program passing Praxis I and Praxis II for Fall 2005 through Fall 2009 can also be found there under “View Evaluation Data.”
  2. Content Knowledge: Examples include review of applicants, comprehensive exams, portfolio tasks, evaluations by clinical supervisors and cooperating professionals, and in some instances, grades in courses.
     a. Teacher Preparation Programs: Departmental Review of Applicants is included as a program-based assessment in teacher preparation programs and can be found in rGrade.
     b. Teacher Preparation Programs: a content knowledge assessment is included in the evaluation of interns completed by cooperating professionals and university supervisors. For most programs, this appears as category VI in the Student Teaching Evaluation and Early Field Experience Evaluations. These are accessible in your rGrade materials.
  3. Planning: Examples of the types of assessments of candidates’ ability to plan include:
     a. a teacher candidate’s ability to plan instruction based on information about students, the curriculum and SOLs, the community, etc. (evaluations of lesson plans, unit plans, work samples, etc.);
     b. a school leader’s ability to develop school improvement plans based on studies or data;
     c. a special educator’s ability to develop Individual Education Plans; and
     d. a speech-language pathologist’s ability to diagnose speech and develop an intervention plan.
     e. Programs will have a program-specific assessment of candidates’ ability to plan, which can be accessed in rGrade.
  4. Instruction/Implementation: a primary example would be supervisors’ and cooperating professionals’ evaluations of interns’ performance during clinical experiences.
     a. Teacher Preparation Programs: Evaluations of Interns for Early Field Experience and Student Teaching are available in rGrade. Programs for other school personnel should have clinical evaluations of interns available in rGrade.
  5. Impact on student learning and on the learning environment: Examples include: teacher candidates conduct pre- and post-assessments of PreK-12 students, summarize and interpret the results, and discuss how they can use the results to improve student learning and their teaching; candidates for other school roles demonstrate the ability to collect, compile, analyze, and interpret data which can guide their practice in improving student learning and school environments for learning.
     a. The assessments for this area are specific to individual programs and should be available in your rGrade materials.
     b. Teacher Preparation Programs: candidates complete a survey during Assessment Day on Impact on Student Learning and on Parental Involvement. The survey includes data on both of these topics and is available, with results disaggregated by program area, on the College Assessment site.
  6. The #6 assessment varies by program. We suggest programs include an assessment of professional characteristics/dispositions. This is needed for the NCATE Institutional Report.
     a. Teacher Preparation Programs: the Evaluations of Student Teaching Interns and Early Field Experience Interns include a category on Professional Characteristics and Dispositions and can be accessed through rGrade.
  7. The #7 assessment varies by program. We suggest programs include results of employer surveys/graduate follow-up surveys.
     a. Results of a university-wide employer survey, disaggregated for the college, are posted on the College Assessment site.
     b. Results of a 2005 Alumni report (2000-2005) are posted on the College Assessment site.
     c. The Director of College Assessment is conducting an Employer Survey during July, August, and September 2010. Results will be shared.
     d. If your program has conducted employer or alumni/graduate surveys, please send the survey instrument and the results to Alona and copy me.
  8. The #8 assessment is optional in most programs.


SECTION II— LIST OF ASSESSMENTS

In this section, list the 6-8 assessments that are being submitted as evidence for meeting state and national standards. All programs must provide a minimum of six assessments. If your state does not require a state licensure test in the content area, you must substitute an assessment that documents candidate attainment of content knowledge in #1 below. For each assessment, indicate the type or form of the assessment and when it is administered in the program.

Name of Assessment[1] / Type or Form of Assessment[2] / When the Assessment Is Administered[3]
1 / [Licensure assessment, or other content-based assessment]
2 / [Assessment of content knowledge]
3 / [Assessment of candidate ability to plan, programs, interventions, etc.]
4 / [Assessment of candidate’s ability to implement—e.g., student teaching, clinical experiences, etc.]
5 / [Assessment of impact on student learning]
6 / [Assessment of candidates’ professional characteristics and dispositions]
7 / [Alumni and employer surveys]
8 / [Additional assessment related to standards]

SECTION III—RELATIONSHIP OF ASSESSMENT TO STANDARDS

Enter each national or state standard in the chart below and identify the assessment(s) from Section II that address that standard. One assessment may apply to multiple standards.

Title of Standards (e.g., NCTM Standards or VA Standards for Music Education) / APPLICABLE ASSESSMENTS FROM SECTION II
□#1 □#2 □#3 □#4 □#5 □#6 □#7 □#8
(Repeat this row of checkboxes for each standard listed in the chart.)

SECTION IV—EVIDENCE FOR MEETING STANDARDS

At this time, a key goal is to collect the following information for each of the 6-8 assessments listed above:

1. A description of the assessment and how it aligns with the standards;

2. The evaluation tool for the assessment (aligned with standards);

3. Results of this particular assessment; and

4. A brief discussion of the results of this particular assessment (if there are results).

DIRECTIONS: The 6-8 key assessments listed in Section II must be documented and discussed in Section IV. The assessments must be those that all candidates in the program are required to complete and should be used by the program to determine candidate proficiencies as expected in the program standards. In the description of each assessment below, the SPA has identified potential assessments that would be appropriate. Assessments have been organized into the following three areas that are addressed in NCATE’s unit standard 1:

  • Content knowledge[4]
  • Pedagogical and professional knowledge, skills and dispositions
  • Focus on student learning

For each assessment, the evidence for meeting standards should include the following information:

1. A brief description of the assessment and its use in the program (one sentence may be sufficient);

2. A description of how this assessment specifically aligns with the standards it is cited for in Section III;

3. A brief analysis of the data findings;

4. An interpretation of how that data provides evidence for meeting standards; and

5. Attachment of assessment documentation, including[5]:

(a) the assessment tool or description of the assignment;

(b) the scoring guide for the assessment; and

(c) candidate data derived from the assessment.

The narrative section for each assessment (items 1-4 above) is limited to two text pages. Each attachment for a specific assessment (5a-c above) should preferably be limited to the equivalent of five text pages; however, in some cases assessment instruments or scoring guides may run longer than five pages.

SECTION V—USE OF ASSESSMENT RESULTS TO IMPROVE

CANDIDATE AND PROGRAM PERFORMANCE

In contrast to the discussion of the results of each individual assessment, this section looks at the broad picture and the faculty’s interpretations of the assessment results overall, and how this information is used to improve candidate performance and programs.

Evidence must be presented in this section that assessment results have been analyzed and have been or will be used to improve candidate performance and strengthen the program. This description should not link improvements to individual assessments but, rather, should summarize principal findings from the evidence, the faculty’s interpretation of those findings, and changes made in (or planned for) the program as a result. Describe the steps program faculty have taken to use information from assessments for improvement of both candidate performance and the program. This information should be organized around (1) content knowledge, (2) professional and pedagogical knowledge, skills, and dispositions, and (3) student learning.

(response limited to 3 pages)


[1] Identify assessment by title used in the program; refer to Section IV for further information on appropriate assessment to include.

[2] Identify the type of assessment (e.g., essay, case study, project, comprehensive exam, reflection, state licensure test, portfolio).

[3] Indicate the point in the program when the assessment is administered (e.g., admission to the program, admission to student teaching/internship, required courses [specify course title and numbers], or completion of the program).

[4] In some disciplines, content knowledge may include or be inextricable from professional knowledge. If this is the case, assessments that combine content and professional knowledge may be considered “content knowledge” assessments for the purpose of this report.

[5] All three components of the assessment – as identified in 5a-c – must be attached, with the following exceptions: (a) the assessment tool and scoring guide are not required for reporting state licensure data, and (b) for some assessments, data may not yet be available.