Executive Director’s Recommendations for Proposed Resolutions

University of Phoenix – Oregon Campus

On-Site Program Review

Educator Licensure Program

September 9-10, 2011

Introduction

On September 9-10, 2011, a TSPC on-site team reviewed the University of Phoenix’s Oregon-approved educator licensure program pursuant to OAR Division 10 to evaluate compliance with the Teacher Standards and Practices Commission’s established standards for program quality in Divisions 17 and 65. Division 10 governs the procedures for site visit reviews and recommendations to the Commission; Divisions 17 and 65 contain the standards for program review.

Following the review of the evidence, the on-site team found seven standards to be unmet:

·  584-017-0025 Evaluating & Verifying Candidate Competency

·  584-017-0050 Resources

·  584-017-0055 Practica and Student Teaching

·  584-017-0057 Internship Agreements

·  584-017-0060 Unit Personnel for the Program

·  584-017-0182 Intern Experience for Teachers

·  584-017-0185 Evidence of Effectiveness

TSPC received the University of Phoenix rejoinder on December 20, 2011.

Analysis of Unmet Standards

The portions of the standards that the site visit team found to be “unmet” are shaded below.

1.  584-017-0025 Evaluating and Verifying Candidate Competency X Unmet

The unit has filed with TSPC a plan which has been reviewed by the consortium for assessing the competence of each candidate for licensure.

(1) The plan includes both formative and summative assessments of competencies.

(2) The plan outlines procedures, criteria, and timelines for the assessments.

(3) The unit summarizes and analyzes assessment results with the results going to the consortium for recommendation.

Conclusion of the Team: The assessment plan for verifying candidate competency does include both formative and summative assessments. These can be seen in evidence from courses, evaluations of teacher work samples, and observations completed during clinical experiences. Procedures, criteria, and timelines are generally complete; however, some scoring rubrics were difficult to locate, and program candidates indicated that use by faculty is somewhat uneven. (emphasis added)

The program does not appear to summarize and analyze assessment data on candidate competency for presentation to the Consortium. The team found no evidence in Consortium minutes or presentation notes to indicate this occurs at regular meetings. Since a number of the Consortium members are also program faculty, they were unsure whether results were presented at Consortium meetings or at other meetings. Consortium members did not have an opportunity to make recommendations for the program based on candidate data. (Evaluation Report p. 7)

Evidence used for decision: Assessments of candidate competence in TaskStream; Forms for assessment of teacher work samples and clinical practice; Consortium minutes beginning in 2005; Conversations with faculty, Consortium members, and candidates. (Evaluation Report p. 7)

Institutional Report (IR) information (in pertinent part) (May 2011) [IR, pp. 23-24 & 27]: In response to subsection (2) of the rule: (There is nothing specifically on point in the IR regarding the irregular application of the assessment rubrics or the site review team’s inability to locate rubrics.)

As it applies to subsection (3) of the rule [IR pp.23-24 & 27]: Each campus employs content area chairs who also serve as full-time faculty. These area chairs oversee a content area committee composed of faculty that meets regularly to review programmatic, faculty, and student issues related to a specific program. These committees share best practices, instructional and assessment strategies, and updates regarding program policies and procedures. Committees also review candidate assessment data and submit critical analysis results to the unit as part of the continuous improvement process. (emphasis added) The unit compiles these analyses to guide Faculty Council and unit staff decisions on programmatic, assessment, and curricular decisions…

…Local advisory boards are utilized by campuses to identify, review, and evaluate modifications to programs based on changing district and state needs, recruitment of potential faculty, program policies and procedures, integral contacts and relationships in the state educational community, and relationship-building in local school districts. The advisory board is comprised of individuals from the local community including P-12 classroom teachers and administrators, school district personnel, state officials involved in teacher certification, current education students or alumni, current education faculty, and community or business leaders….

[IR, p. 27] Whose responsibility is it to summarize and analyze the data?

Unit staff members summarize and analyze the data, in particular the Dean, Associate Deans, Assistant Deans, and Curriculum Directors. This analysis is in consultation with assigned staff members from the Office of Learning Assessment who guide the process and provide general assistance with statistical analysis and interpretation.

Institutional Response (Rejoinder) to Site Review Evaluation Report dated December 20, 2011 [p. 2]:

To assist in a uniform assessment process for candidates, the University of Phoenix utilizes an electronic portfolio, TaskStream. This allows the unit to collect a wide variety of benchmark assessments throughout the program. Standards-based rubrics are used for assessment and to provide the candidate with feedback on his/her progress in meeting standards and proficiencies (see Appendix A - Standardized Rubrics). Assessments were selected to represent a range of opportunities for candidates to demonstrate attainment of all program proficiencies.

Executive Director’s Discussion: Subsection (2): The site review team found that subsection (2) of the rule was “met” in that the program’s assessment plan outlines procedures, criteria, and timelines for the assessments. However, candidate interviews revealed that the assessment rubrics were not evenly applied by faculty, and the team had difficulty locating some of the rubrics. In response to this finding, the university went to great lengths to show that it had rubrics for each of its assessment areas (content and pedagogy). [These data are provided in Appendix A, pp. 31-110, of the university’s rejoinder.]

However, the issue was not whether the rubrics existed; the issue was that the team found some scoring rubrics “difficult to locate.” Why some rubrics were difficult to locate was not addressed in the rejoinder.

In addition, the team learned through candidate interviews that use or application of the rubrics was “uneven,” indicating an inconsistent application of the detailed assessment plan to each candidate. The uneven application of the rubrics by faculty was also not addressed in the unit’s rejoinder. The rejoinder seems to suggest that use of the software program “TaskStream” ensures that the assessment plan is uniformly applied to all candidates by all faculty.

Finding: The unit needs to address two issues more clearly: the team’s inability to locate some rubrics during the visit and the uneven application of the rubrics. The Rejoinder is incomplete and nonresponsive with regard to monitoring consistent application of the unit’s assessment rubrics.

Subsection (3): This subsection provides: The unit summarizes and analyzes assessment results with the results going to the consortium for recommendation (as it relates to candidate assessment).

As with the extensive disclosure of the actual rubrics used, the unit provided flowcharts and procedures for data analysis. It is clear that the unit has a well-thought-out and systematic process for analyzing candidate data. But the unit’s own words indicate that the data analysis occurs primarily among university faculty, not in collaboration with the consortium. The rejoinder refers to an Assessment Calendar, but it does not indicate whether the Assessment Calendar existed at the time of the site visit or was created afterward. Again, the value and use of the calendar as it relates to the issues raised by the site visit team is not clear.

The unit did not indicate in the rejoinder how it intends to remedy the lack of records or evidence that candidate assessment data are being shared with the consortium. But sharing data with the consortium is not the only issue in this rule; providing data for the consortium to analyze and make recommendations on is also important.

While the issue of consortium membership is not a specific part of this particular standard, it is notable that the consortium is made up primarily of the unit’s graduates and faculty. In other words, the unit must take steps to solicit outside input with regard to candidate performance and how it relates to the program’s overall quality.

Finding: The unit did not provide any information in the rejoinder with regard to how it intends to regularly submit candidate performance data to the consortium for feedback and evaluation.

2.  584-017-0050 Resources X Unmet

The unit provides resources necessary to assure effectiveness and continuity of the program.

(1) The unit provides financial support for the program to include facilities, equipment, technology, support personnel and other resources.

(2) The unit provides support for professional development of faculty and school-based supervisors.

(3) The unit has a written agreement from each school district that provides field sites.

Conclusion of the Team: The unit does not provide the resources necessary to assure effectiveness and continuity of the program at the Tigard Oregon campus. The Tigard campus is set up as (an) independent budget unit within the larger University of Phoenix. All programs and support services on the campus, including the education program, are funded by the (Oregon) campus. This means all resources for facilities, equipment, technologies (sic), faculty and support personnel are funded by the Tigard campus.

Budgets are developed based on revenue projections for the campus. Student enrollment in the program is just one factor considered in the development of program budgets. The Tigard education program has three dedicated staff members: the program director, the field placement coordinator, and a program and data manager. (Other) Faculty members are considered independent contractors. The program has two faculty members who serve in the capacity of content area chairs. Staffing for enrollment and financial aid advising is provided by the campus as (part of the campus’s) general services.

Professional development is available to all faculty and staff. Funding for professional development is provided by the Tigard campus upon request. Faculty were aware of funding available for professional development, but also indicated it was not required. (emphasis added)

The unit has a written agreement from each school district that provides field sites.

Candidates in the program rate the advising as good to excellent. The team did not find any evidence that education program candidates experienced the advising practices noted in the securities lawsuit filed by the Oregon Attorney General’s Office.

The team noted inadequate facilities related to education technology. The unit did have one classroom with a Smart Board. A significant number of candidates and graduates stated they did not have an opportunity to learn and use technology, nor could faculty model best practices in the use and integration of technology in instruction.

Evidence used for decision: Interviews; Institutional Report; Inventory of educational classrooms.

Institutional Response (Rejoinder) to Site Review Evaluation Report dated December 20, 2011 [pp. 4-13]: Subsection (1): Technology: The rejoinder cites numerous examples in which the integration of technology is included in the syllabi for nearly all classes. Additionally, the rejoinder reviews the availability of technology for all students at the Oregon campus, e.g., a campus-wide computer center and advanced equipment in the classrooms.

It is notable that the rejoinder mentions a faculty meeting held on December 7, 2011, nearly three months after the site visit, to focus exclusively on the issues raised in the site visit report. The meeting resulted in a variety of possible next steps. [p. 13]

Executive Director’s Discussion: The Institutional Report and the unit’s Rejoinder both cite multiple opportunities for the use of technology, both as it relates to integration into candidates’ preparation and to faculty access to technological tools for teaching. Additionally, the unit cites multiple examples of how candidates must access program information through established software or online resources. However, even though the integration of technology has been inserted into the syllabi offered as examples, too many candidates and graduates cited a lack of opportunities to integrate technology into their class work and few examples of faculty modeling the use of technology. The fact that the Oregon campus runs a small program and nearly all faculty are adjunct independent contractors may be the reason candidates and former candidates raised the issue of lack of technology integration. The fact that the December 7, 2011 faculty meeting cited in the rejoinder produced several possible actions reflects an acknowledgement that these instructional best practices were not occurring. Additionally, other than citing the solutions proposed for integrating technology at the December 7, 2011 meeting, the rejoinder did not address a systematic approach for ensuring that those proposals would be fully implemented and monitored. Finally, after citing all the available resources, the unit notes that it held a focused meeting regarding the use of technology, apparently acknowledging that, while it believed it had provided ample opportunities for faculty and candidates to use and integrate technology into the curriculum, there is no system for monitoring and ensuring that the integration of technology is actually occurring.

Finding: The unit held a faculty meeting regarding the integration of technology but did not propose to the Commission how it plans to ensure these practices will be implemented and monitored. The unit assumes that because statements related to the use of technology appear in some course syllabi and technology is available at the campus facilities, faculty and students are purposefully accessing and learning how to use technology appropriately in the education environment. The rejoinder was therefore nonresponsive to the issue.

3. 584-017-0055 Practica and Student Teaching X Unmet

(1) The unit has a written agreement from each school district/agency that provides practica, student teaching and intern experiences.

(a) The agreement includes a commitment to select and assign qualified supervisors and provide suitable opportunities and adequate financial support for field experiences.

(b) The district agrees to assist the unit in evaluating work samples or portfolios and the success of candidates.

(c) The unit has TSPC approval for each Oregon non-public school in which candidates are to be supervised. Criteria for supervisors must meet the standards for school-based personnel for the program 584-017-0070(2).