December 13, 2007
Summary of Responses of the Academic Colleagues
to the Van Loon Review of OCGS
prepared by Marilyn Rose, Co-Chair
I was asked by the Executive Committee to invite the Academic Colleagues to comment on the Van Loon Review of OCGS. This report was circulated to them subsequent to the Council meeting of November 2, 2007.
A number of the Colleagues reported that they did not feel they could respond adequately because (a) they had not had time to digest the report fully and (b) they really didn't feel informed enough to comment without benefit of a full discussion at one of the Academic Colleague meetings.
Certainly the responses received indicate that there is the potential for the Colleagues to offer a great deal of useful comment on the report if given time to process and discuss the document more fully. One of the Colleagues made this point very clearly:

"As I am in the middle of grading final exams, I regret that I do not have time today for a detailed response to the Van Loon proposal regarding OCGS. But then, that may be the most important comment that I can make at this time. It is unwise to make hasty decisions of this magnitude without the input of the academic community, especially those most knowledgeable about OCGS and the review processes for both graduate and undergraduate programs. And yet this is one of the busiest times of year for those of us still in the classroom. Please urge patience, and deferral of any binding decisions until after we have had time to discuss the proposal and its implications."

What follows is a compilation of comments that were sent to me, as late as this evening, gathered under headings that should make them easier to digest. I have paraphrased at times, quoted quite directly at other times, combined comments by different individuals under specific headings where that seemed appropriate, and edited for length. I hope my colleagues will feel that I have fairly transcribed their thoughts: I have done my best to be true to the language and the spirit of the originals.
The result is not an elegant document. In fact it probably has all the grace of a camel -- i.e. a horse designed by a committee. But it is the best that I could pull together in the time available.
1. Process:

•On p. 40 of the report it is stated that "The mission of the Ontario Universities Degree Quality Assurance System should be debated by the Council of Ontario Universities," and on p. 41 that "These provisions of the draft constitution should be approved by COU in order to give effect to the mandate of OCAV to manage the system."

•The question then raised is whether the Executive Heads' deliberations on the 13th will constitute the approvals referred to in the report, or whether the approval on the 13th is "approval in principle," such that the detailed approvals will follow later in the process and involve full council.

2. Over-emphasis on Reporting Relationships and Organizational Structures:

•The terms of reference for the review comprised 16 items. The report talks almost exclusively about one of them: #16, which is about reporting relationships. The report in practical effect leaves most of the work of addressing the other terms of reference to a new Executive Director of Quality Assurance. Many of the problems that the other terms of reference imply arose from the practice of leaving such issues to the OCGS Secretariat, so there is little obvious progress.

•Other than calling for “streamlined” guidelines and internal periodic reviews, the Van Loon report does not really discuss process. Instead, the report is essentially about organization: who reports what to whom. The report calls for guidelines that each university should follow in preparing briefs for new programs, and further says that the HEQACO secretariat should vet the briefs to ensure that the guidelines have been followed. This, presumably, amounts to a restatement of the current OCGS practice.

•There ought to be some simplification and clarification of the existing regulations. For example, the “fields” and “programs” debate should be resolved. The so-called “one-third rule” (Section 10.4.4.) also needs to be clarified because it is frequently misunderstood. But even with such simplifications and clarifications of the status quo, the sum of the old and the new will be too great to be construed as streamlined. Getting new guidelines right is more important than pushing hard for streamlining.

3. The Role of OCAV:

•One concern is that OCAV will not have the time to devote to graduate issues. It meets only 4-5 times per year and Provosts have many areas to oversee. How can they possibly bring the attention to graduate review issues that OCGS has brought to quality assurance and other graduate issues over the years? Even OCGS has a hard time keeping up with the workload and this is its single focus.

•The membership of OCAV will change significantly over time, and the attention paid to grad studies by VPs Academic may wax and wane, depending on the pressing issues of the day. It is appropriate to be concerned about continuity of engagement in grad issues on the part of this busy body (no pun intended).

•The report does say that OCAV should ratify the guidelines, but the workload of OCAV is now such that it is difficult to imagine that body sweating the details of the guidelines. The review committee obviously didn’t want to do it themselves. Maybe there is a role for Academic Colleagues here.

4. Academic Colleague Representation:

•On p. 49 there is provision for an Academic Colleague to sit on the proposed review panel, and on p. 52 provision for an Academic Colleague to sit on the transition Task Force.

•Both of these suggestions are in keeping with the role the Colleagues seek to define for themselves on Council, and there is definitely individual experience and expertise among the Colleagues such that they could contribute effectively to these bodies.

•Other comments about the potential for involvement of Academic Colleagues are embedded in other items in this collective response.

5. Changes to the Undergraduate Review System:

•The proposed introduction of a new undergraduate program assessment process falls outside of RVL’s mandate, undermines the autonomy of Ontario universities, which at present are not required to undergo external reviews at the UG level, and will likely eliminate any efficiency savings.

•It is difficult to imagine that universities would endorse the introduction of a PEQAB-type of assessment process for new undergraduate programs, thereby forfeiting individual autonomy.

6. Parallelism between Undergraduate and Graduate Reviews:

•At present there is an audit system in place for undergraduate program reviews (UPRAC) and an assessment system (OCGS) for graduate programs. The Report recommends parallelism in that all UG and Graduate program reviews for continuing programs would undergo audit and all new UG and Graduate programs would undergo assessment by an outside body. This is a mixed model that does not make sense: we need to move to an audit or an assessment process, but not a combination of both.

•In arguing that the approaches to assuring quality for graduate programs should be the same as those for undergraduate programs because that is how all other systems are structured, the report is glib. It fails to admit that virtually all other systems are differentiated in some way and by some means other than quality assurance. In other words, there is some kind of third party validation of mission statements. Maybe Ontario should have such validation, but it doesn’t. Nor should the appraisals process – new or old – be the instrument by which missions are validated.

•In practical effect, the OCGS Appraisals process, because it is different from the undergraduate process and because undergraduate missions are very similar, is itself a means of sharpening the definition of quality.

7. Relationship with MTCU:

•There is risk in changing organizational names and structures unnecessarily. As the report correctly points out, the new scheme will require government endorsement. The last thing that we should want is government intervention that might change the long-standing rule that “good quality” entitles a program to funding.

•If changing the organization chart would also significantly improve the actual process of appraisal, the risk might be worth taking. But as the report now stands, with no specifics about process, the risk seems too great if the only real change is the substitution of OCAV for OCGS.

•Indeed there is no discussion in the report of the relationship between the proposed changes and the way that universities will deal with MTCU under the proposed new regime. This is a serious deficiency.

•One of the main arguments in the Report for changing our independent assessment process toward a greater reliance on audits is the argument that many other jurisdictions rely principally on audits. However, many of these jurisdictions have a greater involvement of government. The OCGS process is unique not only in its form but also in the role that it plays as a buffer institution.

•By bringing graduate reviews under the purview of OCAV, OCGS may lose -- or appear to lose -- its relative independence. If it loses its role as a buffer body, over time MTCU might feel the need to intervene more directly in grad studies.

•It should be noted that MTCU has not expressed any lack of confidence in the OCGS approval process as a basis for making funding and other decisions. Why would we want to upset that applecart?

8. The Makeup and Mandate of the New Program Committee:

•It is suggested that the committee be made up of “4 to 6 senior academics or academic administrators from Ontario”. This represents a shift away from academic peer review through the inclusion of academic administrators.

•This committee then becomes gatekeeper for the entire Ontario system. It is fair to raise concerns about potential disparities between the treatment of larger universities with many established graduate programs and smaller universities that are expanding their graduate programs. The advantage would almost certainly go to the larger, well-established schools, whose history and track records would likely sway such a small (and busy) committee to ask fewer questions and impose fewer hurdles on them than on smaller, aspiring institutions.

9. The Workload of the Proposed New Program Committee:

•If there is to be a New Program Assessment Committee, we will have to ensure that it will not be a bottleneck. OCGS saw over 100 new programs approved last year. Even an average of 3 new programs per institution would net some 60 per year. How often will this committee meet, and how much review will it be called upon to do?

•The report states that this committee would continue to act like the existing OCGS appraisal committees. But there are 4 (at times as many as 5) Appraisals Committees. How could a single committee handle all of the new program reviews in the province in any given year?

•We will also have to have assurance that this new program committee will fairly represent the entire province and its different kinds of institutions. A committee of 4-6 is surely too few to assure fair representation.

10. The Make-up of the Transition Task Force or Committee:

•This is a critical issue. The report leaves the question of the composition of the implementation committee quite open. But, as with the New Program Committee, it specifies that only one graduate dean would be included.

•John ApSimon made the point in the OCGS meeting with Dr. Van Loon that at least three graduate deans should be on the task force or committee. The graduate deans are the ones who have been responsible for the process throughout the history of OCGS and they are most familiar with quality assurance and the complexities of graduate education in this province and elsewhere.

•If there were to be 3 or 4 graduate deans on these committees, they could be selected to represent the large and small universities.

11. The Mission-Statement Test:

•The report suggests that the system make use of each university's mission statement in the audit process for established programs as these set out "the primary goals of the institution." The report indicates that it would "not be the purpose of the quality assurance process to evaluate the mission statement per se but rather to assess the fit of programs and functions with that mission" (p. 43). Such an assumption is naïve on two counts.

•First, existing mission statements are deliberately vague; they allow room for anything an institution might wish to do. Ontario’s universities are living in an age of opportunism. Their mission statements are far more aspirational than empirical.

•Second, there are no apparent regulatory means of preventing an institution from declaring its mission to be anything it wants it to be. Thus the mission statement, instead of being a benchmark for the “fit” of new graduate programs, is really a moving target. This will further complicate the appraisals process instead of improving it. “Fit” is not a measure of quality.

•Moreover, a worry has been expressed by graduate deans from smaller universities that this provision could be used to reinforce a two-tier system with some universities being seen as primarily undergraduate while others are seen as research universities. The argument is that the mission statements of smaller institutions with long histories as undergraduate institutions will reflect that history and those past strengths. At the same time their relatively small graduate programs (and especially their doctoral numbers) will make it difficult to craft mission statements that focus on the preeminence of graduate programming as would be the case with already established research institutions. This would contribute to the sense of a two-tier system in the province with the “tiers” being treated unequally.

•Differences in mission statements have nothing to do with the relative dedication of different schools to graduate program excellence, nor with the commitment of their research-oriented faculty to graduate studies, nor with the general quality of the experience they offer to graduate students.

•Mission statements are historical statements by nature; institutions that are in change mode are difficult to characterize in mission statements -- which are in any case a rather outmoded concept when it comes to describing the polymath nature of today’s universities with their very complex “missions.”

12. The Problem of Fields:

•The report alludes to the problem of differentiating between “programs” and “fields” but does not really resolve it. This is a genuine problem, but all the report says is that the current definitions should be a “starting point.” That is not helpful.

•Consider also that by regulating what the OCGS rules call “advertising” the Appraisals Committee is in effect limiting competition and protecting marginal programs. What the program or field is called sometimes becomes more important than what it is or how it is delivered. The de facto connection of advertising with quality virtually forces institutions to declare fields when they otherwise would not think in those terms functionally.

13. External Consultants:

•The report slides too quickly over the appointment of external consultants by the universities. In my experience on the Appraisals Committee, this is an area in which universities with marginal programs may cut corners and in which the Appraisals Committee plays an important role.

•The report is not clear about the point at which the new Appraisals Committee could exercise judgment about the appointment of consultants. That point should come right at the start of the process, given the impracticality of retroactive correction.

•There is no indication that the Appraisal Review considered the inclusion of external members on the two New Program Appraisals Committees. Given the workloads and frequency of meetings of the current Appraisals Committee, having external members was impractical. But under the recommended arrangement it would be practical, and could improve the process.

•Perhaps as an alternative to external consultants, an Academic Colleague or two should be on these committees.

14. Definition of "Quality":

•The report recommends that the new internal assessment process should, in addition to demonstrating quality, provide assurance about “student academic support, support for disabled students, library services, and registrarial services.”

•Without suggesting that anything on this list is unimportant, it is a fact that none of them – with the possible exception of library services – is about quality as it is understood under the current OCGS appraisals process.

•This is not hypothetical. The Appraisals Committee sometimes sees proposals that blur the definition of quality by discussing corollary factors. It may be that this section of the report was not carefully drafted, but if it implies a new definition of quality, it should receive careful and serious attention now.

•Neither Section 3.3 of the UPRAC guidelines nor Section 6 of the current OCGS appraisal procedures includes items such as those listed in the report’s apparently expanded definition of quality.