Dear Task Force Members,

My apologies for joining this discussion so late. I’m still digging out from the mound of work that accumulated while I was ill.

I have three basic reactions to the report sent to us by Bob Forrester on January 10, 2003.

  1. The report is very useful in getting the Task Force started in its activities. It identifies many of the major issues relevant to the UHM research enterprise and puts forth interesting, preliminary analysis of them. In particular, the five broad categories of issues identified on p. 3 are well formulated.
  2. The report is not the Task Force’s report. The Task Force only met once, on December 20, 2002, for less than 90 minutes. At the initial meeting we verbally commented on a long and very useful PowerPoint presentation prepared by Bob Forrester, which was provided to the committee approximately a week prior to our meeting. At the end of this meeting, Bob asked us to provide written commentary on his PowerPoint presentation, and four of the five members of the task force provided commentary on a variety of issues. Bob then sent a draft task force report to task force members on January 10, 2003. On each page, the report ascribes numerous views to the task force; it extensively cites the e-mail commentary provided to Bob by task force members between December 21, 2002 and January 9, 2003; and it comes to specific conclusions and policy recommendations. While the report is very useful to the task force, it cannot be considered a draft task force report for several reasons.
  • The task force met just once.
  • The task force did not extensively deliberate on any issues.
  • The task force did not deliberate on how it would organize; conduct its business; identify priorities and policies for the UH research enterprise; and assemble draft and final task force reports.
  • The task force consultant has only met briefly with UH researchers and administrators—just from November 12-15, and from December 12-13, 2002—six days of interviews.
  • The task force did not commission or conduct any research to investigate any of the “conclusions” stated in the report.
  • The task force members have not had a chance to consider the role of the consultant in the work of the task force.
  • Most of the e-mail comments cited in the report are casual comments rather than the result of careful and measured deliberations and research.
  • The task force report heavily cites my report on the University as a Catalyst for Economic Development—but no other member of the Task Force has seen that report.
  • The task force report discusses the business plan that SOEST put together to justify constructing more research facilities at UHM—but the task force members have not explicitly reviewed or discussed this plan.
  • The task force members have not considered or discussed the report by the UH Task Force on Re-structuring the Administration and Infrastructure of Research (July 21-November 8, 2001).

The members of the task force must meet soon to address these issues. At this stage, the task force has not had a central role in molding “its” report in any way. Priorities and policies have been assembled by Mr. Forrester, and the task force members have merely provided comments on them rather than directing the scope, scale, and vision of the report.

  3. Even if the task force agrees that the report has identified the important questions to be addressed, the task force must conduct and commission additional research to test its conclusions and policy recommendations regarding the UH research enterprise. For example, the report states (p. 9) that the task force members “recognize that some faculty members might be viewed as instructors, others as researchers, a few as both.” Actually, the University evaluates and treats about 70 percent or more of us as both, as the report itself later recognizes (p. 13). Or: “There was support for a review of the allocation of state-funded positions in the ORU’s, and further, for ‘larger scale academic triage’ in which UH would ‘shrink bloated, outmoded or unproductive departments by attrition or possibly by absorbing them into other units.’” Support from one member? Two? The entire task force? What constitutes outmoded? Humanities? Whatever discussion of this there was could not have been longer than 2-3 minutes. These two examples are illustrative of the entire report. Its conclusions and recommendations have not been subjected to any type of testing by the task force and are, at this point, pretty much worthless.

The report is premature. It puts words in our mouths. It is not the result of our inquiry or analysis. It should not be presented to the Chancellor.

Those points made, here are some specific comments on the report.

  1. The slide showing the many different measures of university support by the state. Just because there are lots of potential measures of support doesn’t mean that they are all equally good. One must first ask what question is being asked—and if the question is too general, clarify it. Usually when this question of state support is being asked, it’s important to scale the response to the size of the state. That rules out total appropriations for education (the measure in the first column). Higher education as a percentage of state tax revenue (column 4) is highly flawed, as some states fund K-12 education (Hawaii) and others don’t. Higher education expenditures per $1,000 of income (column 3) is more useful, as it scales expenditures by capacity to pay; but this yields strange comparisons in which poor states with poor higher education systems look as good as rich states with excellent higher education systems. The only measure that has some relevance is higher education support per capita (column 2). This measure is also flawed, as it appears to use undeflated data, i.e., it doesn’t adjust for price differences across states. What we want to know is how much higher education each taxpayer/citizen is able to buy. Undeflated expenditures give erroneous results for states with particularly high/low price levels—read: Hawaii.
  2. The slide showing that UH does more state and local research than other universities. This is particularly interesting because the state restricts its overhead rate to 5 percent. Thus, the state appears to be unwilling to provide appropriate research facilities for its own research activities!
  3. p. 9. “Members point out that an investment in strengths will ‘build momentum for other programs where University is not so strong.’” Interesting idea—is it correct? Has it worked at other universities? How exactly would it work? Peer pressure? Taxes on the good departments? Increases in overall reputation will raise the boats of low-quality departments?
  4. p. 7. Establishing a URAC to encourage dual-use technologies. This section discusses the benefits of establishing URAC, but what are the costs and obstacles?
  5. p. 3, bottom of page. What do you mean by “revenue per UH student”?
  6. p. 4. “Many of the federal agencies have identified research priorities that are interdisciplinary.” Good point. So how does this relate to picking programs to get additional funds?
  7. p. 5. I have only a vague idea of what it means to “build from the peaks to the stars.” This is a metaphor, not a policy.
  8. p. 5. “Are there promising areas of natural strength in which the University does not have a significant position?” Excellent question. But the task force only spent about 90 seconds on this question. The answers—Ecology, Evolution, and Conservation Biology; cancer, applied defense research, and expansion of social [sciences] research—are all interesting. But no systematic look at University programs has been conducted yet.
  9. p. 7. “Is there appropriate coordination with the state’s economic plans?” This is an excellent question, and the recommendation that the mechanisms for coordination among various actors need to be strengthened is right on the mark. But how to do this is not easy. It’s not clear that the state has communicated its priorities very well to UHM. When it has issued reports, they tend to emphasize many, many priorities.
  10. p. 8. Having the plans for economic development come from the units themselves is a great idea.
  11. The report pays little attention to the synergies or interaction between instruction, research, and economic development. This may be because of its initial assumption that these endeavors are essentially separable. In the majority of cases, not true.
  12. p. 8. I’m unclear exactly what is meant by the “dollar density” of research. If it is research dollars per square foot of research space, it wouldn’t mean very much. Research density will vary across campus with the mix of research endeavors and how space-intensive each needs to be.
  13. “The Plan”—to build new buildings and hope that we can attract star researchers to fill them—has some currency with the new medical school, but is a highly flawed venture for the rest of the campus. John Learned’s comments are on the mark here and I will not repeat them. The basic point is that a full package of housing, salary, colleagues, teaching load, space, travel, etc., needs to be in place to attract good people. And the emphasis on growing the research enterprise must be centered on attracting good people, not on building facilities.
  14. Increasing the UHM overhead rate is absolutely vital. The reason is that the rate is closely tied to the University’s support for research activities. The low overhead rate shows that the University is not adequately supporting research activities. It is absolutely vital that UHM support of research be increased immediately. As a side-effect, the overhead rate will rise!
  15. p. 11. Research is not “always a loser.” This is because it is often not a “stand-alone” business line. You cannot evaluate a single product line in a multi-product firm for profitability unless you can adequately assign costs to each line. And universities do that very poorly. (And sometimes it can’t be done when there are joint costs.)
  16. “The Plan” has never been carefully analyzed with respect to its specific assumptions. For example, does it use correct discount rates? Is the length of time assigned to building depreciation realistic? Does it account for taxes correctly in its revenue/cost streams? How is the risk associated with each research enterprise considered in the Plan? Neither this report nor the consultant’s PowerPoint has adequately addressed the nuts and bolts of this plan.
  17. p. 14. Any evaluations in yet on how well the partnership funds have worked? (I haven’t seen any.) They are probably a good idea, as they involve matching contributions by private firms—a sign that the firms are serious about the R&D projects.
  18. p. 14. The position of VP for Research seems to take on a central role. But much of what we are considering in this report is policy, not just execution.
  19. p. 15. “Should there be an external review?” Supposedly, the task force believes “that the University has an obligation to assure that investments are being made in areas where Hawaii has an advantage.” But what does this mean? We didn’t have an advantage in cloning mice. Or discovering the properties of neutrinos. Or studying the Japanese economy. This statement misses the fact that UH has the potential to attract clusters of researchers based on its climate, diverse population, casual lifestyles, etc.
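To make the deflation point in my comment on the support measures concrete: the same nominal per-capita appropriation buys less education in a high-price state. A minimal sketch, with invented figures (these are not actual state data, just an illustration of the adjustment):

```python
# Sketch: why undeflated per-capita spending misleads for high-price states.
# All numbers below are hypothetical, for illustration only.

def real_support_per_capita(appropriation, population, price_index):
    """Per-capita higher-education support, deflated by a state price index
    (index = 1.0 at the national-average price level)."""
    return appropriation / population / price_index

# Two hypothetical states with identical nominal per-capita support ($250):
state_avg  = real_support_per_capita(250_000_000, 1_000_000, 1.00)  # average prices
state_high = real_support_per_capita(250_000_000, 1_000_000, 1.25)  # prices 25% above average

print(state_avg)   # 250.0 real dollars per person
print(state_high)  # 200.0 -- the same nominal dollars buy less education
```

On a per-capita table that ignores price levels, these two states look identical; after deflation the high-price state (read: Hawaii) is supporting noticeably less real education per citizen.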
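On the discount-rate question I raised about “The Plan”: the financial case for a research building can flip from attractive to a loser on the discount-rate assumption alone, which is exactly why those assumptions need scrutiny. A minimal sketch, with invented cash flows (not the actual SOEST or medical-school figures):

```python
# Sketch: NPV of a hypothetical research building under two discount rates.
# Cash flows are invented for illustration, in millions of dollars.

def npv(rate, cashflows):
    """Net present value of a cash-flow stream; cashflows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project: $50M construction cost up front,
# then $3.5M per year in net research revenue for 30 years.
flows = [-50.0] + [3.5] * 30

low_rate  = npv(0.04, flows)  # at a 4% discount rate the NPV is positive
high_rate = npv(0.08, flows)  # at 8% the very same project has negative NPV
```

The same building, the same revenue stream, and opposite conclusions; a plan that does not state and defend its discount rate has not really been analyzed.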

The next steps (pp. 15-16) are promising. But most vital is that the task force take a step back and examine the report step-by-step. It is important that the task force write the next draft of this report.

Despite these criticisms, Bob has accomplished a great deal in developing the PowerPoint and this report. But the report should not leave this task force. We should write our own report, even if it is only used to update the Chancellor.