From the archives: performance funding incentives without funding

By Dan Lang

The current initiatives of the government of Ontario in the direction of performance funding based on “performance indicators” and the extension of “corridor funding” to colleges are not as new or innovative as they might appear to be for provincial post-secondary policy. The first proposals for corridors and performance funding based on system-wide metrics can be found in the 1983-1984 report of the Commission on the Future Development of the Universities of Ontario, commonly known as the Bovey report after the chair of the commission.

Is “performance funding without funding” the unexplainable oxymoron that it otherwise appears to be? Usually it is, but not when connected to what the commission called “differential corridors,” access to which was contingent on a metric that today would be called a performance indicator, and indeed may be one of the indicators that the government will put in place to coincide with the next round of Strategic Mandate Agreements.

The Commission’s recommendation was not acted on initially. Corridor funding, however, was introduced six years later with only two differences: there was only one corridor — three percentage points above or below a “mid-point” based on weighted funding units (BIUs), the same for all universities — and no contingent threshold. The scheme, for the first time, decoupled the provincial funding formula from determination of the overall basic operating grant. From that point forward, although the formula was designed to determine the overall system-wide operating grant as well as allocate the grant, the only future use of the formula was to allocate the operating grant, however determined. The current plan is to leave the university corridor unchanged, and extend it to colleges asymmetrically — plus three percentage points and minus seven — the strategic objective being to buffer small, northern, and rural colleges against declining student demand.

The Bovey commission had a different idea: “multiple corridors.”

The first, identified as “instruction intensive,” was like the current corridor arrangement, the only difference being the parameters: four percentage points above or below a “baseline.” (The term “midpoint” was introduced in 1990.) This corridor was in practical effect a default option, accessible to all institutions; no threshold test resembling a performance indicator was required. Increases in enrolled funding units above four per cent would be funded, calculated as a three-year rolling average from the upper band of the corridor, not from the baseline. Decreases, although moving in the opposite direction, were calculated the same way.
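
As a rough illustration of those mechanics, here is a minimal sketch in Python of a single plus or minus four per cent corridor. The function names, the 10,000-BIU baseline, and the exact treatment of the band edge are assumptions made for the example; only the four per cent band, the three-year rolling average, and the measurement from the band rather than the baseline come from the description above.

```python
# Illustrative only: a hypothetical single corridor of +/- 4% around a
# baseline of weighted funding units (BIUs). Names and numbers are assumed.

def corridor_adjustment(enrolled_biu, baseline_biu, band=0.04):
    """Funding-relevant BIUs: changes are measured from the band edge,
    not from the baseline, and enrolment inside the corridor has no effect."""
    upper = baseline_biu * (1 + band)
    lower = baseline_biu * (1 - band)
    if enrolled_biu > upper:
        return baseline_biu + (enrolled_biu - upper)   # growth beyond the upper band is funded
    if enrolled_biu < lower:
        return baseline_biu - (lower - enrolled_biu)   # decline below the lower band reduces funding
    return baseline_biu                                # inside the corridor: no change

def three_year_average(history):
    """Rolling three-year average of enrolled BIUs."""
    recent = history[-3:]
    return sum(recent) / len(recent)

# Example: a 10,000-BIU baseline and three years of actual enrolment.
funded = corridor_adjustment(three_year_average([10_200, 10_600, 10_800]), 10_000)
print(round(funded))  # 10133: only the excess above the +4% band (10,400) counts
```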

The second and third corridors, called “research intensive,” were more innovative and strategic. Both had a threshold “intensity” metric, which coincidentally looks much like some performance indicators in use today and planned for Ontario. The minimum threshold for the first research intensity corridor was that ten per cent of a university’s operating income had to come from a rolling three-year average of grants from the federal research granting councils. Crossing the threshold expanded the plus or minus four per cent default corridor to plus or minus six per cent. Exceeding a higher threshold of 15 per cent of operating income secured eligibility for a further plus or minus eight per cent “research intensive” corridor. At the time of the report six universities met the minimum research intensity metric: Guelph, McMaster, Queen’s, Toronto, Waterloo, and Western. Two, McMaster and Toronto, met the 15 per cent threshold.
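
The threshold logic amounts to a simple mapping from research-income share to corridor width. The sketch below is a hypothetical rendering of that mapping; the function name and the use of greater-than-or-equal comparisons are assumptions, while the ten and 15 per cent thresholds and the four, six, and eight per cent bands follow the description above.

```python
# Hypothetical mapping from research-income share to corridor width,
# following the thresholds described above; the function name is assumed.

def corridor_band(research_income_share):
    """Corridor width given the three-year average share of operating
    income from the federal granting councils."""
    if research_income_share >= 0.15:
        return 0.08   # second "research intensive" corridor
    if research_income_share >= 0.10:
        return 0.06   # first "research intensive" corridor
    return 0.04       # "instruction intensive" default

print(corridor_band(0.12))  # 0.06
print(corridor_band(0.17))  # 0.08
```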

Although the Commission calculated into which corridor each university would fall, universities that met either “research intensive” threshold could opt for the “instruction intensive” corridor. The reverse, however, was not possible.

This was in practical effect performance funding without public funding, which is why performance funding is sometimes called “incentive” funding. For “research intensive” universities the fiscal incentive was not more funding but greater autonomy to redirect existing income generated under the province’s enrolment-driven formula from instruction to research. For “instruction intensive” universities the incentive was to optimize revenue by managing enrolment according to net revenue per student place made possible by program “weights” within the province’s funding formula: fewer students in higher weight programs could displace more students in lower weight programs. That universities might do this was not unforeseen by the Commission. The principal alternative considered and analyzed by the Commission was recalibration of the program weights within the funding formula. The alternative was found to be too disruptive and too limiting to promote institutional differentiation and autonomy. A little-known fact, then and now, is that the Ontario operating grants formula is explicitly about only the allocation of revenue to each institution. Although it is possible to use the formula to allocate funding down to the program level, there is no requirement that universities do so. If there were, corridors would have very little strategic value.
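
The enrolment-management incentive can be made concrete with a small worked example. The program weights and headcounts below are invented for illustration; the point is only that weighted funding units are headcount multiplied by a program weight, so the same formula revenue can be earned from fewer, higher-weight student places.

```python
# Invented weights and headcounts, for illustration only: weighted funding
# units (BIUs) are headcount times a program weight, so the same formula
# revenue can come from fewer, higher-weight student places.

weights = {"arts": 1.0, "engineering": 2.0}

def weighted_units(headcounts):
    """Total weighted funding units for a set of program headcounts."""
    return sum(weights[p] * n for p, n in headcounts.items())

before = {"arts": 4_000, "engineering": 1_000}   # 6,000 BIUs from 5,000 students
after  = {"arts": 2_000, "engineering": 2_000}   # 6,000 BIUs from 4,000 students

assert weighted_units(before) == weighted_units(after)  # same BIUs, 1,000 fewer places
```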

That there could be performance incentives without public funding was unusual at the time, but, later, public universities in Spain, Switzerland, and two American states entered “performance contracts” with government to trade decreases in enrolment-driven public funding for increases in autonomy, in some cases to do exactly what the Commission foresaw. The idea was that greater autonomy would expand the universities’ access to alternative sources of funding, promote system-wide differentiation, and enable innovation by decoupling formulaic revenue from expense. The government, in the simplest sense, avoided the cost of a financial incentive by trading away some regulatory power in return for institutions’ advancing public policy by “performing” in response to the incentive of expanded autonomy.

Four years after the installation of corridor funding in Ontario, the government of the day introduced “key performance indicators” to which, initially, no funding was attached. They were, however, like the Bovey commission’s thinking, intended to have a fiscal effect on institutional behaviour. The basic idea was that students as consumers needed to know more about the province’s colleges and universities. When economist Michael Spence received the Nobel Prize for his work on markets with asymmetric information, he was asked by a journalist “whether it was true that you could be awarded the Nobel Prize in Economics for simply noticing that there are markets in which certain participants don’t know certain things that others in the market do know?” The answer, of course, was yes: the observation may have been simple, but the degree of asymmetry was surprising, and the market for higher education is highly asymmetrical.

Thus, the original idea behind the Key Performance Indicators was to strike a balance of information between buyers and sellers in a market for higher education. The government’s reasoning was that if the information provided by the key performance indicators was added to the information already made available in the market by universities and colleges, students would then make better choices, and, in theory anyway, select programs and institutions with higher employment rates, lower default rates, and so on. No changes were then proposed either in the province’s funding for universities or in the formula by which the funding was allocated. In other words, a performance funding incentive without more public funding. The cost to government was not financial. In this case the government avoided the cost of a financial incentive by ceding some regulatory power to a more symmetrical market. Funding, as an incentive, would follow student choice as students, better informed, opted for higher performing colleges and universities.

What goes around, comes around . . . maybe.
