The head of the Tertiary Education Commission admits the gains being made from the multi-million dollar Performance-Based Research Fund are “marginal” and whether it remains a worthwhile endeavour is an “open question”. 

Tim Fowler made the comments during an education select committee hearing.

“We’re getting to the point where very marginal gains are being made in comparison to what we saw over the first decade,” he told the committee.

Last year Auckland University was the biggest beneficiary of the $315 million fund, receiving $93.5m. Otago University was next at $62.8m, then Massey University with $39m and Victoria University with $36.6m. 

The fund has existed for 22 years and is a fixed pool of money distributed to tertiary providers according to a formula based on previous research performance.  

It does not fund specific research projects, but gives a lump sum to each tertiary institution to distribute as it sees fit.

However, preparing for this evaluation of performance is costly and time-consuming.  

Labour tertiary education spokesperson Deborah Russell said the fund was initially a way of capturing the quality of research coming out of tertiary institutions.  

“There was a concern then that perhaps we didn’t understand what research academics were doing, and we didn’t understand how they were contributing to the pool of knowledge and developing other researchers.”

[Chart: Tertiary institutions' research funding]

But she said now it was much clearer to see what was coming out of the sector and it was not obvious all the work required to compete for the fund was still necessary.  

“It takes a lot of work for the Tertiary Education Commission to administer it, and it takes a lot of work for compliance so there might be better ways of allocating that money. 

“An assessment needs to be on whether we continue in current form or whether there’s a better way.” 

The Minister for Tertiary Education and Skills, Penny Simmonds, said it was still early days but it was something she was considering. 

A review of the fund was completed a few years ago, but was criticised by the Tertiary Education Union for not going far enough.

The then-government approved a number of changes in mid-2021 to do with encouraging and rewarding a wider range of research, but its fundamentals remained the same.

The fund provides Victoria University with about a quarter of the total research funding it receives each year. 

Its deputy vice-chancellor in charge of research, Margaret Hyland, agreed that how the fund was administered, and what it was seeking to achieve, needed to be looked at.  

“It does have a big compliance cost, particularly the quality evaluation component. Every single academic has to prepare an evidence portfolio, but there’s not only the work of the academics and preparing their evidence portfolio, there’s a whole set of institutional work around ascertaining who is and who isn’t eligible to put in a quality evaluation.”

She said it also needed to be asked whether all of this evaluation was still answering the right question.  

“The [fund] drives behaviour, right, it says this is the thing that we value and the question that many people have is, is it actually measuring the right thing, because it’s so easy to fall to simple metrics, and simple metrics can be gamed.

“So yes, I think it’s timely to look at it but I do think we have to do it really carefully because it sets the incentives for the whole system.”

She said any changes needed to recognise that retaining the funding for research remained of the utmost importance.   

 “It is part of the Education and Training Act that we deliver research-based teaching, and so we need to be funded for the research part of that.   

“It’s actually an essential part of what universities do, and an essential part, in my view, of what we should be funded for.” 

Likewise, University of Auckland academic lead Simon Holdaway said the funding was essential to the sector.  

“Those funds are really important for maintaining that research ecosystem around New Zealand … that funding that comes is absolutely essential to the university sector.” 

But he said the way the fund was administered was at odds with how things were done overseas, and could be looked at. 

“It’s different that in effect it’s a block amount of money that’s given to the institution based on an overall assessment of all of the individual researchers … If you look internationally, the Australians, the UK, for instance, don’t use individuals, they use groups of researchers.” 

Fowler made the same comment to the committee. 

“You’ll appreciate that the fund is done as an individual academic exercise so each individual academic staff member has their own rating and their own evidence portfolio … that doesn’t necessarily align with the way in which research is done within universities, it’s often done in groups, it’s often done across systems and across institutions and across faculties. 

“So on the upside it’s stable, people understand it, and it has over a long period of time provided us with really good robust evidence of the quality of the research that has been delivered across our institutions.  

“On the downside it’s extremely compliance-heavy for us to run it. It’s a back-breaking, six-year gestation period every round, and that is just the beginning of it. The institutions themselves have to put in a lot of effort and a lot of administrative effort to make it work.” 

The compliance-heavy evaluation component equates to 55 percent of an institution’s fund weighting, with research completions making up 25 percent and income from other sources making up the rest.  

The evaluations happen periodically, with the next one due in 2026.  

Comments

  1. PBRF should be got rid of immediately. As suggested it did have some initial benefits but these have long evaporated. The logical place to evaluate university research is as part of the periodic AQA which reviews overall university performance. In what other job do you have a national evaluation of one part of a person’s job? Getting rid of it would be a significant cost saving at TEC, in getting rid of massive “research” bureaucrats in universities and freeing up a very significant proportion of staff time for core teaching and research. Such systems internationally are also a key factor in over-publishing (salami slicing) and in leading to research misconduct. It could be dropped today with no negative consequences and considerable benefits.

  2. Yes, the PBRF is not getting the gains it did in the early years. And that’s because academics have lifted their game. So, it has been a success in that respect. I always found that putting together a PBRF portfolio was something I needed to do anyway, as an academic, for that purpose – and for others, like going for promotion, being accepted to overseas conferences and institutions etc. The big weakness that I saw was that it was not clear PBRF actually went to core research. In the early days it was treated as a kind of “slush fund” to pay for conference trips and the like (all essential, incidentally – you have to keep up with your peers). I felt that the money that each institution gets could be better used by actually funding near-miss research grant applications at Marsden, HRC and so on. So, if an Auckland researcher had a narrow miss at, say, a Marsden or HRC grant, and they also had a high PBRF score, then they could apply to have their grant funded out of the Auckland allocation. That way you could get more research funded. At present PBRF monies as far as I know, and contrary to the tenor of the article, do not go on funding competitive grant applications.

  3. This so-called ‘performance’ fund was gamed from its second round in 2006 as universities competed by narrowing definitions of research to mostly that of counting peer-reviewed journal articles. Rankings for such journals favoured articles published in American and European journals. Applied research and that related to New Zealand economic, social and cultural development was undervalued and good staff pushed out to improve institutional research ratings. Teaching and community outreach were devalued in universities. It has been a bureaucratic distortion for at least ten years – time for an ACT-style bonfire of this over-regulation.
