The results from the latest university research audit indicate that research in Australia is improving.

This suggests that the Excellence in Research for Australia (ERA) exercise is working: ERA has achieved its main aim of boosting the quality of Australian research.

However, this headline statement masks a plethora of concerns.

Under the government’s latest reform of research funding, academics will be assessed not only on the quality of their research through the ERA, but also on the economic, social and environmental impacts of their research through a new impact framework.

The impact and engagement measures herald a new era that rewards researchers for collaborating beyond their institutions.

It is timely, then, to reassess ERA’s utility. Is it fit for purpose? Will these two assessment systems complement or contradict one another?

What has gone well in ERA?

The ERA processes have recognised peer review alongside metrics.

Research efforts at universities are arguably now more focused towards areas of strength. There is a clearer (though contested and arguably narrower) understanding of scholarly research, particularly that which is non-traditional.

On paper, ERA has established a system whereby research can be compared nationally and against international benchmarks.

What isn’t working?

Individual researchers are not assessed by ERA per se. However, they are assessed in line with ERA at the institutional level — in a system that awards a single score for an entire discipline cohort.

Interdisciplinary research has been disadvantaged. ERA’s 1,238 Fields of Research (FoR) codes make it difficult for researchers to publish outside their own discipline or academic unit.

Publishing, performing or exhibiting internationally is perceived to be more prestigious than in Australia. This unjustified exoticism diminishes the importance of Australian research and puts local and Australian publication outlets at risk.

A lack of transparency and accountability remains a critical problem.

The process by which final rankings are calculated remains opaque. It is unclear how the peer review of evaluation units is moderated and benchmarked globally. The rationale for inclusion, exclusion and change in the list of journals recognised by ERA has not been made public.

Whole disciplines ranked “below world average” must rely on their own empirical research to fathom what went wrong. There is no feedback other than the score.

Esteem measures are narrow. The category “prestigious work of reference”, for example, is strikingly limited. It has never been opened to public discussion. Why have some publications been chosen and others omitted?

The ERA journal rankings were abolished in 2011. However, their ghost influences decisions from journal selection to academic recruitment and promotion.

Universities still reward publication in high-ranking journals from the list; some institutions recognise only research published in A or A* journals, or those marked “quality” in the current list.

As predicted, the editorial boards of these journals are struggling to cope with the influx of submissions. Lower-ranked journals and those with lower impact factors are struggling to survive. Many Australian journals are disadvantaged by the bias towards international journals.

The audit culture most affects early career academics. They and others struggle to negotiate the system, juggle heavy teaching loads and manage the precarity of casual academic employment.

The international mobility of Australian academics is high, and early career academics are the most likely to move overseas or leave higher education.

The loss of young academics from an ageing academic workforce risks Australia’s ability to meet future demand. Moreover, it impairs capacity for innovation.

What are the concerns?

Measuring engagement according to research income from industry is concerning.

How, for example, will collaborative research with not-for-profits and innovative start-up companies be measured? How will the new measures account for these organisations’ exemptions from a cash contribution for Australian Research Council Linkage proposals?

There is a contradiction between a new impact measure that encourages a culture of risk-taking and ERA, which promotes risk-avoidance behaviours and impacts upon academic freedom by directing research behaviour. This is particularly problematic for new researchers, blue-sky research and research with benefits that emerge only in the long term.

Both systems place professional service outside academic workloads. This raises new questions. Who will edit the journals, convene the conferences, become officers of professional associations, or write the handbooks and textbooks?

These activities are essential to the health of all disciplines. Increasingly, they are unrecognised and unrewarded. This has long-term ramifications for both research quality and impact.

Neither system recognises investments in partner communities that are critical to social licence to operate in many disciplines.

Improving ERA

Has ERA run its course? Perhaps. It certainly needs improvement.

The ERA process should be subject to external review. We need greater transparency about the criteria that inform assessment categories. We need discussion of categories not yet opened to consultation.

Given concerns over gaming the system, we need an audit of data that has been excluded from ERA submissions. There should be a review of disciplinary membership of the committees in terms of institutional representation through time.

We need ERA to cease peer reviews of outputs already subject to double-blind peer review.

There is a dire need to review the real cost of each ERA exercise, which runs approximately every three years. We need to consider whether the costs of assessing research excellence exceed the benefits.

While the ARC’s administrative and departmental costs are low, we also need to assess the costs of university compliance and of playing an effective strategic assessment game.

The new impact and engagement measures redress some of ERA’s deficiencies, but the challenges of cost, transparency, audit culture and external oversight remain. And teaching remains out in the cold.

Authors: The Conversation Contributor

Read more http://theconversation.com/will-the-impact-framework-fix-the-problems-the-research-audit-found-52152
