Why I disagree with Nobel Laureates when it comes to career advice for scientists

Written by Merlin Crossley, Deputy Vice-Chancellor Education and Professor of Molecular Biology, UNSW

The measures by which we judge scientists are always under intense scrutiny. For those who hit the peak of their field, there’s the Nobel Prize. But across all levels of career progression, we publish research papers in journals whose importance or rank can be communicated via a number known as the Journal Impact Factor.

The much-respected Nobel Prize Twitter account @NobelPrize recently tweeted an impressive video of four Nobel Laureates speaking out against Journal Impact Factors.

My view is that the Nobel Laureates are right in theory. But I cannot advise the junior researchers I mentor to ignore Impact Factors.

Although imperfect, Impact Factors retain some validity. But more importantly, deep down, I know that as the world of research expands and as people become increasingly specialised, the use of proxy metrics, like Journal Impact Factors and citations, will increase, not decrease.

Criticism of Journal Impact Factors

Nobel Laureates Peter Doherty, Bruce Beutler, Joseph Goldstein and Paul Nurse aren’t alone in their criticism of Journal Impact Factors.

The widely supported San Francisco Declaration on Research Assessment (DORA) makes the same point – you can’t judge the quality of research just by looking at the Journal Impact Factor.

Australia’s major medical research funding body, the National Health and Medical Research Council, is also officially opposed to Impact Factors and has essentially outlawed reporting them in grant applications.

The Australian Research Council once had a list of A*, A, B and C ratings for journals in its Excellence in Research for Australia assessment exercise, but has now abandoned that list and recommends against institutions continuing to use it.

In theory all these august bodies are correct. Impact Factors represent the average number of citations for each paper in the journal over a two-year period. They are unreliable. They can be gamed in various ways, such as by including a lot of review articles in a journal, and they can be heavily influenced by one or two “jackpot” papers.
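For what it’s worth, the arithmetic behind the number is simple: a journal’s Impact Factor for a given year is the citations received that year by everything it published in the previous two years, divided by the number of citable items it published in those two years. As a rough sketch (the 1,000 and 200 below are invented figures, purely to illustrate the calculation):

\[
\text{IF}_{2017} = \frac{\text{citations in 2017 to items published in 2015–16}}{\text{citable items published in 2015–16}} = \frac{1000}{200} = 5.0
\]

A handful of heavily cited papers in the numerator can lift the whole journal’s number – the “jackpot” effect mentioned above.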

In summary, Journal Impact Factors are a crude short cut to the proper job of estimating quality – they are a type of pre-judgement, a prejudice.

You don’t have to be tall to be good at basketball. But it certainly helps. (Image: www.shutterstock.com)

Picking a researcher or a grant application on the basis of Impact Factors is like selecting a basketball team on the basis of a single metric – like the height of the players.

It’s ridiculous.

But hold on – have you ever looked at the heights of players in any professional basketball team?

Nearly all the players are giants.

Standing tall among giants

I would love to take the Laureates’ advice, and read the papers and judge the science on its own merits. But sadly I am only expert in a very small area. I am not capable of critically analysing most of the research I come across.

It is not that peer review doesn’t work. It works for publications. I only review papers in the small field where I truly am an expert. But when it comes to grant review or making academic appointments, I am often outside my field.

So I confess. I do look at Impact Factors. I look at citation metrics. I even count papers.

I regret to say that in reviewing perhaps a hundred grants or job applications and trying to find the ten grants to fund or one person to employ, I do not read every paper in the bibliography and assess the research on the basis of my limited understanding. I just don’t have the time or expertise to read and judge all the papers.

I pick my basketball team in part based on the players’ heights and past match statistics. I want the people I appoint to get grants in the future, and I suspect other grant reviewers look at metrics too, so I can’t ignore them.

What is the best advice for young researchers?

In their video the Nobel Laureates said that doing sustained, solid research was the best way to build a reputation. But with grant success rates falling to less than 20%, it is not clear that solid research alone will be enough to sustain a lab. So while the advice to downplay Impact Factors is good for established researchers, it is not always feasible for junior researchers.

When I was starting out I also lamented the fact that those in authority seemed to want everything – lots of papers, papers in journals with high Impact Factors, and preliminary data even before the grant was funded.

Young scientists need resilience to keep their careers moving forwards. (Image: 86083886@N02/flickr, CC BY-NC)

A wise colleague looked at me with raised eyebrows and said,

I thought you were meant to be smart. You’re meant to work it out.

You’re meant to balance your research so you deliver some solid work, and some high impact papers, and to manage your resources to produce preliminary data for new applications, while simultaneously delivering on the main research goals of your current grant or start up funding.

I think this was good advice. It is up to each of us to optimise our output. Aim as high as you can but don’t be silly and waste your career trying to lodge one paper in Nature at all costs.

Those in academic management do not want to make the wrong decisions, and they use Impact Factors and other metrics only as one indicator, and often as a last resort. They, and you, should consider your whole portfolio. Concentrate on these things:

  1. Produce a number of first-author papers in any Impact Factor journal. New journals such as PLOS ONE will publish solid work that isn’t world-shattering in its significance. The ability to initiate and wrap up multiple projects is highly valued

  2. Establish a focus and academic reputation for being an expert in one area or technique, especially in something that is on the up

  3. Collaborate with one or perhaps two leading labs but do not spread yourself too thinly

  4. Do aim for high Impact Factor papers but know when to give up – knowing when to give up is actually more important than clinging to your dreams and never saying die (something that is dangerously over-rated in my view!)

  5. Most importantly, ask yourself whether you are enjoying it and whether you can handle the hard knocks that research delivers – others can sense this, and tend to support people who have resilience in their DNA.

Impact Factors and citations aren’t perfect, but nor are they worthless. Metrics are simply indicators or messengers; in themselves, they are not really the problem.

The problem is the rapidly escalating level of competition for grants and jobs. In our world, as it exists, one has to take many measures into account and my expectation is that hard, cold, imperfect numbers will continue to be important in science.

Authors: Merlin Crossley, Deputy Vice-Chancellor Education and Professor of Molecular Biology, UNSW

Read more http://theconversation.com/why-i-disagree-with-nobel-laureates-when-it-comes-to-career-advice-for-scientists-80079
