MugsPubs

How Are Journals Ranked: Part Two

A handy guide to Journal Rankings!

Following on from my last blog post on how journals are ranked (part one available here), we’re continuing with the theme of “impact”, what this means and how it’s measured.

In part one, I touched upon alternative ways of measuring impact and various databases that catalogue journals, including Scopus, Google Scholar and Dimensions.

Today, I’m going to be homing in on two of the most popular ways of ranking journals: Journal Impact Factors (JIFs) and Altmetric Attention Scores (AAS).

Given that JIFs are probably the most well-established metric for ranking journals, and Altmetric is one of the industry’s most popular alternative metrics for measuring impact, it seemed only fitting that these two yardsticks receive their own separate blog post.

There’s a lot to talk about, so let’s dive in!

Altmetric Attention Scores

If you haven’t already guessed, publishers just love using citations to rank journals. Citations are great to an extent: they can indicate how much influence the research has had on the development of science and the field. But what do citations tell us about the impact of a journal in the real world? Nada. Altmetric aims to fill this niche. Altmetric tracks journal and article “mentions” across social media, news, blogs, policy documents and patents, with not one “mention” of citations in sight. The point is to measure engagement with science in real life. This has some seriously interesting implications, both positive and negative.

From a positive standpoint, Altmetric lets you see how research and journals are being engaged with by the public, the news and media, and policymakers. It answers the question of how far research actually reaches the public and the people it aims to affect. The Altmetric Attention Score is a nice feature because every article receives a score, which increases with the amount and variety of engagement the research receives worldwide. Articles with far-reaching engagement are likely to have had the most impact on society, and therefore earn a higher AAS.
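To give a feel for how a score like this builds up, here’s a toy sketch in Python. The source weights and mention counts below are made up purely for illustration; Altmetric’s actual algorithm weights sources differently and applies its own caps and adjustments.

```python
# Toy weighted attention score, in the spirit of the Altmetric Attention
# Score. All weights and counts below are illustrative assumptions only.

WEIGHTS = {"news": 8, "blog": 5, "policy": 3, "tweet": 1}  # assumed weights

mentions = {"news": 2, "blog": 3, "tweet": 40}  # hypothetical mention counts

# More mentions, and mentions from "weightier" sources, push the score up.
score = sum(WEIGHTS[source] * count for source, count in mentions.items())
print(score)  # 2*8 + 3*5 + 40*1 = 71
```

The key idea is that both the volume of mentions and the type of source matter: a couple of news stories can move the score more than dozens of tweets.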

On the flipside, there have been examples of research that achieved shockingly high Altmetric scores for all the wrong reasons. Controversial research, or just plain bad science, can still achieve a high AAS, especially if it gets highlighted in the media or picked up by the internet trolls of this world. Often research is taken out of context and used as evidence for people’s individual biases or personal agendas. Listen up, people: this is why peer review and clear scientific communication are so important!

Back to Altmetric itself: the tool ranks all journals by number of mentions. You can add a specific selection of journals to your search to see how they rank, as well as apply filters to see different views of the rankings. It’s pretty neat!

Altmetric has a range of free tools, but the Altmetric Explorer (for publishers) comes at a cost. If your publisher or institution already pays for access through Digital Science (which also owns Dimensions, by the way), then you might be lucky enough to have access to Altmetric.

[Image: Altmetric Attention Score donut example]

Pros:

Finally, a move away from using citations to rank journals! Altmetric has multiple uses for publishers, but also for authors: if a journal adds the Altmetric “donut” (or badge) to each article, authors can easily track whether their article goes viral. Another big pro is that some smart cookie at Altmetric recently added a feature to search articles by type of access, which means we can now compare the huge positive societal impact of Open Access articles with the relatively limited impact of closed access ones.

Cons:

In the Altmetric Explorer, there is no way to directly compare the impact of one journal with another at the same time (the results are always combined). So if you want to compare two journals directly, you have to run two separate searches and compare the results manually offline. Hopefully Altmetric is frantically working on this feature as I write! Another con is that the score is blind to sentiment: if a poor-quality or controversial article attracts attention for the wrong reasons, Altmetric can’t tell you whether that attention is good or bad.

Journal Citation Reports

Any guide to journal rankings and impact would be a bit of a failure if it didn’t talk about the Journal Citation Reports. Yes, we’re back on citations again as a ranking, but this time we’re talking about the most well-regarded system of ranking journals: Impact Factors. You might have already heard about Journal Impact Factors. You take the number of citations received in a given year by articles published in the previous two years, divide that by the number of “citable items” (aka articles that can be cited) the journal published in those same two years, sprinkle in a dash of magic and hey presto, you have your Impact Factor. There’s a bit more to achieving an Impact Factor, but without going into all the details, that’s it in a nutshell.
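As a quick worked example (with hypothetical numbers), the two-year calculation boils down to a single division:

```python
# Minimal sketch of the two-year Impact Factor calculation.
# All figures are hypothetical, chosen only to illustrate the arithmetic.

citations_in_2021 = 1200  # citations received in 2021 by items published in 2019-2020
citable_items = 400       # "citable items" the journal published in 2019-2020

impact_factor_2021 = citations_in_2021 / citable_items
print(f"2021 Impact Factor: {impact_factor_2021:.1f}")  # -> 3.0
```

In other words, an Impact Factor of 3.0 means that, on average, each recent article was cited three times in the measurement year.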

Journals are ranked by their Impact Factors in the Journal Citation Reports (JCR). Clarivate Analytics is the company that oversees the Journal Impact Factor evaluation process and the JCR, which began evaluating journals according to citations back in 1975. It’s probably the most well-established metric out there for ranking journals. It’s also one of the most heavily criticised, for many reasons. One of the key ones is that it can be artificially manipulated by publishing selectively, for example by publishing a small number of articles weighted towards reviews (which generally attract large numbers of citations). The JCR also only ranks journals indexed in certain databases (run by Clarivate Analytics); journals not in those databases are excluded, which means the rankings are not entirely comprehensive. I probably sound a bit negative about Impact Factor, and I think I’m just a little jaded about it, but I recognise that there is a place for Impact Factors as a metric, provided that other metrics are considered as well.

[Image: Original content created by MugsPubs]

Pros:

Impact Factor has long held a monopoly over the way journals are ranked. If you’re new to the publishing industry, I recommend getting familiar with it, because it’s probably here to stay for many, many more years. It’s relatively easy to calculate, and it’s a strong indicator of quality content.

Cons:

While individual Impact Factors are proudly displayed on many journal websites, the JCR itself sits behind a paywall (usually accessed through an institution or publisher login). This makes comparing journals challenging without access. And as I mentioned above, Impact Factors can be manipulated, and the metric has received a lot of criticism in recent years.

With the recent release of the 2021 Impact Factors, I hope this blog post has come at a good time for those of you who are relatively new to our industry, or anyone who just wants to learn more about impact and how the publishing industry ranks journals.

Thanks for supporting my blog! As always, make sure you subscribe to get the latest posts straight to your inbox!
