For a long time, published research has been judged by the standard citation metrics of journal impact factor and, for individual authors, the number of publications and how widely cited they are. In 2005, Jorge E. Hirsch, a physicist at the University of California San Diego, developed a metric for individual authors called the h-index. Originally intended for use by theoretical physicists, it has since been adopted across the breadth of research. This metric combines the productivity and citation impact of each author into a single number, allowing the relative quality of individual researchers to be compared.
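Hirsch's definition is precise: an author's h-index is the largest number h such that they have h papers each cited at least h times. A minimal Python sketch of the calculation (the function name and example citation counts are illustrative, not drawn from any real dataset):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers, each cited at least h times."""
    # Sort citation counts from most-cited to least-cited.
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(sorted_counts, start=1):
        # The paper at position `rank` must itself have >= rank citations.
        if count >= rank:
            h = rank
        else:
            break
    return h

# An author whose papers are cited [10, 8, 5, 4, 3] times has h-index 4:
# four papers each have at least 4 citations, but no 5 papers have >= 5.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

Note how the single number hides detail: one highly cited paper raises the h-index by at most one, which is exactly the behaviour critics point to below.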
However, many have voiced doubts about the merit of these metrics. For journal impact factor, many argue that it is not relevant to highly specialised research fields, is weighted towards review articles, and can be skewed by self-citation. Similarly, the h-index, though growing in popularity, can be inflated by researchers who publish a high number of inconsequential papers, and it shows issues with gender and age bias. The majority of researchers admit that these metrics are, at best, a flawed representation of quality.
Increasingly, as science funding becomes more competitive and public engagement and outreach activities become more valued, alternative metrics are needed to give a more accurate and complete picture of interest garnered by a piece of research. Altmetrics, a term coined in 2010, are non-traditional impact metrics. They typically track a research article and give data that is seen by many as more relevant in today’s research environment. These metrics extend beyond a simple citation count and can include the number of article views online, the number of times an article may be saved into a reference management system, the number of news articles covering the research and how many times the research is mentioned on various social media platforms, like Facebook and Twitter.
A number of projects have already begun to build this data, such as Plum Analytics and Altmetric. In fact, to date Altmetric have tracked almost 8 million individual research articles, and you can see that the recent publication from the University of Leeds regarding Affimer protein performance is already within the top 5% of all articles ever tracked according to these alternative metrics.
Recognition of the value of altmetrics to the research community has come by way of interest shown in these metrics from research funders, an increase in their use by journals, and their adoption as part of the Research Excellence Framework (REF) within the UK and by specific universities in the US. In fact, there are many who now believe that altmetrics will soon form a standard part of a researcher's CV, possibly removing the 'alt' from these metrics.
How do you measure yours?