3:AM – Altmetrics around the world

This guest post was contributed by Natalia Madjarevic, Head of Implementations and Support at Altmetric.

The next session at 3:AM kicked off with Rodrigo Costas from CWTS discussing practical applications for altmetrics and the development of ‘social media profiles’, looking at how research is discussed across Africa, Europe and North America. Introducing the study, Costas described the need to further develop altmetrics descriptive indicators, moving beyond looking only at the numbers and encouraging broader and qualitative analysis of altmetrics data.

In this study, the team analysed three million publications indexed in Web of Science with DOIs/PMIDs, published between 2012 and 2014. Using Altmetric data, the team broke the analysis down by region (Africa, Europe and the USA), pulling out basic altmetrics indicators (i.e. raw counts: number of tweets, policy documents, news mentions, etc.) and then delving deeper into the qualitative data. By analysing the underlying data (e.g. tweets), the study identified ‘communities of tweeters’ across the various regions. It found, for example, African thematic clusters of Twitter discussion around research topics such as Open Access, HIV, and health issues, while the European thematic landscapes showed people tweeting about research issues such as social anxiety, autism and obesity.
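As a rough illustration of what "pulling out basic altmetrics indicators" might look like in practice, here is a minimal Python sketch that extracts raw counts from a single Altmetric-style JSON record. The field names (e.g. `cited_by_tweeters_count`) are assumptions modelled loosely on Altmetric's public API and are not taken from the study itself.

```python
import json

# Hypothetical Altmetric-style API response for one publication.
# Field names are assumptions, not confirmed by the study described above.
SAMPLE_RESPONSE = json.dumps({
    "doi": "10.1234/example",
    "cited_by_tweeters_count": 42,   # tweets
    "cited_by_msm_count": 3,         # news mentions
    "cited_by_policies_count": 1,    # policy documents
})

def basic_indicators(raw_json: str) -> dict:
    """Pull raw counts (tweets, news, policy mentions) from one record."""
    record = json.loads(raw_json)
    return {
        "tweets": record.get("cited_by_tweeters_count", 0),
        "news": record.get("cited_by_msm_count", 0),
        "policy": record.get("cited_by_policies_count", 0),
    }

print(basic_indicators(SAMPLE_RESPONSE))
```

Aggregating dictionaries like this per region would give the kind of descriptive, count-based starting point the study then enriches with qualitative analysis of the underlying tweets.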


This qualitative study highlights that a focus on descriptive altmetrics indicators provides a more contextual perspective (who is talking about the research? how is it being used?). The study recommends a stronger move towards more descriptive approaches to altmetrics, so that we can better understand public awareness of and engagement with research using qualitative data.

Next up, Htet Htet Aung from Nanyang Technological University discussed an ongoing study – A Worldwide Survey: Investigating Awareness and Usage of Traditional metrics and Altmetrics among Researchers. The objectives of the ongoing study are to investigate researcher awareness and use of traditional metrics and altmetrics, including possible hindrances, and to examine how researchers promote their work.

The survey targets researchers around the world and is running from August to December 2016. Sharing some preliminary findings (38 responses collected in August 2016, so a small sample), the team found researchers were most familiar with traditional bibliometric indicators, including the Journal Impact Factor, total number of citations and the h-index. Among altmetrics, the best-known indicators in the preliminary findings were the number of article views/downloads and the number of readers.

This study has yet to collect many responses, but I’d be interested to see the final results – particularly the answers to questions about hindrances to using altmetrics in practice and recommendations for how this could be improved.

Finally in this session, Stacy Konkiel and Abheer Kolhatkar from Altmetric discussed the possibility of “baked in” bias for altmetrics. The study looked at how altmetrics data could be regionally biased as providers track largely Western-oriented attention sources (e.g. Facebook and Twitter). They discussed potential factors affecting regional bias in altmetrics, e.g. a focus on tracking sources with the highest social media use rates, language barriers, censorship and a reliance on DOIs.  

The team analysed research shared on two social networks: Facebook and VK.com – the Russian-language social network with 100m active monthly users. The study asked:

  • Are papers discussed on VK more often Russian-authored? (Preliminary finding: No.)
  • Are papers discussed on Facebook less often Russian-authored? (Preliminary finding: Yes.)

Next steps for the study: further research into effects of national research publication rates and further study of VK.

This was a useful session for considering how regional data aggregation could bias altmetrics results, and the way in which research is discussed across regions.