Next-generation altmetrics: responsible metrics and evaluation for open science
This guest post was contributed by David Sommer, Product Director and Co-founder at Kudos
Re-caffeinated after the break, the 3:AM audience was eager for the next session, on the next generation of altmetrics: responsible metrics and evaluation for open science. Isabella Peters from Kiel University was our chair.
Isabella talked through the aims of the EU project on responsible metrics and evaluation, in which robust metrics were defined as having five characteristics.
The EU project will deliver a report by the end of 2016. As we know, traditional metrics are mostly based on citations and usage. A range of altmetrics services is now available, but altmetrics are seen to have issues around coverage, transparency, gaming and the level of acceptance by the research community and decision makers.
Isabella presented the initial results from the expert group looking at altmetrics. This was based on a call for evidence on how people are using altmetrics and gives us a first view of the results. So far, 19 valid responses have been received, covering individuals, publishers, research institutions, learned societies and companies from the UK, Germany, France, Sweden, Switzerland, Poland, Romania, Belgium and the Netherlands. Looking at some of the key questions asked:
Which EU member states are using altmetrics, and in what ways?
Most are either not using them at all or only starting to look at them. The most common reasons given for not using metrics were gaming, misuse, concern about bias and lack of reproducibility. One institution stated that “Altmetrics are not seriously regarded as tools for assessment”, while another commented “Researchers see altmetrics as a fun way to measure impact”.
What is the potential for altmetrics?
The most common responses were that they could act as an incentive for open science, complement citations, and be used to measure impact on society.
Who should be developing metrics?
The majority said that this must include all stakeholders, working together in an open way. It should not be limited to so-called “experts”.
What are the prerequisites to make metrics work?
Transparency, openness and reproducibility were cited as the most important factors. No “black boxes” or bespoke, hidden algorithms.
Metrics are seen as both drivers and outcomes of open science, and they should be integrated into the reward system for open science. If metrics are treated as a closed shop, there is no reproducibility or transparency. The aim is to tear down the walled gardens.
Research on research is needed. We need to study and understand what the alternatives to qualitative assessment, such as peer review, might be. We need a culture of evaluation, and the EU should take a leading role here.
An invigorating discussion then followed! There was a question about what the business model for altmetrics might be. Could the EU fund a global altmetrics service for the good of all?
There was discussion about opening the black box. Right now we have closed data and closed methods. Could commercial altmetrics providers publish and use open methods, even if the data itself is closed?
Crossref, through their Event Data service, are positioning themselves as the collector and distributor of “altmetric type events”. They are not creating metrics but providing data on events: the plan is to be a bridge that supplies evidence about events that happened, with full transparency on how the data is processed. Crossref talked about how there is the original data, something in the middle, and the metrics, and different users will want access to different elements. They want to provide the infrastructure, but they don’t see their role as policing the data or offering a level of interpretation.
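Crossref’s distinction between the original data, “something in the middle”, and the finished metrics can be sketched in a few lines. This is a minimal illustration only: the field names follow the Crossref Event Data format (subj_id, obj_id, source_id, relation_type_id, occurred_at), but the sample values are invented for the example, not real records.

```python
# Sketch of the "events, not metrics" model: raw events in, a transparent
# intermediate tally out. Sample values below are invented for illustration.
from collections import Counter

sample_events = [
    {"subj_id": "https://twitter.com/example/status/1",
     "obj_id": "https://doi.org/10.5555/12345678",
     "source_id": "twitter",
     "relation_type_id": "discusses",
     "occurred_at": "2016-09-29T10:00:00Z"},
    {"subj_id": "https://en.wikipedia.org/wiki/Altmetrics",
     "obj_id": "https://doi.org/10.5555/12345678",
     "source_id": "wikipedia",
     "relation_type_id": "references",
     "occurred_at": "2016-09-29T11:30:00Z"},
    {"subj_id": "https://twitter.com/example/status/2",
     "obj_id": "https://doi.org/10.5555/12345678",
     "source_id": "twitter",
     "relation_type_id": "discusses",
     "occurred_at": "2016-09-29T12:15:00Z"},
]

def events_by_source(events):
    """Turn raw events into a per-source tally -- the transparent
    'something in the middle' between raw data and a finished metric."""
    return Counter(e["source_id"] for e in events)

print(events_by_source(sample_events))  # Counter({'twitter': 2, 'wikipedia': 1})
```

The point of the design is that anyone holding the same events can reproduce the tally, which is exactly the transparency the session called for; what score you then build on top of it is a separate, interpretive step.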
So, a lively session, some clear conclusions about the need for transparency and openness and some exciting infrastructure projects under way that will enable this community to do new and interesting things.