Innovation flash talks

This guest post is contributed by Peter Harris, Digital Product Manager at Taylor and Francis.

Article Metrics in the Journal Driven Environment – Kornelia Junge, Wiley

The first innovation flash talk of the afternoon began by discussing how Wiley have historically been using altmetric data, mainly as context for individual authors and for reporting this information back to societies. Kornelia followed this by highlighting how the marketing team at Wiley have been looking at innovative new ways to get the most out of their Altmetric data. Kornelia added that this has all been driven by Wiley's desire to be seen as a publishing company at the forefront of academic research dissemination, and to be perceived by their authors as looking at the entire scholarly research picture.

One such innovation is the Wiley Altmetric Alert, which was created in order to stay ahead of their authors and to keep on top of trending articles. The alert works by tracking an individual article's Altmetric Attention Score over time and triggering if the score has increased by 10 points or more (if the score is <= 100) or by 10% or more (if the score is >= 100). Once triggered, an alert is sent to the marketing team, who then explore why the article is getting so much traction and can build marketing campaigns to further promote it, along with similar articles that might benefit from the attention.
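The threshold rule described above can be sketched as a small function. This is a minimal illustration only; the function name, parameters, and the choice to compare against the previous score are my assumptions, not Wiley's actual implementation:

```python
def should_alert(previous_score: float, current_score: float) -> bool:
    """Decide whether an article's change in Altmetric Attention Score
    should trigger a marketing alert, per the rule described above:
    an increase of 10 points or more for scores up to 100, or an
    increase of 10% or more for scores of 100 and above."""
    increase = current_score - previous_score
    if previous_score <= 100:
        return increase >= 10
    return increase >= previous_score * 0.10
```

So a jump from 50 to 61 would trigger an alert (+11 points), while a jump from 200 to 215 would not (+7.5%, below the 10% bar for high-scoring articles).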


Uncovering the meanings behind altmetrics: An exploratory study – Stacy Konkiel, Altmetric.com, David Sommer and Charlie Rapple, Kudos

Next up was a discussion of an upcoming research collaboration between two competitors in the altmetrics aggregator field, Kudos and Altmetric.com. The study aims to get behind the data of Altmetric Attention Scores in order to understand what motivates researchers to use an academic social network such as Mendeley, Academia.edu or Twitter. Alongside this, they hope to quantify, at least in part, what a tweet or a follow on Mendeley actually means, all with the commercial purpose of better understanding their data for their customers.

The study, which is in its early stages, has initially conducted a series of semi-structured interviews with academic communities present on Twitter and Mendeley. It aims to add to the relatively small body of research on how academics engage with social media and for what purpose. As with all product development, end users will always find innovative ways to use a product that you could never have originally envisaged, and the preliminary findings of this study have shown that researchers usually have a very specific reason for signing up to an academic social network, with one participant explaining that they use Twitter as a bookmarking tool, only retweeting things they intend to read later.

A question from the audience asked, 'Are they looking at sentiment in this study?' Stacy answered that no, they were not, but that Altmetric.com are looking into this in separate studies.


The role of altmetrics methodology for objectivity increase of the prospectively forecasting for creation of new technique – Vitalii Vorotnikov

Following the Kudos and Altmetric.com study talk, Vitalii Vorotnikov outlined a new methodology for using altmetrics in new product development. Vitalii suggests a model based on scraping personal blogs, specialized forums, and social networks to gather the opinions of leading researchers and experts, as well as concerns and negative responses, which are then analysed to see whether an innovation is viable for commercial and public investment.
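The analysis step of such a model might be illustrated as below. This is an entirely hypothetical toy sketch, since the talk did not specify the methodology at this level of detail; it simply tallies scraped expert responses and flags an innovation as a candidate for investment only if positive responses clearly outweigh concerns:

```python
def viability_signal(opinions, threshold=0.6):
    """Given a list of 'positive'/'negative' labels, one per scraped
    expert mention, return True if the share of positive responses
    meets the (assumed) viability threshold."""
    if not opinions:
        return False  # no evidence, no signal
    positive = sum(1 for o in opinions if o == "positive")
    return positive / len(opinions) >= threshold
```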


Plum Analytics to the rescue – Phill Hall, Plum Analytics

Something that always made me chuckle when working in editorial within scholarly monograph publishing is that authors would always write on the marketing form for their monograph, 'This book is of interest to the general reader.' Although this is perhaps a little naïve, it shows that authors of scholarly monographs do want to engage with the general public, and I was therefore looking forward to the next talk by Phill Hall of Plum Analytics. Phill outlined how Plum are concentrating on looking beyond the article and outside of the STM world. Within Plum's data, while articles are still the largest category, they make up only around 50% of everything indexed; the other 50% comprises books, software, conference papers, reports and videos.

The next slide, titled 'Books Matter!', showed that the development of Plum Analytics has been driven by a problem: when looking at eBooks in the humanities and social sciences, citations aren't really there, so there is a need to examine other metrics. Books are seminal works in the humanities and social sciences, and Plum is currently tracking 3.3 million of them, pulling in data from WorldCat (to show how many libraries hold a given book), Wikipedia mentions, reviews on Amazon, reviews on Goodreads (perhaps not so important in the scholarly world), as well as citations in Scopus, Crossref and PubMed. Phill outlined how, although they are using DOIs to track books and chapters, they also take into account URLs, SSRN IDs, Scopus author IDs, OCLC IDs and ORCID iDs. In a similar vein to Altmetric for Books and Bookmetrix, Plum Analytics shows the book DOI and all of the associated chapter DOIs on the page when analysing a particular book and/or chapter.
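The identifier-matching idea behind this can be sketched roughly as follows. All names, identifiers, and data here are hypothetical illustrations, not Plum's actual schema: each source reports counts under whatever identifier it knows (DOI, OCLC number, ISBN, and so on), and an alias table links those identifiers back to a single book record:

```python
from collections import defaultdict

# Hypothetical per-source metrics, each keyed by a different identifier scheme.
source_metrics = [
    ({"doi": "10.1000/xyz123"}, {"citations": 42}),
    ({"oclc": "812345678"}, {"library_holdings": 310}),
    ({"isbn": "9780000000000"}, {"goodreads_reviews": 17}),
]

# Alias table mapping every known identifier to one canonical book key.
aliases = {
    ("doi", "10.1000/xyz123"): "book-1",
    ("oclc", "812345678"): "book-1",
    ("isbn", "9780000000000"): "book-1",
}

def merge_metrics(sources, alias_table):
    """Group per-identifier metric counts under one canonical book record."""
    merged = defaultdict(dict)
    for ids, metrics in sources:
        for scheme, value in ids.items():
            key = alias_table.get((scheme, value))
            if key is not None:
                merged[key].update(metrics)
                break
    return dict(merged)
```

Merging the three sources above would yield a single "book-1" record carrying citations, library holdings and review counts together.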

Phill then examined the case study of the University of Helsinki, who are using this data to analyse the ongoing conversations around their faculties' research.

Prompted by the recent announcement that Altmetric for Books would include Open Syllabus Project data, a question from the audience asked if there were any plans for Plum to include this in their data. Phill replied with a definite 'Yes, in the US and UK.' I'm interested to see how this is done: UK university reading/syllabus lists are notoriously badly formatted, not submitted anywhere apart from a Blackboard or Moodle site, and then deleted at the end of the course.

Snowball Metrics – a ‘basket of metrics’ across the research workflow to inform research strategy – Christina Lohr, Elsevier

Lastly in the innovation flash talk session, Christina discussed a standardised metrics framework aimed at universities. Christina started her talk by outlining the problem Snowball Metrics is trying to solve: universities need standardised metrics to benchmark themselves on a like-for-like basis in order to know their position relative to peers, so they can strategically align resources to their strengths and weaknesses.

Snowball Metrics are defined and agreed by universities themselves, and the output is a set of mutually agreed and tested methodologies, or as Christina put it, "recipes". These recipes are available free of charge and can be used by anyone for their own purposes and, if applicable, under their own business models. Christina pointed out that, like all metrics, they should be used as a strong complement to, not a replacement for, peer review and expert opinion when making research management decisions.

Christina then discussed the Snowball Metrics Exchange, highlighting how Elsevier are working with university IT departments to set up the database that houses the inputs for the metrics, and to install the data visualisation software needed for analysing the data, with the ability to encrypt and/or export to multiple formats. Institutions utilising Snowball Metrics can also suggest future metrics to track.

Christina mentioned that the initial feedback has been extremely positive, with institutions such as the University of St Andrews wanting the ability to choose and control whom they share with and benchmark against, to benchmark internationally, and to use a standardised metric system that was already tried and tested.