Altmetrics in Research Evaluation

Lily Troia is a Digital Scholarship Librarian who works as Engagement Manager for Altmetric.

The second morning panel at 4:AM, Altmetrics in Research Evaluation, kicked off with a brief introduction from Chair Donna Okubo, Senior Advocacy Manager at PLOS. The session offered three distinct approaches to research evaluation: two addressed broader frameworks for understanding cultural and ethical trends in scholarship, and the third provided updates and perspective from the Canadian research sphere.

First, Kate Williams (University of Cambridge) spoke about “Altmetrics in practice: understanding emerging cultures of evaluation” and her research exploring the social impact of using societal impact metrics. Kate employs sociological and ethnographic analysis to better understand the “space between fields that shapes the culture of evaluation” and to assess impact in a wider context. She discussed how altmetrics and other new forms of evaluative capital can inform development of this framework, but noted more research is needed around how metrics take on meaning in practice.

Kate further explained this “space between” involves all stakeholders in the research ecosystem — from scholars to policy makers, publishers, and the public — and “shifting permeable borders to allow techniques to transfer and exist in this hybrid space.”

Kate identified various categories of impact measurement, like academic, political, media-related, and economic, and described her ethnographic study conducted at the World Bank. Early insights into the organization’s changing culture point to an increased focus on research impact, incorporation of altmetrics into its evaluative services, and strong investment from managers, leadership, communications teams, and even individual researchers who understand the push to “go above and beyond” in promoting the wide reach of their scholarship.

She closed by noting altmetrics have gained much legitimacy, but there is a need to better understand user activity and motivation, a process made more difficult by the constant state of flux inherent to the digital environment.

Next Rebecca Kennison (K|N Consultants) gave an update on the Mellon-funded HuMetricsHSS project, which explores the potential of altmetrics as value-based indicators. The goal is to create a framework addressing “all aspects of scholarly life well-lived,” nurturing values in practice, and empowering scholars to enrich their impact narratives.

HuMetricsHSS will work with humanities and social science scholars to start, and has identified an initial five core components of the framework: equity, openness, collegiality, quality, and community. Rebecca pointed out that without collegiality, none of the other value areas will matter. The group has begun testing in conversations with scholars, and will launch its first workshop to examine the framework further next week.

The project shares many goals with the altmetrics community, like efforts to expose, highlight, recognize, and reward all research activities, including peer review, teaching, mentoring, conference organizing, data curation, committee work, public scholarship, and more. Rebecca used the syllabus as an example, noting the output’s ability to reveal insights around the circulation of ideas, variances in citation practices, inclusive representation of sources, and the role of students themselves in the scholarly conversation. By aligning these inquiries with the HuMetricsHSS rubric, a scholar can promote the five value areas, and hopefully be evaluated on the same.

Rebecca closed by encouraging those in the research community to reach out and contribute to the initiative. More info can be found at http://humetricshss.org/.

The final presentation featured Martin Kirk, PhD (University of British Columbia), on the current landscape of research evaluation metrics in Canada. Martin reiterated a common theme: current research evaluation conversations are “ALL about impact.” In Canada there is growing demand to show definitive results from taxpayer funding, and a highly competitive climate for securing those funds.

Altmetrics have gained traction more slowly in Canada than in other nations, in part due to a lack of block funding and a less metrics-focused research culture. However, Martin highlighted several climate shifts, like an increased emphasis on return on investment and collaboration. Other big questions altmetrics could address include societal impact at the output, departmental, or institutional level; benchmarking against peers; and informing decisions around who we reward, who we hire, and with whom we should partner. Currently UBC uses SciVal to assess competencies in various disciplines and predict new research trends, and the Altmetric Explorer to analyze competition in strategic areas, unearth media attention, and dive into rich attention detail at the individual output or aggregated level.

Martin concluded that the Canadian evaluation ecosystem is spotty but developing in significant ways. He emphasized the need for current tools to incorporate more flexibility and integration, and the imperative of clean, comprehensive data sets and more nuanced impact metrics. “Bottom line,” he said, “We need to transform institutional use of metrics and tools from interesting to mission critical.”

Before breaking for lunch, the panel took a few questions, including one on how to encourage more strategic approaches to altmetrics. Martin pointed out that researchers already spend 42% of their days on administrative tasks, so better integration of data will help alleviate the time impediment. Kate shared how the World Bank is very particular about research performance evaluation and implements a large matrix, including web analytics and altmetrics, in assessing both publications and individuals. Rebecca Kennison closed by encouraging an alignment of strategy with a researcher’s career arc, asking researchers to envision their scholarly contributions in the future, versus an approach that only looks backwards to identify impact.