Measuring Impact

Scholars, academic departments, and universities are increasingly asked to explain the impact of their research to external funders and to measure themselves against their peers. Tracking citations and attempting to measure research impact isn’t new, but in recent years the number of available tools has grown significantly. “Traditional” metrics, based on citations in formal publications such as journals or books, have been joined by “alternative” metrics, which attempt to capture paper downloads, views of supplemental data sets and presentations, and mentions of research on Twitter, Facebook, and other social media. As the tools have evolved, so has the pressure to use them.

Georgetown's Scholarly Communications Committee recently invited a panel of experts to address the questions facing faculty, administrators, librarians, and publishers about how well traditional metrics and altmetrics assess the quality and impact of a scholar’s contributions. Each panelist’s presentation was outstanding on its own, and together they yielded significant insights and raised critical questions about the role and value of metrics in assessing scholarship at Georgetown and beyond. Watching the video of these lively and informative presentations will deepen your understanding of metrics through the combined experience, expertise, wisdom, and good humor of our panelists.

Rachel Borchardt, Science Librarian, American University
Blaise Cronin, Rudy Professor Emeritus of Information Science, Indiana University
Jacques Berlinerblau, Director and Professor of Jewish Civilization, Georgetown University
Richard Brown, Director, Georgetown University Press (Moderator)

Watch the video: Assessing Impact: Altmetrics

View the PowerPoint presentations:
Rachel Borchardt, Metrics: A Quick Introduction
Blaise Cronin, A Fetish Too Far? (Alt)metrics in the Groves of Academe

We hope the information below will lead to a continuing conversation about the thoughtful and nuanced use of metrics to evaluate research and scholarship. We welcome your suggestions and feedback.

Georgetown University Library's Research Impact Guide

Research Impact Guide (PDF)

Limitations of Research Metrics

Using metrics to measure research impact holds both promise and peril. Many researchers and research administrators welcome measurements that are easily quantified, free of favoritism or bias, and able to support comparisons of similar research outputs. But research metrics can also be misunderstood and misused. Individual authors and journal editors can try to “game the system” and artificially boost their scores on certain metrics, most notably the journal Impact Factor. Researchers and administrators can also push metrics to support conclusions that ignore a metric’s limitations. One scientist quoted in a recent article observed that the pressure to “score well” is especially acute for junior researchers: “Some of them come in [and say] ‘if I don’t get a Cell, Science, or Nature [article] I’m not going to get a faculty position.’ And that comes from the impact factor hype.”
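
The Impact Factor itself is simple arithmetic: a journal’s two-year Impact Factor for year Y is the number of citations received in Y by items the journal published in the two preceding years, divided by the number of citable items it published in those years. A minimal sketch in Python, using invented counts for a hypothetical journal:

    def impact_factor(citations, citable_items):
        # Two-year Journal Impact Factor: citations received this year to
        # items published in the previous two years, divided by the number
        # of citable items (articles and reviews) from those two years.
        return citations / citable_items

    # Hypothetical journal: 250 citations in 2015 to the 100 citable items
    # it published in 2013-2014 gives an Impact Factor of 2.5.
    print(impact_factor(250, 100))  # 2.5

The formula’s simplicity is part of what makes it gameable: a modest number of strategic self-citations, or a narrower definition of which items count as “citable,” can move the score noticeably.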

Although altmetrics have faced fewer charges of “gaming,” their novelty makes their significance harder to interpret: do views and downloads mean people actually read the paper, or just saved it? Do tweets or Facebook posts about a researcher’s new book indicate significant engagement with its scholarship, or merely passing mentions? Several “high-impact” journals, including Nature and PNAS, have recently published warnings about over-relying on metrics and advocated a more balanced picture, one in which quantitative measurements complement the qualitative judgments of peers and experts. The Leiden Manifesto, developed at an international conference on science and technology indicators, notes: “Indicators must not substitute for informed judgment. Everyone retains responsibility for their assessments.”
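
To see what such counts look like in practice, one can query an altmetrics aggregator directly. The hedged sketch below fetches attention counts for a single DOI (borrowed from the Recommended Reading list) from Altmetric.com’s public v1 API; the exact response field names are our assumption from the public documentation and may change:

    import json
    from urllib.request import urlopen

    # Sketch: fetch attention counts for one DOI from Altmetric.com's v1
    # API. The response field names below are assumptions from the docs.
    doi = "10.1002/asi.23309"  # Costas et al., from Recommended Reading
    with urlopen(f"https://api.altmetric.com/v1/doi/{doi}") as resp:
        data = json.load(resp)

    # These counts record how often an item was mentioned, not whether
    # anyone engaged with its scholarship; exactly the question above.
    print("Altmetric score:", data.get("score"))
    print("Tweets:", data.get("cited_by_tweeters_count"))
    print("Facebook posts:", data.get("cited_by_fbwalls_count"))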

Recommended Reading

Costas R, Zahedi Z, Wouters P. “Do ‘Altmetrics’ Correlate With Citations? Extensive Comparison of Altmetric Indicators With Citations From a Multidisciplinary Perspective.” Journal of the Association for Information Science and Technology 66 n10 (October 2015): 2003-2019. DOI: 10.1002/asi.23309

Cronin B and Sugimoto CR, eds. Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact. Cambridge, MA: MIT Press, 2014.

Cronin B and Sugimoto CR, eds. Scholarly Metrics Under the Microscope: From Citation Analysis to Academic Auditing. Medford, NJ: Information Today, 2015.

Priem J, Piwowar HA, Hemminger BM. “Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact.” March 20, 2012. Available from arXiv.org. arXiv:1203.4745

Roemer RC and Borchardt R. “Altmetrics.” Library Technology Reports 51 n5 (July 2015). DOI: 10.5860/ltr.51n5

Roemer RC and Borchardt R. Meaningful Metrics: A 21st Century Librarian's Guide to Bibliometrics, Altmetrics, and Research Impact. Chicago: Association of College and Research Libraries, 2015.

Wilsdon J, et al. The Metric Tide (July 2015). DOI: 10.13140/RG.2.1.4929.1363, 10.13140/RG.2.1.5066.3520, 10.13140/RG.2.1.3362.4162. Available from https://responsiblemetrics.org/the-metric-tide/