University of Southampton Institutional Repository

Validating Research Performance Metrics Against Peer Rankings


Harnad, Stevan (2008) Validating Research Performance Metrics Against Peer Rankings. Ethics in Science and Environmental Politics, 8 (11).

Record type: Article

Abstract

A rich and diverse set of potential bibliometric and scientometric predictors of research performance quality and importance is emerging today, from the classic metrics (publication counts, journal impact factors and individual article/author citation counts) to promising new online metrics such as download counts, hub/authority scores and growth/decay chronometrics. In and of themselves, however, metrics are circular: they need to be jointly tested and validated against what it is that they purport to measure and predict, with each metric weighted according to its contribution to their joint predictive power. The natural criterion against which to validate metrics is expert evaluation by peers, and a unique opportunity to do this is offered by the 2008 UK Research Assessment Exercise, in which a full spectrum of metrics can be jointly tested, field by field, against peer rankings.
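In practice, the joint testing described above amounts to regressing the peer rankings on all candidate metrics at once, then reading off each metric's weight and the overall predictive power of the combined battery. The sketch below is an illustrative reconstruction, not code or data from the paper: the metric names, the synthetic data and the plain least-squares fit are all assumptions.

import numpy as np

# Hypothetical example: validate candidate metrics against peer rankings by
# regressing the peer scores on the metrics jointly, then reading off each
# metric's weight (standardized coefficient) and the joint fit (R^2).
# All data below are synthetic placeholders, not RAE data.

rng = np.random.default_rng(0)
n = 200  # hypothetical number of assessed units (e.g. departments)

# Candidate metrics (columns): citation counts, download counts,
# journal impact factor, hub/authority score -- illustrative only.
metrics = rng.normal(size=(n, 4))

# Synthetic "peer ranking" score, loosely driven by the metrics plus noise.
peer_score = metrics @ np.array([0.6, 0.3, 0.1, 0.0]) + rng.normal(scale=0.5, size=n)

# Standardize so the fitted coefficients are comparable weights.
Z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
y = (peer_score - peer_score.mean()) / peer_score.std()

# Joint least-squares fit: peer score ~ all metrics together.
X = np.column_stack([np.ones(n), Z])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)

print("standardized weight per metric:", np.round(coef[1:], 3))
print("joint predictive power (R^2):", round(r2, 3))

On real assessment data the same idea would be applied field by field, with the fitted weights indicating how much each metric contributes once the others are taken into account.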

image005.png - Image, Accepted Manuscript, 65kB
filelist.xml - Text, Accepted Manuscript, 382B
image002.jpg - Image, Accepted Manuscript, 14kB
image001.png - Image, Accepted Manuscript, 98kB
image008.png - Image, Accepted Manuscript, 184kB
image003.png - Image, Accepted Manuscript, 164kB
image004.png - Image, Accepted Manuscript, 114kB
image006.png - Image, Accepted Manuscript, 32kB
esep-harnad.html - Text, Accepted Manuscript, 49kB
image007.png - Image, Accepted Manuscript, 263kB
esep-harnad.pdf - Text, Accepted Manuscript, 498kB
esep-harnad.rtf - Text, Accepted Manuscript, 4MB


More information

Published date: 30 May 2008
Keywords: bibliometrics, citations, open access, research assessment
Organisations: Web & Internet Science

Identifiers

Local EPrints ID: 265619
URI: http://eprints.soton.ac.uk/id/eprint/265619
PURE UUID: 5e239948-16d4-4773-bb06-b48dff841b53
ORCID for Stevan Harnad: orcid.org/0000-0001-6153-1129

Catalogue record

Date deposited: 27 Apr 2008 17:44
Last modified: 15 Mar 2024 02:48


Contributors

Author: Stevan Harnad

Download statistics

Downloads from ePrints over the past year. Other digital versions may also be available to download, e.g. from the publisher's website.



