Validating Research Performance Metrics Against Peer Rankings
Harnad, Stevan (2008) Validating Research Performance Metrics Against Peer Rankings. Ethics in Science and Environmental Politics, 8 (11).
Abstract
A rich and diverse set of potential bibliometric and scientometric predictors of research performance quality and importance is emerging today, from the classic metrics (publication counts, journal impact factors and individual article/author citation counts) to promising new online metrics such as download counts, hub/authority scores and growth/decay chronometrics. In and of themselves, however, metrics are circular: they need to be jointly tested and validated against what it is that they purport to measure and predict, with each metric weighted according to its contribution to their joint predictive power. The natural criterion against which to validate metrics is expert evaluation by peers, and a unique opportunity to do this is offered by the 2008 UK Research Assessment Exercise, in which a full spectrum of metrics can be jointly tested, field by field, against peer rankings.
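The joint validation the abstract describes — weighting each metric by its contribution to the joint prediction of peer rankings — can be sketched as a multiple regression. The sketch below uses entirely synthetic data (the number of units, the three candidate metrics, and the "true" weights are illustrative assumptions, not figures from the article): peer rankings are regressed on all metrics at once, and each fitted coefficient is that metric's weight in the joint predictive equation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: 200 assessed units, 3 candidate metrics
# (e.g. citation counts, download counts, hub/authority scores).
n = 200
metrics = rng.normal(size=(n, 3))

# Assumed "true" contribution of each metric -- for illustration only.
true_weights = np.array([0.6, 0.3, 0.1])
peer_rank = metrics @ true_weights + rng.normal(scale=0.2, size=n)

# Jointly regress peer rankings on all metrics at once: each fitted
# coefficient estimates that metric's weight in the joint prediction.
X = np.column_stack([np.ones(n), metrics])  # prepend an intercept column
coef, *_ = np.linalg.lstsq(X, peer_rank, rcond=None)

# Validation: correlate the jointly predicted ranks with the peer ranks.
predicted = X @ coef
r = np.corrcoef(predicted, peer_rank)[0, 1]
print("fitted metric weights:", np.round(coef[1:], 2))
print("validation correlation R:", round(r, 2))
```

In this toy setup the regression recovers the assumed weights and the predicted scores correlate strongly with the peer rankings; with real RAE data the fitted weights would differ field by field, which is exactly the field-by-field calibration the abstract proposes.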
More information
Published date: 30 May 2008
Keywords:
bibliometrics, citations, open access, research assessment
Organisations:
Web & Internet Science
Identifiers
Local EPrints ID: 265619
URI: http://eprints.soton.ac.uk/id/eprint/265619
PURE UUID: 5e239948-16d4-4773-bb06-b48dff841b53
Catalogue record
Date deposited: 27 Apr 2008 17:44
Last modified: 15 Mar 2024 02:48
Contributors
Author:
Stevan Harnad