University of Southampton Institutional Repository

Models and applications for measuring the impact of health research: update of a systematic review for the Health Technology Assessment programme

Raftery, James
27c2661d-6c4f-448a-bf36-9a89ec72bd6b
Hanney, Steve
7de69786-9f68-4c45-9581-6949378f8a03
Greenhalgh, Trish
77ee9078-d2df-45b6-81c7-67887a957ded
Glover, Matthew
2fc20d71-01a2-4097-ba03-ee006681e767
Blatch-Jones, Amanda Jane
6bb7aa9c-776b-4bdd-be4e-cf67abd05652

Raftery, James, Hanney, Steve, Greenhalgh, Trish, Glover, Matthew and Blatch-Jones, Amanda Jane (2016) Models and applications for measuring the impact of health research: update of a systematic review for the Health Technology Assessment programme. Health Technology Assessment, 20 (76). (doi:10.3310/hta20760). (PMID:27767013)

Record type: Article

Abstract

Background: This report reviews approaches and tools for measuring the impact of research programmes, building on, and extending, a 2007 review.

Objectives: (1) To identify the range of theoretical models and empirical approaches for measuring the impact of health research programmes; (2) to develop a taxonomy of models and approaches; (3) to summarise the evidence on the application and use of these models; and (4) to evaluate the different options for the Health Technology Assessment (HTA) programme.

Data sources: We searched databases including Ovid MEDLINE, EMBASE, Cumulative Index to Nursing and Allied Health Literature and The Cochrane Library from January 2005 to August 2014.

Review methods: This narrative systematic literature review comprised an update, an extension and an analysis/discussion. We systematically searched eight databases, supplemented by personal knowledge, from August 2014 to March 2015.

Results: The literature on impact assessment has expanded considerably. The Payback Framework, with adaptations, remains the most widely used approach. It draws on different philosophical traditions, enhancing an underlying logic model with an interpretative case study element and attention to context. Besides the logic model, other ideal-type approaches included the constructionist, realist, critical and performative. In practice, most models drew pragmatically on elements of several ideal types. Monetisation of impact, an increasingly popular approach, shows a high return from research but relies heavily on assumptions about the extent to which health gains depend on research. Although the HTA programme usually requires systematic reviews before funding trials, it does not routinely examine the impact of those trials on subsequent systematic reviews. The York/Patient-Centered Outcomes Research Institute and the Grading of Recommendations Assessment, Development and Evaluation toolkits provide ways of assessing such impact, but both need to be evaluated. The literature reviewed here provides very few instances of a randomised trial playing a major role in stopping the use of a new technology. The few trials funded by the HTA programme that may have played such a role were outliers.

Discussion: The findings of this review support the continued use of the Payback Framework by the HTA programme. Changes in the structure of the NHS, the development of NHS England and changes in the National Institute for Health and Care Excellence’s remit pose new challenges for identifying and meeting current and future research needs. Future assessments of the impact of the HTA programme will have to take account of wider changes, especially as the Research Excellence Framework (REF), which assesses the quality of universities’ research, seems likely to continue to rely on case studies to measure impact. The HTA programme should consider how the format and selection of case studies might be improved to aid more systematic assessment. The selection of case studies, in the REF and more generally, tends to be biased towards high-impact rather than low-impact stories. Experience from other industries indicates that much can be learnt from the latter. The adoption of researchfish® (researchfish Ltd, Cambridge, UK) by most major UK research funders has implications for future assessments of impact. Although the routine capture of indexed research publications has merit, the degree to which researchfish will succeed in collecting other, non-indexed outputs and activities remains to be established.

Limitations: There were limits to how far we could address the challenges that arose as we extended the focus beyond that of the 2007 review, and well beyond the HTA programme alone.

Conclusions: Research funders can benefit from continuing to monitor and evaluate the impacts of the studies they fund. They should also review the contribution of case studies and expand work on linking trials to meta-analyses and to guidelines.

Funding: The National Institute for Health Research HTA programme.

Text: FullReport-hta20760.pdf (3MB)

More information

Accepted/In Press date: 1 April 2016
e-pub ahead of print date: 21 October 2016
Published date: October 2016
Organisations: Faculty of Medicine

Identifiers

Local EPrints ID: 402248
URI: http://eprints.soton.ac.uk/id/eprint/402248
PURE UUID: d9c80294-8a05-4af3-bb41-4f8696287ce5
ORCID for Amanda Jane Blatch-Jones: ORCID iD orcid.org/0000-0002-1486-5561

Catalogue record

Date deposited: 03 Nov 2016 16:42
Last modified: 17 Mar 2024 02:56


Contributors

Author: James Raftery
Author: Steve Hanney
Author: Trish Greenhalgh
Author: Matthew Glover
Author: Amanda Jane Blatch-Jones

