University of Southampton Institutional Repository

Identification and comparison of key criteria of funding decision feedback to applicants: A funder and applicant perspective


Fackrell, Kathryn, Alperin, Juan Pablo, Barreto, Cephas A.S., Blatch-Jones, Amanda, Bull, Abby, Fraser, Simon, Kurbatskaya, Ksenia, Meadmore, Katie, Recio Saucedo, Alejandra and Schwab, Uwe (2020) Identification and comparison of key criteria of funding decision feedback to applicants: A funder and applicant perspective. 2nd International Conference on Peer Review 2020, online. (doi:10.48448/2ntk-sb87).

Record type: Conference or Workshop Item (Paper)

Abstract

Background: Every year, thousands of research proposals are submitted to funding organisations in the hope of securing funding for research. These applications go through a number of assessment stages in which decisions are made about whether to decline, progress or fund the application (1–3). Funding organisations often have limited funds, large numbers of submissions and a need to demonstrate impactful spending, so many applications go through multiple iterations and not all will be funded. Feedback to applicants about these decisions is a vital part of the scientific process. Applicants rely on feedback to improve the quality of their application for securing funding and/or as a learning opportunity to increase the chance of success in future applications. It is therefore important that applicants receive constructive feedback in order to improve the quality of their application, change its content, or understand why it was or was not successful (was it competitive but there were stronger applications, or were significant improvements required?). A lack of constructive feedback could not only hinder researchers' professional development but could also create unnecessary burden and negative experiences for researchers as applicants, in what is already a labour-intensive research activity (1,4–7). For the National Institute for Health Research (NIHR) Evaluation, Trials and Studies Coordinating Centre (NETSCC), applicants receive written feedback from the research programme funding committees at both stage 1 and stage 2 of the application process. Stage 1 feedback is provided both for applications that are progressed to stage 2 and for those that are unsuccessful. For applications that progress to stage 2, feedback is again provided for both successful and unsuccessful applications at this second stage.
Committees are encouraged to provide targeted feedback for each application rather than generic phrases. Feedback points are drafted in the funding committee meetings and provided to applicants as written bullet points. NIHR applicants also have access to guidance on what to include in their applications and on the assessment criteria against which the application will be assessed.

To support applicants throughout the funding process, it is vital that the guidance and feedback provided are of high quality, constructive and consistent. It is therefore important to understand the key components of the feedback provided to applicants at stages 1 and 2 and how these are perceived by applicants. This will increase knowledge of the quality and value of the feedback provided to applicants, whether improvements are needed in the type of feedback given, and whether feedback is consistent across the NIHR research programmes managed by NETSCC, supporting the adding-value-in-research approach and the transparency agenda.

Objectives: The aim is to identify and compare the key components of feedback provided to applicants by NIHR research programmes, to determine whether these components are consistent across different sources of feedback (external peer review, funding committees at stage 1, funding committees at stage 2, guidance documents), and to explore how the feedback is perceived by applicants.

Methods: To gain an NIHR funder perspective on the feedback provided, a qualitative content analysis of feedback and guidance documents from the Health Technology Assessment (HTA), Public Health Research (PHR), Efficacy and Mechanism Evaluation (EME), and Health Services and Delivery Research (HSDR) funding programmes was conducted. Following ethical approval, purposeful sampling was used to select commissioned-led and researcher-led, and successful and unsuccessful, research applications from March 2018 and March 2019. To date, a total of 84 applicants across the four funding programmes have been contacted for approval to access the feedback data. A phased approach was used: (1) a coding framework was created using inductive content analysis of external peer reviews and stage 1 and stage 2 funding committee feedback from the HTA programme; (2) this framework was then applied to the stage 2 funding committee feedback from the remaining funding programmes using deductive content analysis; and (3) the final phase will compare the key components across funding programmes and against the guidance criteria provided for each programme. To gain an NIHR applicant perspective on feedback received from funding committees, an online survey was conducted in October 2019. Following ethical approval, the survey used purposeful sampling and targeted chief investigators of applications submitted to the HTA, EME, PHR and HSDR research programmes during March 2018 and March 2019, ensuring the sample included commissioned-led, researcher-led, and successful and unsuccessful applicants. The survey was distributed via email to 770 applicants and was also publicised on Twitter. It used both closed and open questions to explore how useful the feedback was and what could be improved at both stages 1 and 2. A mixed-methods analysis using descriptive statistics and inductive thematic analysis was conducted on the closed and open question responses.
Results: From the NETSCC perspective, qualitative content analysis of feedback and guidance documents is ongoing. To date, feedback documents relating to 25 HTA applications have been analysed and a coding framework has been created. Seven key components (with sub-components) were identified from the feedback given to HTA applicants: (1) acceptability to patients and professionals: feedback on patient and public involvement in the study design; (2) team and infrastructure: feedback on justification of the roles and time of the team, as well as potential conflicts of interest; (3) justification of project: feedback on the scientific rationale, relevance and need for the project; (4) study design: feedback primarily around clearly defining and justifying study parameters, outcomes and sample size; (5) risks and contingencies: feedback highlighting the need to consider recruitment, retention and attrition, and the feasibility of the project; (6) value for money and costings: feedback on cost elements of the application; (7) pathway to impact and implementation: feedback on implementation plans and whether the project will have clinical impact. The most frequently reported components were study design; risks and contingencies; and team and infrastructure. The majority of the components were consistently reported across the funding committee and external peer reviewer comments, except for the justification of project component, which was only provided in feedback to applicants at stage 1. Recruitment of documents from the remaining research programmes commenced in October 2019. To date, we have recruited 23 applicants across PHR (8), EME (6) and HSDR (9), and the coding framework is being applied to stage 2 funding committee feedback and guidance documents. From the applicant perspective, 147 applicants completed the online survey (a response rate of 19%).
After removing data from applications under review, applications to non-NETSCC research programmes and incomplete datasets, a final sample of 115 applicants' data was analysed. Overall, more than half of applicants agreed that the feedback provided by the funding committee was useful (52%) and of benefit (53%). However, more than half also disagreed that the feedback was of good quality (54%) or helpful for developing future applications or work (57%). Responses differed depending on whether the applicant had been successful in securing funding: at least 77% of funded applicants responded positively about the feedback, compared with fewer than 37% of unsuccessful applicants. Moreover, feedback at each stage was more likely to be viewed positively if the application was progressed to the next stage or funded, whilst feedback at the point of rejection was more likely to be viewed negatively. For example, unsuccessful applicants at stage 2 regarded stage 1 feedback positively but stage 2 feedback negatively, finding it unhelpful and unclear on areas for development. The open-ended responses also suggested that feedback could be more detailed and directive to facilitate changes, and that some indication of the importance of each comment would be helpful (is it mandatory, and which comments deserve most attention?). In contrast to the findings from the NIHR funder perspective, the applicant perspective showed that feedback is not always seen as consistent across stages, with guidance or briefs, or within the same feedback document.
Discussion: The findings will increase knowledge of the quality, transparency and consistency of feedback provided to applicants, whether this is consistent across research programmes, and how useful the feedback is to applicants.
The preliminary findings show that the key components provided in feedback by the different NIHR research programmes were consistently reported across both the funding committee and external peer reviewer comments. However, applicants reported that the content of the feedback was not always consistent across or within stages, and that, although they found the feedback useful overall, there are a number of areas in which feedback and guidance could be improved. Once the full analysis has been completed, recommendations can be made for NIHR research programme guidance and feedback to applicants that support the chance of a successful application, promoting the adding-value-in-research approach and the transparency agenda. This work is part of a wider series of studies being undertaken by the NIHR Research on Research team on peer review and decision-making in funding allocation. The wider work includes the studies reported here, a realist synthesis to identify contexts in which different decision-making processes work, a survey of international funders to understand current funding practices, a qualitative analysis to understand stakeholders' expectations, and an observational study of funding committees, and will provide much-needed evidence to help inform funders' peer review and decision-making activities.

References:
1. Guthrie S, Ghiga I, Wooding S. What do we know about grant peer review in the health sciences? An updated review of the literature and six case studies. Santa Monica, CA: RAND Corporation; 2018.
2. Ismail S, Farrands A, Wooding S. Evaluating grant peer review in the health sciences: a review of the literature. Cambridge, UK: RAND Europe; 2009.
3. Publons (part of the Web of Science Group). Grant Review in Focus. Global State of Peer Review series; 2019.
4. Gluckman P. Which science to fund: time to review peer review? 2012;(December):11.
5. Graves N, Barnett AG, Clarke P. Cutting random funding decisions. Nature. 2011;469(7330):299.
6. Herbert DL, Coveney J, Clarke P, Graves N, Barnett AG. The impact of funding deadlines on personal workloads, stress and family relationships: a qualitative study of Australian researchers. BMJ Open. 2014;4(3):e004462.
7. Herbert DL, Barnett AG, Clarke P, Graves N. On the time spent preparing grant proposals: an observational study of Australian researchers. BMJ Open. 2013;3(5):1–6.

This record has no associated files available for download.

More information

Published date: 30 September 2020
Venue - Dates: 2nd International Conference on Peer Review 2020, online, 2020-09-01

Identifiers

Local EPrints ID: 451478
URI: http://eprints.soton.ac.uk/id/eprint/451478
PURE UUID: 86746da1-531c-47eb-8cb6-71f59d515035
ORCID for Amanda Blatch-Jones: orcid.org/0000-0002-1486-5561
ORCID for Simon Fraser: orcid.org/0000-0002-4172-4406
ORCID for Katie Meadmore: orcid.org/0000-0001-5378-8370
ORCID for Alejandra Recio Saucedo: orcid.org/0000-0003-2823-4573

Catalogue record

Date deposited: 30 Sep 2021 16:31
Last modified: 13 Dec 2021 03:04


Contributors

Author: Juan Pablo Alperin
Author: Cephas A.S. Barreto
Author: Abby Bull
Author: Simon Fraser
Author: Ksenia Kurbatskaya
Author: Katie Meadmore
Author: Uwe Schwab


Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

