HRB blog

Insight, opinion and analysis of the work we do, the work we fund, and important issues in health and social care research.

Good questions lead to better design in health research

A study of HRB data shows that questions on funding applications can improve the design and usability of health studies.

Professor Declan Devane

How can you make sure that your health research will have an impact? One way is to ask the right questions in your study. The right questions help to ensure you are exploring a gap in knowledge, and finding something that will help to fill that gap.

In short, asking the right questions means you can make a difference.

Part of the solution to helping researchers ask better questions lies in the funding application form, according to a study of HRB data.

The study, ‘Irish funder guidance increased searching for, and uptake of, core outcome sets’, was published earlier this year in the Journal of Clinical Epidemiology and involved the HRB, the HRB Trials Methodology Research Network (HRB-TMRN) and the COMET (Core Outcome Measures in Effectiveness Trials) Initiative.

The researchers discovered that including a direct and specific question on application forms increased the use of ‘core outcome sets’ in study proposals.


Core outcome sets for harmonious findings

Core outcome sets are the minimum, standard set of outcomes that should be measured and reported in all research studies in specific health areas, and they help to harmonise studies, explains study co-author Professor Declan Devane.

Declan is Professor of Health Research Methodology at University of Galway and Scientific Director of the HRB-Trials Methodology Research Network, an organisation that looks to improve the quality and design of health trials in Ireland.

“For health research to be effective, consistency is key in how findings are reported," explains Professor Devane. "Imagine an orchestra where each musician plays a different tune. Without coordination, it's just noise. We need harmony in research just as much as in music to truly understand the impact.”

If researchers instead use core outcome sets when designing their trials, then the findings of those studies can be compared and pooled, and this provides stronger evidence for or against a particular intervention, he notes.

“Think of core outcome measures as handing out the same sheet music to every musician in the orchestra. Each study's impact is amplified when it harmonises with others, creating a cohesive and powerful symphony of evidence.”

The number of core outcome sets continues to grow, with new disease areas and other fields of health care being covered all the time. So, the chances of finding a set that is relevant to a given study are always increasing.

“Core outcome sets not only make studies stronger and more relevant, they also help to maximise the value of the research,” adds study co-author Dr Anne Cody, who is Head of Investigator-Led Grants, Research Careers and Enablers at the HRB.

“This is an important consideration for research funders,” says Dr Cody. “If we fund studies that use available core outcome sets, there is less duplication and waste, because the research produces more relevant findings with greater potential for impact. This means it’s better value for money. We need to make sure that public funding spent on research is used to gain the most impact.”


Hitch your study to a COMET

Since 2014, HRB guidance documents have included a statement in support of core outcome sets:

“The HRB encourages the development and application of agreed standardised sets of outcomes, known as ‘core outcome sets’, such as those reported by the COMET (Core Outcome Measures in Effectiveness Trials) Initiative.”

In 2020, this guidance was updated, and applicants looking for HRB funding were specifically asked:

“Have you searched the COMET database to check whether a core outcome set has been agreed for this area of health?”

Did adding the direct question make a difference?

To find out, the researchers went back through HRB funding application records and followed up with the relevant Principal Investigators about how and why they had chosen the outcomes to measure in their studies.

They found that when the HRB issued only general guidance about core outcome sets, just five out of 111 applicants (4.5%) identified or proposed them as part of their applications.

In contrast, in 2020, when applicants were specifically prompted on the form to check the COMET database, the picture changed markedly. In this group, 75 out of the 76 applicants (99%) stated that they had searched the COMET database, and 10 identified a relevant core outcome set that helped shape the outcomes they chose to measure in their study.


Questions as levers

“What we found strongly suggests that the HRB's guidance to applicants contributed to increasing awareness of core outcome measures, and directly impacted the search for and use of core sets,” says Dr Cody.

“So yes, including a specific question on the funding application appears to be a useful lever for encouraging people to be aware of core outcome sets and to use them in the design of their trial or study.”

The study shows how important such questions can be for supporting good practice in health research, and highlights how questions are a part of the funder’s toolkit, she adds.

“We use our application forms as a way to guide better study design. We have used a similar approach to increase the uptake of Cochrane Reviews when they were new, and we continue to use the application form to embed Public and Patient Involvement,” she says.

“And from this study we have evidence that the specific questions we include can make a difference. So now as funders we need to think about the specific questions, and strike a balance to make sure that funding applications improve study design further, and at the same time do not become overly long for applicants and reviewers.”


More ethics, less waste

Professor Devane agrees that using funding applications as an instrument to improve study design makes sense, and that better design will lead not only to stronger evidence, but to more ethical, cost-effective and valuable research.

“Every trial, every health study, involves time and resources - often publicly funded - and in many cases patients or their data are involved,” he says.

“We need to do everything we can to make sure that the studies are designed and carried out in a way that gets the greatest return on those investments, so that we have the best evidence about health interventions.”

You can read the full study, ‘Irish funder guidance increased searching for, and uptake of, core outcome sets’, by Claire Beecher, Sandra Galvin, Anne Cody, Paula R. Williamson, Karen Hughes, Oonagh Ward, Caitriona Creely and Declan Devane here: https://doi.org/10.1016/j.jclinepi.2023.03.019