Budget Impact Analysis of AI-Assisted Fracture Detection in the NHS
Our latest peer-reviewed paper demonstrates real promise of significant cost savings from AI. Whilst the clinical efficacy of AI applications in radiology has been well demonstrated in clinical studies, less is known about their potential economic impacts across a range of cost drivers, such as changes in clinical decision making, patient referrals, patient visits and treatment costs … essentially any change in the use of resources after initial review of the medical image.
Since starting our Health Economics and Outcomes Research (HEOR) department in 2021, we have been working with our radiology clients to address this knowledge gap. From our experience so far, this gap exists for a number of reasons:
Healthcare systems are complex: characterising and quantifying system-level benefits is challenging.
Outcomes research takes time and resources: collecting the data required to support these benefit claims often takes longer than a clinical study, and is therefore more costly.
HEOR evidence generation is not always a priority for technology developers: payers’ requirements for this evidence at reimbursement are fragmented, even though health technology assessment agencies stipulate that it be accounted for.
Conducting research to address these evidence gaps is essential for building trust in novel technologies, optimising their adoption, and ultimately, ensuring their potential reaches the end goal of improving the delivery of healthcare for those receiving and providing it.
Over the past 2 years, our team has worked with Gleamer and NHS stakeholders to assess the resource and budget impact of adopting AI to assist the review of X-rays in NHS emergency care for suspected wrist, ankle and hip fractures. This blog provides an overview of how we did it, what we found, and some general tips for those developing an evidence-based value proposition.
Our approach emulated best practice guidance for health economic evaluation:
Step 1 - Desk research
A targeted literature review to identify unmet needs. The literature included clinical guidance, previous health economic assessments and cost/burden-of-illness studies. The findings of this desk research generated a set of potential economic outcomes to assess in our evaluation, as well as the appropriate decision model.
Step 2 - Stakeholder validation
Validation of economic outcomes and modelling assumptions. It is essential to involve stakeholders throughout the process of an evaluation to ensure the assumptions being made reflect real-world clinical practice.
Step 3 - Data collection for the economic model inputs
This proved to be the biggest challenge as the health economic literature on AI in radiology and current radiology practice is sparse. To address this challenge we worked with NHS stakeholders to extract primary data.
Step 4 - Analysis and validation of the base-case results
We built and ran our custom budget impact model, testing its outputs with relevant stakeholders.
Step 5 - Quality check and scenario testing
Many assumptions are made in health economic models, which introduce uncertainty around results. Therefore, it is imperative to test the sensitivity of results to this uncertainty. As healthcare is not one-size-fits-all, we ran alternative scenarios to assess how the economic impact could vary across different regions of NHS England.
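The sensitivity testing described above can be sketched as a simple one-way analysis: vary each model input across a plausible range, hold the others at their base values, and record how the budget impact estimate responds. Every parameter name and value below is an illustrative placeholder, not an input from our published model:

```python
# Hypothetical one-way sensitivity sketch for a budget impact estimate.
# All parameters and figures are illustrative placeholders.

def budget_impact(miss_rate_std: float, miss_rate_ai: float,
                  cost_per_miss: float, annual_volume: int,
                  ai_cost_per_exam: float) -> float:
    """Savings = avoided downstream costs of missed fractures minus AI fees."""
    avoided_misses = (miss_rate_std - miss_rate_ai) * annual_volume
    return avoided_misses * cost_per_miss - ai_cost_per_exam * annual_volume

base = dict(miss_rate_std=0.08, miss_rate_ai=0.04,
            cost_per_miss=500.0, annual_volume=100_000,
            ai_cost_per_exam=1.0)

# Vary each parameter +/-25% around its base value, holding the others fixed.
for param in base:
    low, high = (dict(base, **{param: base[param] * f}) for f in (0.75, 1.25))
    print(f"{param}: £{budget_impact(**low):,.0f} to £{budget_impact(**high):,.0f}")
```

Parameters whose ranges swing the result furthest are the ones that most deserve further primary data collection.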
Step 6 - Dissemination
Letting loose on the conference circuit and getting the seal of peer review approval. At various stages in the evaluation process we presented our progress at conferences. While slightly risky in the earlier stages, it provided great feedback on the assumptions we were making and the implications of our results. This feedback strengthened our work, culminating in its recent acceptance for publication in Value in Health Journal.
Our evaluation adopted a decision tree framework to assess the one-year financial impact of emergency clinicians reviewing radiographs for suspected fractures with and without the use of AI. The outputs of the evaluation provide an estimate of AI’s value but also identify where further research is needed to better assess economic potential.
The full publication can be accessed here, in Value in Health, a leading health economics journal (Impact Factor 5).
To summarise our findings:
The relevant outcomes for assessing the one-year budget impact of AI-assisted fracture detection in emergency care are:
a. Appropriate referrals to fracture clinics
b. Appropriate admission to hospital
c. Return visits to the emergency department
d. Litigation costs
AI for fracture detection has the potential to save the NHS £3,634,392 within a year of implementation.
Clinical performance with and without AI is largely driving this cost saving result.
The potential to reduce litigation costs has limited influence on overall financial value to the NHS.
More research is needed to quantify the budget impact beyond one year, e.g. by including treatment costs.
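To illustrate how a decision-tree budget impact model combines outcome categories like the four listed above, here is a minimal sketch. Every probability, unit cost and volume below is a hypothetical placeholder, not a value from the published analysis:

```python
# Minimal decision-tree sketch of a one-year budget impact calculation.
# All probabilities, unit costs and volumes are hypothetical placeholders.

UNIT_COSTS = {                      # illustrative NHS unit costs (GBP)
    "fracture_clinic_referral": 150.0,
    "hospital_admission": 2_000.0,
    "ed_return_visit": 200.0,
    "litigation": 10_000.0,
}

def expected_cost(probabilities: dict) -> float:
    """Expected downstream cost per patient, given event probabilities."""
    return sum(probabilities[event] * UNIT_COSTS[event] for event in UNIT_COSTS)

# Hypothetical event probabilities per suspected-fracture presentation.
standard_care = {"fracture_clinic_referral": 0.40, "hospital_admission": 0.05,
                 "ed_return_visit": 0.06, "litigation": 0.002}
with_ai = {"fracture_clinic_referral": 0.38, "hospital_admission": 0.05,
           "ed_return_visit": 0.03, "litigation": 0.001}

annual_presentations = 500_000
ai_cost_per_exam = 1.0  # assumed per-exam licence fee

savings = (expected_cost(standard_care) - expected_cost(with_ai)
           - ai_cost_per_exam) * annual_presentations
print(f"Estimated one-year budget impact: £{savings:,.0f} saved")
```

In the real model each branch probability came from primary NHS data or the literature; the structure above simply shows how per-patient expected costs with and without AI are differenced and scaled by annual volume.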
What health economic evidence means for AI
Our research is amongst the first rigorously conducted health economic analyses for radiology AI, demonstrating real promise of significant cost savings from a single AI tool. What’s more, given our research only looked at a small proportion of the AI’s capabilities, we can extrapolate that, with further evidence, even greater cost savings could be achieved. For instance, we only looked at the use case for fracture detection in adults, not children, and we only looked at three of the most common fractures - we did not include all the other fracture and injury types that the AI is capable of detecting. Not only does this research employ the rigorous methodology required of peer-reviewed publications, it also aligns with NICE’s Evidence Standards Framework, and will ultimately enable companies with this type of evidence to undergo full NICE evaluation and, hopefully, receive unconditional recommendations for use in the NHS.
Learnings for radiology AI economic evaluations
Although we evaluated a very specific use case, some common lessons can be applied to other radiology or diagnostic technologies when quantifying value beyond clinical efficacy:
Start early to identify potential value and the data required to demonstrate it, and to avoid missing opportunities for data collection.
Expect data on the current standard of care (i.e. your comparator) to be missing from the literature.
Continually involve key stakeholders, from conceptualisation to validation of evaluation results.
To summarise, HEOR is a necessary step towards the scaled adoption of AI. It provides the information (beyond clinical efficacy) that providers and payers need for optimal decision making, and it promotes the multidisciplinary approach necessary for the successful implementation of AI.
Hardian Health is a clinical digital consultancy focused on leveraging technology into healthcare markets through clinical evidence, market strategy, scientific validation, regulation, health economics and intellectual property.