Introduction
Radiology services across the UK and Europe are under increasing pressure as demand rises while resources decline. According to the Royal College of Radiologists (RCR), the UK faces a consultant radiologist shortfall of over 30%, a figure projected to rise to 40% by 2027 if current trends continue [1]. The RCR reports that annual radiology workloads have grown by more than 30% over the past five years, while workforce expansion has failed to keep pace [1]. In 2022 alone, NHS hospitals in England reported over 45 million imaging examinations, of which chest radiographs remain the most commonly requested, at 22.0 million procedures [2].
These systemic pressures have driven interest in practical artificial intelligence (AI) applications to support diagnostic workflows, alleviate human resource burdens, and enhance consistency in reporting. In examinations such as chest radiography, where a high proportion of images show no findings, autonomous AI tools present a particularly scalable and clinically relevant opportunity.
As AI technologies continue to evolve, their clinical application in healthcare is moving from theoretical promise to demonstrable impact. Within radiology, AI has shown particular value in addressing longstanding challenges related to growing diagnostic workloads, radiologist shortages, and variable reporting quality.
This case study contributes to an evolving body of evidence demonstrating real-world AI integration in radiology. It documents how Leiden University Medical Centre (LUMC), an academic teaching hospital in the Netherlands, adopted and implemented Oxipit’s CE-certified AI solutions, particularly Oxipit ChestLink and Oxipit Quality, to improve workflow efficiency and diagnostic reliability. LUMC serves a regional population of approximately 1,983,700 residents and performs around 137,000 radiological examinations per year, of which ~23,000 are chest radiographs [3]. Oxipit is a Lithuanian radiology AI company (www.oxipit.ai) focused on delivering clinically validated, workflow-centric solutions. Notably, it became the first developer globally to secure CE Class IIb certification for autonomous radiology reporting of normal chest radiographs. The company’s approach to AI adoption emphasises full PACS integration, clinician collaboration, and scalability across diverse healthcare environments, from academic hospitals to smaller community sites.
The problem and the solution
Radiology departments across Europe are navigating increasing imaging volumes in parallel with constrained staffing capacities. This imbalance has led to mounting workloads, prolonged turnaround times, and significant stress on diagnostic resources. Among these, chest radiography stands out as a high-volume, low-pathology modality, making it a prime candidate for targeted workflow innovation.
At Leiden University Medical Centre (LUMC), an academic tertiary hospital in the Netherlands, approximately half of all chest radiographs are ultimately deemed normal. This observation catalysed a strategic exploration into AI applications to streamline reporting, reallocate radiologist effort, and uphold diagnostic quality standards.
While AI in radiology has advanced rapidly from experimental algorithms to CE-certified clinical tools, the transformative potential depends significantly on how it is operationalised within real-world settings. This case study outlines the phased implementation, integration, and impact of Oxipit’s AI solutions within LUMC’s radiology department, offering insight into the clinical, technical, and organisational dynamics involved.
Baseline workflow and identified challenges
Prior to AI deployment, chest radiographs were interpreted manually by radiologists or residents, requiring a consistent investment of approximately 3–4 minutes per case. Given the high proportion of normal studies and daily volumes of 80 to 100 exams, a substantial portion of clinical capacity was expended on reviewing unremarkable findings.
Residents, particularly during overnight or weekend shifts, bore the brunt of this task. Limited senior oversight during these periods further heightened the cognitive load and diagnostic pressure on less experienced practitioners. The cumulative effect was workflow inefficiency and an increased risk of interpretive fatigue. In addition, the lack of case prioritisation meant that complex or urgent studies could be queued alongside routine normal exams, likely resulting in suboptimal allocation of diagnostic expertise and contributing to variability in turnaround times, especially for cases requiring expedited attention.
Initial engagement and technical integration
LUMC initiated collaboration with Oxipit in 2018 through the Medical Delta innovation network (https://www.medicaldelta.nl/en). The early focus was on evaluating the performance and feasibility of AI-generated automated chest radiograph reports. However, full diagnostic automation proved premature due to the complexity of integrating relevant clinical context into algorithmic decision-making.
Subsequently, the partnership evolved toward a more pragmatic and clinically tractable use case: automated binary triage (normal vs. abnormal). This shift led to the introduction of Oxipit ChestLink, a CE-certified solution designed to autonomously identify normal chest radiographs with high sensitivity [4].
A critical element in clinical adoption was seamless integration with the hospital’s Picture Archiving and Communication System (PACS). Oxipit’s engineering team collaborated with LUMC IT personnel to embed AI outputs directly into the radiologist’s worklist. This eliminated reliance on external portals and preserved existing clinical workflows, significantly enhancing user acceptance (see Figure 1).

Clinical utility and workflow impact
Through retrospective and prospective validation involving approximately 20,000 chest radiographs, ChestLink demonstrated the capacity to autonomously report 15–20% of studies as normal, with 99.9% sensitivity. This translated into time savings, particularly in high-volume reporting blocks.
From a clinical operations perspective, this reduction in reporting burden facilitated:
- Reallocation of radiologist time to complex or urgent cases
- Reduced cognitive load on night shift residents
- Enhanced throughput even with a decrease in staffing over time
- Prioritisation of cases with likely findings, accelerating diagnostic pathways for higher-risk patients
These changes produced noticeable improvements in departmental efficiency and also contributed to greater consistency in turnaround times. For example, studies triaged as abnormal by the AI system could be surfaced earlier in the worklist, supporting faster intervention.
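As a rough illustration, the figures quoted above (80–100 exams per day, 3–4 minutes per case, 15–20% of studies reported autonomously) imply the following back-of-envelope range for reporting time freed per day. This is a sketch derived from the case study’s own ranges, not a measured outcome.

```python
# Back-of-envelope estimate of radiologist time freed by autonomous
# reporting of normal chest radiographs. Inputs are the ranges quoted
# in this case study; actual savings depend on local case mix.

def daily_minutes_saved(exams_per_day, minutes_per_exam, autonomous_fraction):
    """Minutes of reporting time freed per day."""
    return exams_per_day * autonomous_fraction * minutes_per_exam

# Conservative and optimistic ends of the quoted ranges
low = daily_minutes_saved(80, 3, 0.15)    # 36 minutes/day
high = daily_minutes_saved(100, 4, 0.20)  # 80 minutes/day

print(f"Estimated time freed: {low:.0f}-{high:.0f} minutes per day")
```

Even the conservative end corresponds to roughly ten fully reported cases of radiologist time redirected to complex or urgent studies each day.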
Parallel implementation of Oxipit Quality, a secondary-read quality assurance tool, further contributed to the quality of radiological services. It occasionally flagged discrepancies between radiological reports and AI findings, prompting several clinically relevant reassessments (including a missed pulmonary nodule and subtle fractures). These real-world interventions strengthened confidence in the system’s reliability.
AI also has the potential to help refine intra-day workflows. In future, during peak hours, staff plan to strategically batch or defer confirmed normal studies, enabling radiologists to better manage variable caseloads and unexpected surges.
Professional culture and change management
Technological adoption required more than technical validation; it demanded cultural alignment. Initial scepticism among radiologists, particularly regarding autonomy and liability, was addressed through structured dialogue and collaborative planning. LUMC’s leadership emphasised transparency, hosted regular briefings, and positioned AI as an augmentation rather than a threat to clinical judgment.
Crucially, concerns were addressed not solely through empirical evidence but by acknowledging the emotional and ethical dimensions of diagnostic responsibility. Radiologists were included in discussions on deployment strategy and on incorporating AI into daily clinical routine, including error management and patient safety protocols.
Legal considerations and liability frameworks
Autonomous diagnostic support raises complex questions related to liability, data governance, and clinical oversight. Under Dutch medical law and the European AI Act [5], a physician is currently required to validate diagnostic output before it is communicated to patients or referring clinicians. Accordingly, LUMC has not yet transitioned to fully autonomous reporting, though it is actively working with legal and regulatory stakeholders to explore frameworks for shared responsibility.
To that end, LUMC is developing operational protocols in collaboration with clinicians and legal teams as well as national regulatory bodies. These include models for clinician override, audit logging, and insurance coverage, particularly in edge cases where AI and human interpretations diverge.
Educational applications and AI literacy
The implementation of AI has also yielded benefits for clinical education. There is potential to integrate AI into the curriculum but currently there is no consensus on how best to achieve this. It is possible that AI-flagged studies may be used to improve pattern recognition and develop an understanding of normal anatomical variance. Annotated studies and disagreement cases have become valuable learning tools in teaching rounds.
Furthermore, LUMC has, together with the Vrije Universiteit Amsterdam (VU), revived a national AI Learning Lab focusing on cross-community learning of algorithmic technologies in radiology, an initiative supported by the Dutch Research Council. This platform allows for shared review of AI-generated cases, collaborative annotation, and cross-institutional benchmarking. Several learning modules were created that use real clinical cases to teach end-users of AI systems how to responsibly apply AI results in their own clinical practice. The goal is to promote responsible, informed adoption of AI tools through clinician-led peer learning.
Patient-centric benefits and communication pathways
One significant, if indirect, outcome of AI integration is accelerated patient communication. For normal cases flagged with high confidence, referring clinicians can potentially receive rapid clearance when automated reporting is used, sometimes before the patient has left the department. This reduction in turnaround time supports timely reassurance and alleviates anxiety for patients awaiting results.
LUMC is exploring standardised communication templates and automated messaging systems to further streamline this aspect. As patients gain access to their own records, they will be able to see AI-generated results; this requires a clear indication of which results were obtained using AI, where human review took place, and what changes were made. Future iterations of this workflow could enable automated delivery of normal findings to patients directly, with appropriate safeguards and clinician oversight.
LUMC’s implementation of Oxipit’s AI solutions represents a significant step forward in radiology practice. Our experience shows that with careful implementation and strong stakeholder engagement, AI can successfully enhance radiological services while maintaining high quality standards. The project has laid groundwork for broader regional implementation, and we look forward to working with Oxipit on future autonomous AI initiatives.
Regional strategy and scalable service models
LUMC envisions AI-enabled radiology as a cornerstone of regional diagnostic services, deploying ChestLink in satellite hospitals to assist radiologists with high-volume chest radiograph examinations. This model ensures consistent quality while enabling radiologists at satellite sites to focus on exams that demand advanced expertise.
Furthermore, early pilots are exploring the feasibility of involving radiographers or physician assistants in preliminary validation of AI-cleared studies. Such role redistribution has the potential to enhance system efficiency without compromising diagnostic integrity.
Future directions: sensitivity tuning and real-time feedback
To further optimise workflow gains, LUMC is experimenting with sensitivity threshold calibration. A slight reduction in sensitivity could substantially increase the number of studies categorised as normal, albeit with a controlled trade-off in specificity. The team is conducting retrospective analyses to model the impact of these adjustments and to establish acceptable clinical risk boundaries.
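The trade-off described above can be made concrete with a minimal synthetic sketch: studies scoring below an abnormality threshold are autonomously reported as normal, so raising the threshold clears more studies at the cost of sensitivity. The scores, labels, and `operating_point` helper below are illustrative assumptions for a retrospective analysis of this kind, not Oxipit’s actual calibration procedure.

```python
# Illustrative sensitivity-threshold calibration on a held-out set of
# scored studies. Scores and labels here are synthetic; in practice they
# would come from retrospective analysis against radiologist reads.
import random

random.seed(0)
# Synthetic held-out set: (abnormality_score, is_abnormal)
cases = [(random.betavariate(2, 8), False) for _ in range(800)] + \
        [(random.betavariate(8, 2), True) for _ in range(200)]

def operating_point(cases, threshold):
    """Return (fraction autonomously reported, sensitivity for abnormal
    cases) when studies scoring below `threshold` are reported normal."""
    autonomous = sum(1 for score, _ in cases if score < threshold)
    abnormal = [score for score, is_abn in cases if is_abn]
    detected = sum(1 for score in abnormal if score >= threshold)
    return autonomous / len(cases), detected / len(abnormal)

# Sweeping the threshold exposes the autonomy/sensitivity trade-off
for t in (0.05, 0.10, 0.20):
    frac, sens = operating_point(cases, t)
    print(f"threshold={t:.2f}: autonomous={frac:.1%}, sensitivity={sens:.1%}")
```

The retrospective analyses mentioned above would run exactly this kind of sweep on real scored exams to locate an operating point within acceptable clinical risk boundaries.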
A complementary area of development involves real-time quality control. Current quality checks are performed post hoc, but integrating immediate feedback mechanisms such as real-time alerts upon report sign-off could enhance safety and responsiveness.
Conclusion
The deployment of Oxipit’s AI solution at LUMC exemplifies how radiology departments can meaningfully incorporate deep learning tools into routine clinical care. Success hinged not merely on algorithmic accuracy, but on thoughtful integration, user engagement, and clear alignment with clinical priorities.
LUMC’s experience suggests that AI can serve as a viable mechanism to address resource constraints, support clinical training, and maintain diagnostic quality at scale. However, the broader adoption of such systems requires continued dialogue around regulation, professional accountability, and patient-centred design.
By taking a measured, collaborative approach, LUMC offers a replicable model for institutions seeking to navigate the complex landscape of AI in medical imaging.
“Autonomous AI lets us stop doing tasks we no longer need to do. It’s not about the technology, it’s about solving real problems, improving care, and using our expertise where it matters most.”
— Dr. Willem Grootjans, Assistant Professor and Head of Imaging Services, LUMC
References:
1. Royal College of Radiologists. Clinical Radiology Workforce Census 2023.
2. NHS England. Diagnostic Imaging Dataset Statistical Release, 21 July 2022. https://www.england.nhs.uk/statistics/wp-content/uploads/sites/2/2022/07/Statistical-Release-21st-July-2022-PDF-875KB.pdf
3. NHS England. Diagnostic Imaging Dataset Annual Statistical Release 2022–23. https://www.england.nhs.uk/statistics/wp-content/uploads/sites/2/2023/11/Annual-Statistical-Release-2022-23-PDF-1.3MB-1.pdf
4. Grootjans W, Krainska U, Rezazade Mehrizi MH. How do medical institutions co-create artificial intelligence solutions with commercial startups? Eur Radiol. 2025 Jun 3. doi: 10.1007/s00330-025-11672-4.
5. Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 (the European AI Act). http://data.europa.eu/eli/reg/2024/1689/oj
