From Unlabeled Data to Clinical Applications: Foundation Models in Medical Imaging

4 pages · Published: January 5, 2026

Abstract

The performance of deep learning algorithms depends heavily on the quantity and diversity of the available training data. Obtaining sufficiently large datasets, however, remains a significant challenge, particularly in medical imaging. This study underscores the potential of self-supervised training strategies for developing deep learning models for medical imaging tasks. It demonstrates that workflows can be significantly streamlined by distilling the feature content of a large collection of medical X-ray images from intraoperative C-arm scans into a foundation model. This approach enables efficient adaptation to a variety of concrete applications by fine-tuning a small task-specific head network on top of the pre-trained foundation model, thereby reducing both computational demands and training time.
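The adaptation pattern described in the abstract — a frozen pre-trained backbone whose features are computed once, with only a small task-specific head trained per application — can be illustrated with a minimal sketch. This is not the paper's implementation; the backbone here is a stand-in (a fixed random projection), the task is synthetic, and all names and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen foundation-model backbone. In practice this would be
# a large network pre-trained with self-supervision on unlabeled X-ray data;
# here it is a fixed random projection into a 16-dimensional feature space.
# (Hypothetical dimensions, not from the paper.)
W_backbone = rng.normal(size=(64, 16)) / 8.0     # frozen, never updated

def extract_features(x):
    """Frozen feature extractor: no gradients flow into W_backbone."""
    return np.tanh(x @ W_backbone)

# Tiny synthetic stand-in for one clinical binary classification task.
X = rng.normal(size=(200, 64))
y = (X[:, 0] > 0).astype(float)

F = extract_features(X)                          # features computed once

# Task-specific head: a single logistic-regression layer. Only these few
# parameters are trained, which is what keeps compute and training time low.
w = np.zeros(16)
b = 0.0
lr = 0.5
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))       # sigmoid predictions
    grad_w = F.T @ (p - y) / len(y)              # logistic-loss gradient
    grad_b = float(np.mean(p - y))
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = float(np.mean((p > 0.5) == (y > 0.5)))
```

Swapping in a different downstream task only requires re-training the small head on the cached features, not re-running the expensive backbone.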

Keyphrases: artificial intelligence, foundation models, machine learning, medical imaging

In: Joshua William Giles and Aziliz Guezou-Philippe (editors). Proceedings of The 25th Annual Meeting of the International Society for Computer Assisted Orthopaedic Surgery, vol 8, pages 152-155.

BibTeX entry
@inproceedings{CAOS2025:From_Unlabeled_Data_Clinical,
  author    = {Joshua Scheuplein and Maximilian Rohleder and Björn Kreher and Andreas Maier},
  title     = {From Unlabeled Data to Clinical Applications: Foundation Models in Medical Imaging},
  booktitle = {Proceedings of The 25th Annual Meeting of the International Society for Computer Assisted Orthopaedic Surgery},
  editor    = {Joshua William Giles and Aziliz Guezou-Philippe},
  series    = {EPiC Series in Health Sciences},
  volume    = {8},
  publisher = {EasyChair},
  bibsource = {EasyChair, https://easychair.org},
  issn      = {2398-5305},
  url       = {/publications/paper/mBlQ},
  doi       = {10.29007/h2m6},
  pages     = {152--155},
  year      = {2026}}