Hasty Briefs (beta)

Understanding pre-training data effects in retinal foundation models using two large fundus cohorts - PubMed

  • #foundation models
  • #medical AI
  • #retinal imaging
  • Medical foundation models pre-trained on large-scale unlabelled data show strong performance and data efficiency in clinical applications.
  • The study uses two large cohorts, from Moorfields Eye Hospital (UK) and the Shanghai Diabetes Prevention Program (China), with 904,170 fundus photographs used for pre-training.
  • Parallel foundation models trained on each cohort separately perform competitively on downstream tasks, even when the downstream data differ from the pre-training data.
  • Fairness gaps are observed across age subgroups, whereas sex and ethnicity have minimal impact on model performance.
  • The results highlight the importance of domain-specific, fine-grained data curation for building efficient foundation models.
  • Overall, the study demonstrates strong generalisability of retinal foundation models and a varying impact of pre-training demographic attributes on fairness.
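The fairness finding above amounts to comparing a performance metric across demographic subgroups. As a minimal illustration (not the paper's actual evaluation protocol; the metric, age bands, and data here are hypothetical), a subgroup gap can be computed as the spread between the best- and worst-performing groups:

```python
# Hypothetical sketch: fairness gap as the spread of a metric across subgroups.
# Accuracy and the age bands below are illustrative, not from the study.
from collections import defaultdict

def subgroup_accuracy(y_true, y_pred, groups):
    """Accuracy computed separately for each subgroup label (e.g. age band)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        totals[g] += 1
        hits[g] += int(t == p)
    return {g: hits[g] / totals[g] for g in totals}

def fairness_gap(per_group):
    """Best minus worst subgroup score; 0 means perfect parity."""
    scores = per_group.values()
    return max(scores) - min(scores)

# Toy labels, predictions, and age-band assignments.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 0, 0]
ages   = ["<60", "<60", "<60", "<60", ">=60", ">=60", ">=60", ">=60"]

acc = subgroup_accuracy(y_true, y_pred, ages)
gap = fairness_gap(acc)
print(acc)  # {'<60': 0.75, '>=60': 1.0}
print(gap)  # 0.25 -> a gap across age bands, as the study reports
```

The same pattern applies with AUROC or any other downstream metric in place of accuracy; a nonzero gap over age bands with near-zero gaps over sex and ethnicity would mirror the study's reported pattern.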