The Fast, Low-resource, Accurate, Robust, and Effectual Medical Image Analysis (FLARE) Challenge pushes the boundaries of medical image analysis through carefully designed benchmarks, ranging from pan-cancer segmentation to multimodal models, giving participants the chance to tackle real-world challenges in accuracy and efficiency. Now in its fifth year, the challenge introduces six innovative tasks:
- Pan-Cancer Segmentation: develop robust algorithms capable of segmenting lesions across multiple disease types in CT scans.
- Abdominal CT Organ Segmentation on Laptop: develop resource-efficient methods that can perform accurate segmentation on computationally limited devices, promoting accessibility in low-resource settings.
- Unsupervised Domain Adaptation for Abdominal Organ Segmentation in MRI and PET Scans: develop adaptation methods from CT to unseen modalities (i.e., MRI and PET) without extensive labeled data, fostering generalizability and scalability.
- Foundation Model for 3D Medical Images: develop versatile, pre-trained models that can serve as a basis for a variety of downstream diagnosis tasks in 3D medical imaging.
- Multimodal Model for Medical Image Parsing: develop generalist vision-language models that can handle classification, detection, counting, measuring, and regression tasks across various imaging modalities.
- Agentic System for Medical Image Analysis: develop systems based on large language models that can interact with users and invoke medical image analysis tools to solve a wide range of tasks.
FLARE 2025 is taking place this afternoon from 1:30 to 6:00 in DCC2, 3F-301. We asked organizers Jun Ma, Postdoctoral Researcher, and Bo Wang, Associate Professor, both at the University of Toronto and pictured here, to tell us about their plans for the FLARE challenge.
Q: How has the Challenge evolved since it was established?
The FLARE challenge was first established in 2021, under the name Fast and Low-GPU-Memory Abdominal Organ Segmentation (FLARE). Its aim was to benchmark abdominal CT segmentation methods that prioritized fast inference and low GPU memory consumption.
Since its inception, FLARE has consistently focused on addressing critical needs within medical image analysis, particularly emphasizing solutions that are efficient, reliable, and practical for real-world clinical applications. While the core themes of speed, low-resource utilization, accuracy, robustness, and effectuality have remained central, the specific applications and datasets have evolved each year to reflect cutting-edge research areas.
- FLARE 2021: Focused on segmenting four abdominal organs (liver, kidney, spleen, pancreas) in CT scans in a fully supervised setting, using a dataset of 511 CT scans.
- FLARE 2022: Extended to semi-supervised learning, using unlabeled data to segment 13 abdominal organs in CT scans, with a dataset four times larger (2,300 CT scans).
- FLARE 2023: Introduced pan-cancer segmentation alongside 13 abdominal organs in CT scans, with a dataset of 4,500 CT scans—the first challenge for pan-cancer segmentation in abdominal CT scans.
- FLARE 2024: Expanded to three tasks: pan-cancer segmentation in CT scans (extending to whole-body cancer, 10,000+ CT scans), abdominal CT organ segmentation on laptops (low-resource deployment without GPUs), and unsupervised domain adaptation for abdominal organ segmentation in MRI scans.
- FLARE 2025: Now addresses critical clinical needs and technical challenges in disease diagnosis and the development of foundational and multimodal AI systems. It introduces six subtasks: pan-cancer segmentation, abdominal CT organ segmentation on laptop, unsupervised domain adaptation for abdominal organ segmentation in MRI and PET scans, foundation models for 3D CT and MRI, multimodal models for medical image parsing, and agentic systems for medical image analysis. This year focuses especially on generalizability across imaging modalities.
Q: How do you determine what the challenge's focus will be?
The focus is determined by a combination of factors:
- Clinical and technical needs: Addressing real-world challenges in medical image analysis, especially the development of models that are fast, resource-efficient, accurate, and robust across diverse clinical and computational settings.
- Data scale and diversity: Each year introduces larger, more heterogeneous datasets across centers, imaging modalities, and pathologies.
- Progressive complexity: Evolving from fully supervised to semi-supervised, partial-label, unsupervised domain adaptation, and now foundation models.
- Pushing research boundaries: Incorporating emerging topics such as multi-modal models to reflect the latest trends in medical image analysis.
Q: Where do you see the Challenge evolving?
The FLARE Challenge is evolving towards more comprehensive, generalizable, and clinically applicable AI systems for medical image analysis. Key areas include:
- Multimodal and foundation models: Continuing to develop versatile models that work across CT, MRI, PET, and beyond, serving as a foundation for many clinical applications.
- Clinically integrated AI: Moving toward systems that can be seamlessly embedded into workflows—potentially agent-like systems that assist clinicians in real time with decision-making.
Q: To wrap things up, what else would you like us to know about the challenge or the organizers?
A few points are worth highlighting:
- Efficiency: A core tenet of the FLARE challenge is computational efficiency, meaning fast inference and low compute consumption, which is critical for real-world clinical deployment.
- Datasets: FLARE continuously collects and shares large, diverse datasets, providing invaluable resources to the medical imaging community.
- Open science: The challenge promotes reproducible research by requiring top-ranked teams to publicly release their complete codebases, fostering collaboration and innovation.
- Leadership: FLARE is led by renowned researchers Jun Ma and Bo Wang (University of Toronto, University Health Network, and the Vector Institute), supported by a dedicated team of coordinators.