mama-mia-challenge

Copyright German Cancer Research Center (DKFZ) and contributors. Please make sure that your usage of this code is in compliance with its license.

This repository contains all code and instructions for our participation in the Mama-Mia Challenge.

Please cite the following paper when using this code:

Kächele, J., Bounias, D., Ertl, A., & Maier-Hein, K. (2025). On Tackling Domain Shift in Breast MRI Using Only Publicly-Available Data: Reproducible Breast Cancer Segmentation and pCR Prediction. In Lecture Notes in Computer Science (pp. 310–319). Springer Nature Switzerland. https://doi.org/10.1007/978-3-032-05559-0_31

Segmentation

1. Self-Supervised Pretraining

  1. Clone and set up the nnssl self-supervised pretraining repository: https://github.com/MIC-DKFZ/nnssl
  2. Plan and Preprocess
    nnssl_plan_and_preprocess \
    -d <PRETRAIN_DATASET_ID>
  3. Train
    nnssl_train \
    -d <PRETRAIN_DATASET_ID> \
    -tr BaseMAETrainer_BS2 \
    -p nnsslPlans \
    --num_gpus 1
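
  Before running the steps above, nnssl needs to know where to read raw data and write preprocessed data and results. Assuming it follows nnU-Net's environment-variable convention (the variable names below are an assumption, not confirmed by this repository — check the nnssl README), a minimal setup could look like:

  ```shell
  # Assumed variable names, mirroring nnU-Net's convention; verify against
  # the nnssl documentation before relying on them.
  export nnssl_raw=/data/nnssl_raw
  export nnssl_preprocessed=/data/nnssl_preprocessed
  export nnssl_results=/data/nnssl_results
  ```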

2. Finetuning on Mama-Mia

  1. Clone and set up the nnU-Net fork adapted for fine-tuning: https://github.com/TaWald/nnUNet
  2. Preprocess Mama-Mia data
    nnUNetv2_preprocess_like_nnssl \
    -d <MAMAMIA_DATASET_ID> \
    -n <TRAINING_NAME> \
    -pc <PATH/TO/PRETRAINED_CHECKPOINT.pth> \
    -am like_pretrained
  3. Fine-Tune
    nnUNetv2_train_pretrained \
    -p <PLANS_IDENTIFIER> \
    <MAMAMIA_DATASET_ID> \
    <CONFIGURATION> \
    <FOLD>
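
  nnU-Net trains one model per cross-validation fold, folds 0–4 by default. Keeping the placeholders from the command above, all five folds can be trained sequentially as a sketch:

  ```shell
  # Train each of the five default cross-validation folds in turn;
  # replace the angle-bracket placeholders with your own values.
  for FOLD in 0 1 2 3 4; do
    nnUNetv2_train_pretrained -p <PLANS_IDENTIFIER> <MAMAMIA_DATASET_ID> <CONFIGURATION> "$FOLD"
  done
  ```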

Classification

Each script describes its arguments and usage via the -h flag.

  • Feature extraction: extract_features.py (can also be run with the parameters used during training)
  • Feature normalization: preprocess_features.py
  • Model fitting on training data: model_fit.py (example configurations are provided under config/)
  • Prediction on unseen data: model_pred.py
  • end_to_end_multi_prob.py is the script used for the challenge submission
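
As an illustration of the normalization step, here is a minimal per-feature z-scoring sketch. This is an assumption for illustration only — preprocess_features.py may apply a different scheme; consult its -h output and source for the actual behavior.

```python
import statistics

def zscore_normalize(rows):
    """Standardize each feature column to zero mean and unit variance.

    `rows` is a list of equally long feature vectors (one per case).
    Zero-variance columns are mapped to 0.0 to avoid division by zero.
    """
    cols = list(zip(*rows))
    means = [statistics.fmean(c) for c in cols]
    stds = [statistics.pstdev(c) for c in cols]
    return [
        [(x - m) / s if s > 0 else 0.0 for x, m, s in zip(row, means, stds)]
        for row in rows
    ]

# Example: two features over three cases; the second feature is constant.
features = [[1.0, 10.0], [2.0, 10.0], [3.0, 10.0]]
normalized = zscore_normalize(features)
```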
