root/sudo commands can only be used at the time of building the Docker image. Thus, you need to create the folders your code will write to, with the right permissions, at build time, so that your container can write to them later.
For example, you can add the following to your Dockerfile:
## Add folder for intermediate results, logs etc
RUN mkdir /myhome/
RUN chmod 777 /myhome
ENV HOME=/myhome
And then write all your logs, temporary files, intermediate outputs, etc in /myhome.
Instructions For Submission
To have access to the evaluation phase and be eligible for the challenge prizes, participants are required to:
- Submit a short paper describing their method (deadline: 13th August 2021)
- Containerise their algorithm using Docker (deadline: 13th August 2021)
Participants will also have the opportunity to submit a long paper with their results on the validation and testing sets (deadline: 30th November 2021), to be published as part of the BrainLes workshop proceedings distributed by LNCS.
1. Short Paper Instructions
Participants will have to evaluate their methods on the validation set and submit their short paper (4-5 LNCS pages) to the BrainLes CMT submission system and choose crossMoDA as the "Track". This unified scheme should allow for appropriate preliminary comparisons.
The short papers must describe their segmentation algorithm and present the validation results. Specifically, the submissions must include:
- a clear description of the mathematical setting, algorithm, and/or model for reproducibility purposes.
- the appropriate citations mentioned at the bottom of the Data page.
- the range of hyper-parameters considered, the method used to select the best hyper-parameter configuration, and the specification of all hyper-parameters used to generate results.
- a description of results obtained on the validation leaderboard for the two structures (VS and Cochlea) with mean and standard deviation for the two metrics (Dice score and ASSD).
Paper submissions should use the LNCS template, available both in LaTeX and in MS Word format, directly from Springer (link here).
After receiving the reviewers' feedback, participants will be allowed to post their methods on open-access platforms (e.g., arXiv).
Later, participants will have the opportunity to submit longer papers to the MICCAI 2021 BrainLes Workshop post-proceedings. crossMoDA papers will be part of the BrainLes workshop proceedings distributed by LNCS.
2. Docker Container Instructions
The test set won't be released to the challenge participants. For this reason, participants must containerise their methods with Docker and submit their Docker container for evaluation. Your code won't be shared and will only be used internally by the crossMoDA organisers.
Docker allows for running an algorithm in an isolated environment called a container. In particular, this container will locally replicate your pipeline requirements and execute your inference script.
Design your inference script
The inference will be performed automatically using Docker. More specifically, a command will be executed when your Docker container is run (see the run command below). This command must run the inference on the test set, i.e. predict the segmentation for each test hrT2 scan. The test set will be mounted at /input and the predictions must be written to /output. The folder /input will contain all the test hrT2 scans in the specified filename format, and the participant script must write each prediction using the corresponding output filename format.
We provide a script example here.
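The inference loop described above can be sketched as follows. This is an illustration only, not the provided script example: the `_hrT2.nii.gz` and `_Label.nii.gz` suffixes and the `predict` hook are hypothetical placeholders — use the exact filename formats given in the challenge instructions.

```python
# Hedged sketch of a run_inference.py entry point: write one prediction
# file into /output for every hrT2 scan found in /input.
# The filename suffixes below are hypothetical placeholders.
from pathlib import Path

def run_inference(input_dir="/input", output_dir="/output",
                  scan_suffix="_hrT2.nii.gz", label_suffix="_Label.nii.gz",
                  predict=None):
    """Write one prediction per scan; return the prediction filenames."""
    out_dir = Path(output_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    written = []
    for scan in sorted(Path(input_dir).glob(f"*{scan_suffix}")):
        pred = out_dir / scan.name.replace(scan_suffix, label_suffix)
        # `predict` maps a scan path to the bytes of a segmentation file;
        # without it, only an empty placeholder is created.
        pred.write_bytes(predict(scan) if predict else b"")
        written.append(pred.name)
    return written
```

The key constraint is the one-to-one mapping: every scan in /input must yield exactly one prediction in /output, named according to the required output format.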
Create your Docker Container
Docker is commonly used to encapsulate algorithms and their dependencies. In this section, we list the four steps you will have to follow in order to create your Docker image so that it is ready for submission.
Firstly, create your Dockerfile. Detailed explanations are provided here.
Please look at the crossMoDA Docker container example on GitHub.
In a nutshell, the Dockerfile allows for:
- Pulling a pre-existing image with an operating system and, if needed, CUDA (FROM instruction).
- Installing additional dependencies (RUN instructions).
- Transferring local files into your Docker image (COPY instructions).
- Executing your algorithm (CMD and ENTRYPOINT instructions).
## Pull from existing image
FROM [base image]
## Copy requirements
COPY ./requirements.txt .
## Install Python packages in Docker image
RUN pip3 install -r requirements.txt
## Copy all files
COPY ./ ./
## Execute the inference command
CMD ["./src/run_inference.py"]
Thirdly, you can build your Docker image:
docker build -f Dockerfile -t [your image name] .
Fourthly, you will upload your image to Docker Hub. Instructions can be found here:
docker push [your image name]
Your container will be run with the following command:
docker run -v [input directory]:/input/:ro -v [output directory]:/output [your image name]
[input directory] will be the absolute path of our directory containing the test set, [output directory] will be the absolute path of the prediction directory and [your image name] is the name of your Docker image.
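To make the path mapping explicit, the small helper below (an illustration, not part of the challenge tooling) assembles this run command: the host input directory is bind-mounted read-only at /input, and predictions are collected from the host directory mounted at /output.

```python
# Assemble the evaluation `docker run` command from the three values the
# organisers substitute: input directory, output directory, image name.
import shlex

def docker_run_command(input_dir, output_dir, image_name):
    parts = ["docker", "run",
             "-v", f"{input_dir}:/input/:ro",  # test set, read-only
             "-v", f"{output_dir}:/output",    # where predictions are written
             image_name]
    # shlex.quote guards against paths containing spaces or shell metacharacters
    return " ".join(shlex.quote(p) for p in parts)
```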
Test your Docker container
To test your Docker container, run it and perform inference on the validation set.
Firstly, download the validation set zip here.
Secondly, unzip the validation set in [unzip validation set] and run:
docker run -v [unzip validation set]:/input/:ro -v [output directory]:/output [your image name]
Thirdly, check that the predictions are correct.
Fourthly, please zip [output directory] and include the .zip in your CMT submission.
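The third step above (checking the predictions) can be partly automated with a completeness check like the sketch below. The filename suffixes here are assumptions for illustration; replace them with the formats given in the challenge instructions.

```python
# Report validation scans that are missing a prediction, or whose
# prediction file is empty. Suffixes are hypothetical placeholders.
from pathlib import Path

def missing_predictions(input_dir, output_dir,
                        scan_suffix="_hrT2.nii.gz",
                        label_suffix="_Label.nii.gz"):
    missing = []
    for scan in sorted(Path(input_dir).glob(f"*{scan_suffix}")):
        pred = Path(output_dir) / scan.name.replace(scan_suffix, label_suffix)
        if not pred.is_file() or pred.stat().st_size == 0:
            missing.append(pred.name)
    return missing  # an empty list means every scan has a prediction
```

This only verifies that each scan has a non-empty prediction file; you should still open a few predictions to confirm the segmentations themselves look correct.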
All the information regarding your Docker container (Docker Hub image name and validation predictions in a .zip file) will have to be included in your submission via the BrainLes CMT system (crossMoDA track).
We will also ask you to specify the requirements for running your algorithm: number of CPUs, amount of RAM, amount of GPU memory (32GB max) and estimated computation time per subject.
Q: Should the Docker Hub repository visibility be set to Public?
A: The easiest option is to set the visibility to Public for at least the three weeks following the deadline (Friday 13 August).
Q: I need to add "--runtime=nvidia" to use a GPU in the Docker container. Is that ok?
A: Yes, this is absolutely fine.
Q: Which GPU should I use to run the inference in the Docker container?
A: Please use: "CUDA_VISIBLE_DEVICES=0"
3. Long Paper Instructions