
Awesome SIGBIR reproducibility challenge 2024

Motivation

🦄 The importance of image registration in medical imaging is underrepresented at MICCAI, largely due to the complexity of applying deep learning to an ill-posed problem without real ground truth. In response, the MICCAI Special Interest Group in Biomedical Image Registration (MICCAI SIG-BIR) is launching a new initiative to enhance our recently launched Awesome Medical Image Registration (GitHub repo): an up-to-date, curated list of well-documented, out-of-the-box open-source code repositories that are applicable to at least two Learn2Reg tasks.

🙋🏽‍♀️ This call for participation aims to incentivise additional researchers to make their methods available to the community and consequently support progress in research and, later on, clinical use.

Guidelines for Participation

Anyone can participate by preparing a new public source code repository on medical image registration, or amending an existing one, that fulfils the following key criteria:

🌐 A CC-BY licence (or similarly permissive one) for code and pre-trained models, i.e. free to use in both academia and industry

🎓 An educational description of how the method works, its strengths and weaknesses, and how to apply it out-of-the-box to medical registration tasks

🥈 Reproducible scores reported on at least two different Learn2Reg tasks and, ideally, a description of how to employ the method on new data (see the evaluation sketch after this list)

🥩 Note that raw performance is not the deciding factor for award eligibility

✉️ Send us an email at [email protected] and create a pull request on our GitHub page
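
To make the "reproducible scores" criterion concrete, below is a minimal sketch of a per-label Dice evaluation, the kind of overlap metric reported for several Learn2Reg tasks. The function name, NumPy-based interface, and background-label convention (label 0) are illustrative assumptions, not part of the official Learn2Reg evaluation code.

```python
import numpy as np

def dice_per_label(fixed_seg: np.ndarray, warped_seg: np.ndarray) -> dict:
    """Per-label Dice overlap between a fixed segmentation and the
    warped moving segmentation produced by a registration method."""
    scores = {}
    # Evaluate only labels present in both volumes; 0 is assumed to be background.
    labels = np.intersect1d(np.unique(fixed_seg), np.unique(warped_seg))
    for label in labels[labels != 0]:
        f = fixed_seg == label
        w = warped_seg == label
        intersection = np.logical_and(f, w).sum()
        scores[int(label)] = 2.0 * intersection / (f.sum() + w.sum())
    return scores

# Example on two toy 3-D label volumes (two partially overlapping cubes):
fixed = np.zeros((8, 8, 8), dtype=np.uint8)
warped = np.zeros_like(fixed)
fixed[2:6, 2:6, 2:6] = 1
warped[3:7, 2:6, 2:6] = 1
print(dice_per_label(fixed, warped))  # {1: 0.75}
```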

Awards / Prize Money

🏆 SIG-BIR and the MICCAI Society have kindly agreed to provide up to five awards of $150 each in prize money.

🕘 We will evaluate new submissions on a rolling basis on the 20th of every month (the submission deadlines in 2024 are hence October 20th, November 20th, and December 20th, 23:59 PT). Depending on the number of submissions, multiple prizes can be awarded in one month.

🥇 The criteria for ranking competing submissions are: readability of documentation (how easy the approach is to use), reproducibility (whether the published scores can be matched), coverage (how many Learn2Reg or free tasks are applicable), and performance (how accurate, fast, and reliable the method is).

🚀 The challenge runs in 2024 but may be extended into 2025.
