Reproducibility Challenge
Welcome to the 2023 Climate Informatics Reproducibility Challenge!
Key information about the challenge is provided below.
📌 For further details visit the official website at this link.
Registration
Registration is now open for participants (teams) and reviewers at this link (go to Application form) through 28 April @ 23:59 AOE and 13 May @ 23:59 AOE, respectively. Read on to learn more about the challenge!
Objectives
This year’s challenge aims to build community, facilitate collaboration, and advance open science within the Climate Informatics community, continuing the tradition of an annual Climate Informatics conference-associated hackathon whilst drawing inspiration from The ML Reproducibility Challenge.
Overview
In this year’s challenge, teams of 2-4 will collaborate to create a notebook which reproduces the key contributions of a published environmental data science paper for eventual integration in the open-source Environmental Data Science (EDS) Book. Over the course of a month, teams will locate the data and code associated with their chosen paper; train, validate, and test the models used in the paper; visualise key results from these experiments; discuss their work with peer reviewer(s); and ultimately weave together a narrative illuminating the value of open science which culminates in a citeable, DOI-tagged notebook. Teams will further have the opportunity to network and exchange technical Q&A with fellow participants in weekly drop-in socials throughout the competition. Check out this notebook in the EDS book which reproduces IceNet (Andersson, 2021) for an example of what you’ll be working towards!
Tracks
Teams may choose to join the challenge through one of two tracks:
- Pre-Approved Paper Track: Reproduce a paper pre-approved by the Organising Committee to have sufficient data and code availability. Pre-approved papers are listed in alphabetical order here.
- Participant-Suggested Paper Track: Suggest a paper to reproduce, subject to approval by the Organising Committee. This mechanism is designed to ensure teams choose papers which are realistic to reproduce in a month.
Peer-Review
All notebooks within the EDS book are peer-reviewed to maintain a high standard of scientific integrity. A pool of peer reviewers, consisting of original authors of pre-approved papers, reviewers for the Climate Informatics conference, and other members of the Climate Informatics and EDS book communities, will begin working with participants at the two-week mark to sharpen submissions and ensure they are ready for publication in the EDS book after the challenge ends.
Judging
Submissions to both tracks will be judged together by the following panel of environmental data science experts on the basis of their adherence to open science and FAIR practices, alignment with the objectives of the EDS book, and tutorial contributions, with scores adjusted to account for the amount of resources provided by the authors of the original paper.
- Alejandro Coca-Castro, The Alan Turing Institute
- Anne Fouilloux, Simula Research Laboratory
- Douglas (Yuhan) Rao, North Carolina Institute for Climate Studies
Prizes
Each member of the winning team will receive a free book of their choosing, published by Cambridge University Press (up to £500 in total value, split across the winning team). Note that all participants who complete the challenge will gain authorship on a citeable, DOI-tagged notebook within the Environmental Data Science (EDS) Book for listing on their CV.
Conference
Registration for the challenge is free and does not require registration for the 12th International Conference on Climate Informatics, held in Cambridge from 19-21 April, though we do encourage researchers interested in the topics covered by the conference to attend in person or virtually. Registration for the conference is open through 3 April here.
Summary
Too long; didn’t read? Here is the TL;DR on the 2023 Climate Informatics Reproducibility Challenge:
- Teams of 2-4
- Reproducing an existing environmental data science paper as a notebook
- Submissions to be judged after one month
- Winning team to receive Cambridge University Press book of their choosing
- Authorship on a citeable, DOI-tagged notebook to be integrated within the Environmental Data Science (EDS) Book
- Great networking and community-building opportunity
Dates
Key dates for the 2023 Climate Informatics Reproducibility Challenge:
- Challenge Registration Closes (Participants): 28 April @ 23:59 AOE
- Challenge Registration Closes (Reviewers): 13 May @ 23:59 AOE
- Teams & Project Assignments Announced: 30 April
- Challenge Begins: 1 May @ 00:00 AOE
- Peer Review Begins: 15 May @ 00:00 AOE
- Challenge Ends: 31 May @ 23:59 AOE
- Results Announced: 15 June
- Submissions Integrated into EDS Book: Throughout summer
Organising Committee
The 2023 Climate Informatics Reproducibility Challenge Organising Committee warmly welcomes your participation.
- Andrew McDonald, University of Cambridge and British Antarctic Survey
- Alejandro Coca-Castro, The Alan Turing Institute
- Anne Fouilloux, Simula Research Laboratory
- Andrew Hyde, Cambridge University Press
Contact
Please direct any questions to Andrew McDonald at arm99@cam.ac.uk. Follow us on Twitter @Climformatics for updates as the conference and competition approach!
Acknowledgments 🙌
The 2023 Climate Informatics Reproducibility Challenge is co-hosted by the EDS book, Climate Informatics, and Cambridge University Press & Assessment, with support from Cambridge University, The Alan Turing Institute, and Simula Research Laboratory.
The computing and storage resources used for this challenge have been kindly provided by CESNET in the context of the C-SCALE project that provides advanced computing for EOSC.
The C-SCALE project (Copernicus – eoSC AnaLytics Engine) has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 101017529.