Workshop
Mon Dec 13 04:00 AM -- 11:10 AM (PST)
The pre-registration workshop: an alternative publication model for machine learning research
Samuel Albanie · João Henriques · Luca Bertinetto · Alex Hernandez-Garcia · Hazel Doughty · Gul Varol





Workshop Home Page

Machine learning research has benefited considerably from the adoption of standardised public benchmarks. While the importance of these benchmarks is undisputed, we argue against the current incentive system and its heavy reliance on performance as a proxy for scientific progress. The status quo incentivises researchers to “beat the state of the art”, potentially at the expense of deep scientific understanding and rigorous experimental design. Since typically only positive results are rewarded, the negative results inevitably encountered during research are often omitted, leaving many other groups to unknowingly and wastefully repeat them.

Pre-registration is a publishing and reviewing model that aims to address these issues by changing the incentive system. A pre-registered paper is a regular paper that is submitted for peer review without any experimental results, describing instead an experimental protocol to be followed after the paper is accepted. This places the burden on authors to make compelling arguments from theory or past published evidence. Reviewers, in turn, must assess these arguments together with the quality of the experimental design, rather than comparing numeric results. While pre-registration has been widely adopted in fields such as medicine and psychology, there is little such experience in the machine learning community. In this workshop, we propose to conduct a full pre-registration review cycle for machine learning. Our proposal follows an initial small-scale trial of pre-registration in computer vision (Henriques et al., 2019) and builds on a successful pilot study in pre-registration at NeurIPS 2020 (Bertinetto et al., 2020). We have already received a number of requests to repeat the workshop, indicating strong community interest.

Opening remarks (Talk)
Invited Talk - Sarahanne Field (Talk)
PCA Retargeting: Encoding Linear Shape Models as Convolutional Mesh Autoencoders - Eimear O'Sullivan (Talk)
Spotlights 1 (5 x 3 minutes) (Short videos)
Unsupervised Resource Allocation with Graph Neural Networks - Miles Cranmer (Talk)
Break
Invited Talk - Dima Damen (Talk)
Invited Talk - Hugo Larochelle (Talk)
Spotlights 2 (5 x 3 minutes) (Short videos)
Poster Session (Virtual posters)
Break
Invited Talk - Paul Smaldino (Talk)
Confronting Domain Shift in Trained Neural Networks - Carianne Martinez (Talk)
Discussion Panel - 2020 authors' experience (Discussion Panel)
Open Discussion
Closing Remarks