NeurIPS 2026 Call for Competitions
We are delighted to announce the tenth edition of the competition track at NeurIPS 2026. Through this call, we solicit competition proposals on topics of interest to the NeurIPS community.
We especially encourage submissions with a clear scientific question and from fields with positive societal impact, particularly those leveraging AI to support disadvantaged communities. We also welcome proposals focusing on the evaluation of large language models and agentic systems. Besides societal impact, we seek proposals in areas where machine learning can positively advance other scientific, technological, or business domains of relevance to the NeurIPS community. We hope that interesting competitions will attract a significant cross-section of interdisciplinary and diverse communities to our NeurIPS 2026 competition track.
NeurIPS 2026 will be hosting a Competition Track, with a dedicated workshop for each accepted competition, where results will be presented and discussed by participants and organizers. The Competition Track Workshops will be held in person.
Authors are asked to confirm that their submissions accord with the NeurIPS Code of Conduct and Code of Ethics. The Competition Track adopts the NeurIPS Main Track Handbook for all general policies (including plagiarism, the use of agents and large language models, and anti-collusion), except where Competition Track–specific rules are explicitly stated (e.g., page limits, single-blind review).
PROPOSAL SUBMISSION
Competition proposals must be submitted via OpenReview at:
https://openreview.net/group?id=NeurIPS.cc/2026/Competition_Track
Note: Newly created profiles in OpenReview can take up to two weeks for approval. Please plan accordingly.
Please follow this LaTeX template carefully when preparing proposals:
https://www.overleaf.com/read/ntmnhczyyxrg#360d0b
Note that the guidelines and template have been updated from last year.
The main text of the submitted proposal is limited to eight content pages, including all figures and tables. Additional pages containing the team members’ biographies and references do not count as content pages.
REVIEWING AND SELECTION PROCESS
A competition program committee will be formed, consisting of experts in both machine learning and challenge organization. Each proposal will be reviewed by members of the program committee. The factors considered when evaluating proposals include:
- Scientific Relevance and Task(s): Competitions should be motivated by clear scientific questions and curiosity. Impact, originality, and relevance to the NeurIPS community will all be considered. Tasks with humanitarian and/or positive societal impact are highly encouraged, and challenges focusing on the evaluation of large language models and agentic systems are also encouraged, although other topics relevant to the NeurIPS community are also welcome. If humanitarian projects are submitted, the involvement of the community in question is strongly desired, and “parachute science” is strongly discouraged. Ethical considerations, including potential risks associated with the task and dataset, should be carefully addressed.
- Evaluation Protocol: Feasibility of the task, baseline availability, soundness and suitability of the evaluation criteria, and clarity and fairness of the competition rules will all be evaluated. Proposals should also ensure the evaluation aligns with real-world task performance and the scientific question. To ensure a fair competition, organizers should specify appropriate constraints or provide standardized environments that level the playing field, particularly for less well-resourced research groups.
- Data: If the competition is data-driven, proposals should clearly describe the dataset(s), including data quality, sample size, and representativeness with respect to the scientific question, as well as the sufficiency of data for training and testing algorithms. Proposals should also address potential feature leakage. The data must be freely available, and releasing it as open data is encouraged.
- Logistics: The competition schedule, plan for attracting competition participants, and experience and diversity of the organizers will all be considered. The specific plan for attracting competition participants, including groups underrepresented at NeurIPS, will be important during the review process.
The review process is single-blind; submissions should therefore not be anonymized and should include the organizers’ identities.
PROCEEDINGS
Similar to last year, accepted competitions in 2026 have two options for publishing their post-competition analyses:
- As a paper submitted to the 2027 NeurIPS Evaluations & Datasets Track (next year), which involves a standard paper review process with no guarantee of acceptance.
- As a submission to a PMLR volume (e.g., https://proceedings.mlr.press/v220/) dedicated to NeurIPS competitions, where acceptance is guaranteed after a lightweight review.
IMPORTANT DATES (anywhere on earth)
- Competition proposal submission deadline: May 15, 2026
- Acceptance notification: June 15, 2026
- NeurIPS Competition track: December 11-12, 2026
ADDITIONAL COMMENTS
Competition organizers should propose a timeline for running the competition that gives participants enough time to contribute high-quality entries. It is recommended that competitions be launched in June 2026 and completed by the end of October 2026 at the absolute latest.
Competition organizers are expected to provide brief, regular updates to the competition chairs to ensure that the competition remains on track.
Competition organizers who require help or suggestions regarding platforms for running their competition can contact the competition chairs for advice.
- 2025 accepted competitions
- 2024 accepted competitions
- 2023 accepted competitions
- 2022 accepted competitions
- 2021 accepted competitions
- 2020 accepted competitions
- 2019 accepted competitions
- 2018 accepted competitions
- 2017 accepted competitions