

Search All 2024 Events
 

53 Results

Page 2 of 5
Poster
Thu 11:00 FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion
Xing Han · Huy Nguyen · Carl Harris · Nhat Ho · Suchi Saria
Poster
Fri 16:30 CuMo: Scaling Multimodal LLM with Co-Upcycled Mixture-of-Experts
Jiachen Li · Xinyao Wang · Sijie Zhu · Chia-Wen Kuo · Lu Xu · Fan Chen · Jitesh Jain · Humphrey Shi · Longyin Wen
Workshop
Sat 11:54 OLMoE: Open Mixture-of-Experts Language Models
Niklas Muennighoff · Luca Soldaini · Dirk Groeneveld · Kyle Lo · Jacob Morrison · Sewon Min · Weijia Shi · Evan Walsh · Oyvind Tafjord · Nathan Lambert · Yuling Gu · Shane Arora · Akshita Bhagia · Dustin Schwenk · David Wadden · Alexander Wettig · Binyuan Hui · Tim Dettmers · Douwe Kiela · Noah Smith · Pang Wei Koh · Amanpreet Singh · Hannaneh Hajishirzi
Workshop
Multi-View Mixture-of-Experts for Predicting Molecular Properties Using SMILES, SELFIES, and Graph-Based Representations
Eduardo Soares · Indra Priyadarsini S · Emilio Vital Brazil · Victor Yukio Shirasuna · Seiji Takeda
Workshop
SciDFM: A Large Language Model with Mixture-of-Experts for Science
Liangtai Sun · Danyu Luo · Da Ma · Zihan Zhao · Baocai Chen · Zhennan Shen · Su Zhu · Lu Chen · Xin Chen · Kai Yu
Workshop
Gradient-free variational learning with conditional mixture networks
Conor Heins · Hao Wu · Dimitrije Markovic · Alexander Tschantz · Jeff Beck · Christopher L Buckley
Poster
Wed 16:30 MoE Jetpack: From Dense Checkpoints to Adaptive Mixture of Experts for Vision Tasks
Xingkui Zhu · Yiran Guan · Dingkang Liang · Yuchao Chen · Yuliang Liu · Xiang Bai
Workshop
MOTIFNet: Automating the Analysis of Amphiphile and Block Polymer Self-Assembly from SAXS Data
Daoyuan Li · Shuquan Cui · Mahesh Mahanthappa · Frank Bates · Timothy Lodge · Joern Ilja Siepmann
Poster
Thu 16:30 Mixture of Tokens: Continuous MoE through Cross-Example Aggregation
Szymon Antoniak · Michał Krutul · Maciej Pióro · Jakub Krajewski · Jan Ludziejewski · Kamil Ciebiera · Krystian Król · Tomasz Odrzygóźdź · Marek Cygan · Sebastian Jaszczur
Poster
Fri 11:00 MomentumSMoE: Integrating Momentum into Sparse Mixture of Experts
Rachel S.Y. Teo · Tan Nguyen
Workshop
Buffer Overflow in Mixture of Experts
Jamie Hayes · Ilia Shumailov · Itay Yona