Workshop
Fri Dec 07 05:00 AM -- 03:30 PM (PST) @ Room 514
2nd Workshop on Machine Learning on the Phone and other Consumer Devices (MLPCD 2)
Sujith Ravi · Wei Chai · Yangqing Jia · Hrishikesh Aradhye · Prateek Jain

The 2nd Workshop on Machine Learning on the Phone and other Consumer Devices (MLPCD 2) aims to continue the success of the 1st MLPCD workshop held at NIPS 2017 in Long Beach, CA.

The first edition, held at NIPS 2017, attracted more than 200 attendees and led to active research and panel discussions as well as follow-up contributions to the open-source community (e.g., the release of new inference libraries, tools, models, and standardized representations of deep learning models). We believe that interest in this space will only increase, and we hope the workshop serves as an influential catalyst for research and collaboration in this nascent community.

Building on the first workshop, where we investigated initial directions and trends, the NIPS 2018 MLPCD workshop focuses on the theory and practical applications of on-device machine learning, an area that is highly relevant to NIPS and the broader machine learning community and sits at the intersection of several topics of interest: efficient training and inference for deep learning and other machine learning models; interdisciplinary mobile applications involving vision, language, and speech understanding; and emerging topics such as the Internet of Things.

We plan to incorporate several new additions this year: an inspirational opening keynote talk on the "future of intelligent assistive & wearable experiences"; two panels, including a lively closing panel debate on the pros and cons of two key ML computing paradigms (cloud vs. on-device); solicited research papers on new and recent hot topics (e.g., theoretical and algorithmic work on low-precision models, compression, and sparsity for training and inference), along with related challenges, applications, and recent trends; and a demo session showcasing ML in action in real-world apps.


Description & Topics:

Deep learning, and machine learning more broadly, has changed the computing paradigm. Today's products are built with machine intelligence as a central attribute, and consumers are beginning to expect near-human interaction with the appliances they use. However, much of the deep learning revolution has been limited to the cloud, enabled by popular toolkits such as Caffe, TensorFlow, and MXNet, and by specialized hardware such as TPUs. Mobile devices, in comparison, were until recently simply not fast enough, developer tools were limited, and there were few use cases that required on-device machine learning.

That has started to change, with advances in real-time computer vision and spoken language understanding driving real innovation in intelligent mobile applications. Several mobile-optimized neural network libraries have recently been announced (CoreML, Caffe2 for mobile, TensorFlow Lite), which aim to dramatically reduce the barrier to entry for mobile machine learning. Innovation and competition at the silicon layer have enabled new possibilities for hardware acceleration. To make things even better, mobile-optimized versions of several state-of-the-art benchmark models have been open sourced. The widespread availability of connected "smart" appliances for consumers and of IoT platforms for industrial use cases means there is an ever-expanding surface area for mobile intelligence and ambient devices in homes. Taken together, these advances suggest we are at the cusp of a rapid increase in research interest in on-device machine learning, and in particular in on-device neural computing.
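As a concrete illustration of how these mobile-optimized libraries lower the barrier to entry, the short sketch below converts a stock Keras image model to the TensorFlow Lite format with the converter's default optimizations enabled. This is a minimal example assuming a recent TensorFlow (2.x) release that bundles the tf.lite converter; the choice of MobileNetV2 is illustrative only and is not tied to any system presented at the workshop.

    # Minimal sketch: exporting a stock Keras model to TensorFlow Lite.
    # Assumes a recent TensorFlow (2.x) release with the tf.lite converter;
    # the MobileNetV2 architecture is chosen purely for illustration.
    import tensorflow as tf

    # Load a mobile-friendly architecture pretrained on ImageNet.
    model = tf.keras.applications.MobileNetV2(weights="imagenet")

    # Convert to the on-device .tflite format, letting the converter apply
    # its default optimizations (e.g., weight quantization) to shrink the model.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_model = converter.convert()

    with open("mobilenet_v2.tflite", "wb") as f:
        f.write(tflite_model)

The resulting flat buffer can be loaded on-device with the TensorFlow Lite interpreter; with default optimizations the converter typically stores weights in 8-bit form, cutting the file size roughly fourfold relative to 32-bit floats.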

Significant research challenges remain, however. Mobile devices are even more personal than "personal computers" ever were, and enabling machine learning while preserving user trust requires ongoing advances in differential privacy and federated learning. On-device ML has to keep model size and power usage low while still optimizing for accuracy, and several promising approaches to the mobile optimization of neural networks have emerged recently. Lastly, the now-prevalent use of camera and voice as interaction models has fueled exciting research on neural techniques for image and speech/language understanding. This area is highly relevant to multiple topics of interest to NIPS: core topics such as machine learning and efficient inference, interdisciplinary applications involving vision, language, and speech understanding, and emerging areas such as the Internet of Things.
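To make the federated learning direction concrete, the toy sketch below implements the core aggregation step of federated averaging: each simulated client fits a local linear model on its own data, and the server combines only the resulting model weights, weighted by local dataset size, so raw data never leaves the clients. This is a NumPy-only illustration on assumed synthetic data, not a production protocol; real deployments add secure aggregation, differential privacy, and communication compression.

    # Toy federated-averaging sketch (NumPy only); the synthetic data and the
    # linear model are illustrative assumptions, not a production recipe.
    import numpy as np

    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])

    def make_client(n):
        # Each client holds its own private (x, y) data; it never leaves the device.
        x = rng.normal(size=(n, 2))
        y = x @ true_w + 0.1 * rng.normal(size=n)
        return x, y

    def local_update(w, x, y, lr=0.1, epochs=5):
        # A few local gradient-descent steps on the client's own data.
        for _ in range(epochs):
            grad = 2 * x.T @ (x @ w - y) / len(y)
            w = w - lr * grad
        return w

    clients = [make_client(n) for n in (20, 50, 200)]
    w_global = np.zeros(2)

    for _ in range(10):
        # Each client starts from the current global model and trains locally.
        local_ws = [local_update(w_global, x, y) for x, y in clients]
        sizes = np.array([len(y) for _, y in clients], dtype=float)
        # The server aggregates only model weights, weighted by local data size.
        w_global = np.average(local_ws, axis=0, weights=sizes)

    print("estimated weights:", w_global)  # converges toward [2.0, -1.0]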

With this emerging interest and the wealth of challenging research problems in mind, we are proposing this second workshop, dedicated to on-device machine learning for mobile and ambient home consumer devices, for NIPS 2018.


Areas/topics of interest include, but are not limited to:

* Model compression for efficient inference with deep networks and other ML models
* Privacy preserving machine learning
* Low-precision training/inference & Hardware acceleration of neural computing on mobile devices
* Real-time mobile computer vision
* Language understanding and conversational assistants on mobile devices
* Speech recognition on mobile and smart home devices
* Machine intelligence for mobile gaming
* ML for mobile health and other real-time prediction scenarios
* ML for on-device applications in the automotive industry (e.g., computer vision for self-driving cars)
* Software libraries (including open-source) optimized for on-device ML


Target Audience:

The next wave of ML applications will involve significant processing on mobile and ambient devices. Immediate examples include single-image classification, depth estimation, and object recognition and segmentation running on-device for creative effects, as well as on-device recommender and ranking systems for privacy-preserving, low-latency experiences. This workshop will bring ML practitioners up to speed on the latest trends for on-device applications of ML, offer an overview of the latest hardware and software framework developments, and champion active research on the hard technical challenges emerging in this nascent area. The target audience for the workshop is industrial and academic researchers and practitioners of on-device, native machine learning. The workshop will cover both the "informational" and the "aspirational" aspects of this emerging research area for delivering ground-breaking experiences on real-world products.

Given the relevance of the topic, the target audience (a mix of industry, academia, and related parties), and the timing (a confluence of research ideas and practical implementations, both in industry and through publicly available toolkits), we feel that NIPS 2018 will continue to be a great venue for this workshop.

Schedule:

Opening (Chairs) (Opening)
Aurélien Bellet (Contributed talk)
Neel Guha (Contributed talk)
Prof. Kurt Keutzer (Invited talk)
Ting-Wu Chin (Contributed talk)
Prof. Thad Starner (Keynote talk)
Coffee break (morning) (Break)
Prof. Max Welling (Invited talk)
Zornitsa Kozareva (Contributed talk)
Spotlight (poster, demo), Lunch & Poster Session (Spotlight & Poster)
Brendan McMahan (Invited talk)
Prof. Virginia Smith (Invited talk)
Meghan Cowan (Contributed talk)
Kuan Wang (Contributed talk)
Coffee break (afternoon) (Break)
Jan Kautz (Invited talk)
Prof. Song Han (Invited talk)
Demo session (Demo)