NIPS 2018 Expo Demo

Dec. 2, 2018

ML on Resource Constrained Edge Devices – GesturePod!

Sponsor: Microsoft

Organizers:
Shishir G. Patil (Microsoft Research), Don Kurian Dennis (Microsoft Research), Harsha Vardhan Simhadri (Microsoft Research), Prateek Jain (Microsoft Research)

Presenters:
Shishir G. Patil (Microsoft Research), Don Kurian Dennis (Microsoft Research), Harsha Vardhan Simhadri (Microsoft Research)

https://1drv.ms/u/s!AjDloPaG_l0Et7Ikid1voOVFuI116Q
Abstract:

Prediction on edge devices is important in many scenarios for three reasons: (1) latency constraints, (2) network and bandwidth constraints, and (3) data privacy. For example, IoT deployments in remote areas (such as farmlands) may be constrained by network availability, and with wearable electronics we may be reluctant to transmit our health information to the cloud. Existing ML models, especially deep learning models, are too expensive for prediction on the edge, so data is typically uploaded to the cloud for prediction. In this demonstration, we deploy ML models on edge devices for an exciting application in the accessibility domain.

We present GesturePod, a robust, real-time gesture recognition device. GesturePod detects natural gestures on inexpensive, lightweight microcontrollers and can be attached to any white cane to help people with visual impairments (VI) access their phones easily. Simple, natural gestures performed on the cane (right twist, left twist, double tap, etc.) are detected locally on an inexpensive, lightweight, low-powered microcontroller, and only the detected gesture is communicated to the user's phone. This, in turn, triggers an action on the phone (e.g., reading out the current time or the current location). Local gesture detection not only saves battery and reduces detection latency, compared to streaming all the sensor data to the phone or cloud, but also ensures that data is processed locally. Our user studies with 12 participants with visual impairments and 6 sighted participants demonstrated 92% ± 3% accuracy (at 95% confidence).
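
The pipeline implied above (buffer a short window of motion-sensor readings on the cane, classify it locally, and wake the radio only when a gesture is detected) can be sketched as follows. This is a minimal illustration and not the actual GesturePod firmware: the window length, features, thresholds, and the readImuSample/bleNotify helpers are hypothetical stand-ins for the real IMU driver, the trained on-device classifier, and the Bluetooth stack.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>

// One IMU reading: 3-axis accelerometer and 3-axis gyroscope (units hypothetical).
struct ImuSample { float ax, ay, az, gx, gy, gz; };

enum class Gesture { None, RightTwist, LeftTwist, DoubleTap };

constexpr int kWindowSize = 64;  // hypothetical sliding-window length

// Placeholder for the real I2C read from the IMU mounted on the cane.
ImuSample readImuSample() {
    return {0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 0.0f};  // idle cane: gravity on z-axis
}

// Placeholder features: mean twist rate about the cane's axis and
// peak acceleration magnitude over the window.
struct Features { float meanTwist; float peakAccel; };

Features extractFeatures(const std::array<ImuSample, kWindowSize>& w) {
    Features f{0.0f, 0.0f};
    for (const ImuSample& s : w) {
        f.meanTwist += s.gz / kWindowSize;
        f.peakAccel = std::max(
            f.peakAccel, std::sqrt(s.ax * s.ax + s.ay * s.ay + s.az * s.az));
    }
    return f;
}

// Placeholder classifier: simple thresholds stand in for the compact ML model
// that runs on the microcontroller.
Gesture classify(const Features& f) {
    if (f.meanTwist > 2.0f)  return Gesture::RightTwist;
    if (f.meanTwist < -2.0f) return Gesture::LeftTwist;
    if (f.peakAccel > 3.0f)  return Gesture::DoubleTap;
    return Gesture::None;
}

// Placeholder for the Bluetooth notification to the user's phone.
void bleNotify(Gesture g) {
    std::printf("gesture id %d sent to phone\n", static_cast<int>(g));
}

int main() {
    std::array<ImuSample, kWindowSize> window;
    while (true) {
        for (ImuSample& s : window) s = readImuSample();  // fill one window
        Gesture g = classify(extractFeatures(window));
        if (g != Gesture::None) bleNotify(g);  // transmit only detected gestures
    }
}
```

The key design point this sketch reflects is that the radio transmits only the occasional detected gesture rather than a continuous sensor stream, which is where the battery, latency, and privacy benefits described above come from.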