Visual Memory for Robust Path Following
Ashish Kumar · Saurabh Gupta · David Fouhey · Sergey Levine · Jitendra Malik

Wed Dec 05 07:45 AM -- 09:45 AM (PST) @ Room 517 AB #122

Humans routinely retrace paths in a novel environment both forwards and backwards despite uncertainty in their motion. This paper presents an approach for doing so. Given a demonstration of a path, a first network generates a path abstraction. Equipped with this abstraction, a second network observes the world and decides how to act in order to retrace the path under noisy actuation and a changing environment. The two networks are optimized end-to-end at training time. We evaluate the method in two realistic simulators, performing path following and homing under actuation noise and environmental changes. Our experiments show that our approach outperforms classical approaches as well as other learning-based baselines.
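The two-network design described above can be sketched minimally: one map turns a demonstration into per-step abstraction vectors, and a second map scores actions given the current observation and the relevant abstraction. This is an illustrative stand-in only; the dimensions, the linear maps, and the fixed step index are assumptions for the sketch, not the paper's learned architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
OBS_DIM, ABS_DIM, NUM_ACTIONS = 16, 8, 4

# Stand-in for network 1: encode each demonstration observation
# into a path-abstraction vector (a single linear map here).
W_enc = rng.normal(size=(OBS_DIM, ABS_DIM))

def encode_path(demo_obs):
    """Map a (T, OBS_DIM) demonstration into a (T, ABS_DIM) abstraction."""
    return demo_obs @ W_enc

# Stand-in for network 2: score discrete actions from the current
# observation concatenated with the abstraction at a demo step.
W_pol = rng.normal(size=(OBS_DIM + ABS_DIM, NUM_ACTIONS))

def act(obs, abstraction, step):
    """Choose an action to retrace the path near the given demo step."""
    features = np.concatenate([obs, abstraction[step]])
    return int(np.argmax(features @ W_pol))

demo = rng.normal(size=(10, OBS_DIM))   # a 10-step demonstration
abstraction = encode_path(demo)
action = act(rng.normal(size=OBS_DIM), abstraction, step=3)
print(abstraction.shape, action)
```

In the sketch the agent is told which demo step it is near; the end-to-end training in the paper lets the networks learn how to align observations with the abstraction under noisy actuation rather than relying on such a fixed index.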

Author Information

Ashish Kumar (UC Berkeley)
Saurabh Gupta (UC Berkeley / FAIR / UIUC)
David Fouhey (UC Berkeley)
Sergey Levine (UC Berkeley)
Jitendra Malik (University of California, Berkeley)
