
Poster in Workshop: Touch Processing: a new Sensing Modality for AI

MimicTouch: Learning Human's Control Strategy with Multi-Modal Tactile Feedback

Kelin Yu · Yunhai Han · Matthew Zhu · Ye Zhao


Abstract:

In the evolving landscape of robotics and automation, touch processing is crucial, particularly for learning to execute intricate tasks such as insertion. However, existing work on tactile methods for insertion tasks relies predominantly on sensor data and does not exploit the rich insights provided by human tactile feedback. Conversely, methodologies for learning from humans predominantly leverage visual feedback, overlooking the invaluable tactile cues that humans inherently rely on to complete complex manipulations. Addressing this gap, we introduce "MimicTouch", a novel framework that mimics a human's tactile-guided control strategy. In this framework, we first collect multi-modal tactile datasets from human demonstrators, capturing the tactile-guided control strategies they use to complete the task. We then train the robot through imitation learning on the multi-modal sensor data and retargeted human motions. To further mitigate the embodiment gap between humans and robots, we apply online residual reinforcement learning on the physical robot. Through comprehensive experiments, we validate that MimicTouch safely transfers a latent policy learned through imitation learning from human to robot. This ongoing work will pave the way for a broader spectrum of tactile-guided robotic applications.
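The residual reinforcement learning step described above can be illustrated with a minimal sketch: a frozen base policy (learned by imitation from human tactile demonstrations) is combined with a small, online-learned residual correction. All names, network interfaces, and scaling choices below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of residual RL on top of an imitation-learned base policy.
# base_policy, residual_policy, and the environment API are hypothetical
# placeholders, not MimicTouch's actual components.
import numpy as np

class ResidualAgent:
    def __init__(self, base_policy, residual_policy, residual_scale=0.1):
        self.base_policy = base_policy          # frozen, trained by imitation learning
        self.residual_policy = residual_policy  # trained online with RL
        self.residual_scale = residual_scale    # keeps corrections small for safety

    def act(self, obs):
        # Base action from the policy learned on human tactile demonstrations.
        base_action = self.base_policy(obs)
        # Small residual correction that compensates for the human-robot
        # embodiment gap.
        correction = self.residual_scale * self.residual_policy(obs)
        return np.clip(base_action + correction, -1.0, 1.0)

def rollout(env, agent, horizon=200):
    """Collect one episode of (obs, action, reward) for the online RL update."""
    obs, trajectory = env.reset(), []
    for _ in range(horizon):
        action = agent.act(obs)
        obs, reward, done, _ = env.step(action)
        trajectory.append((obs, action, reward))
        if done:
            break
    return trajectory
```

A common motivation for this kind of residual formulation is safety: only the small correction term is explored on the physical robot while the imitation-learned base behavior stays fixed, which bounds how far the executed actions can deviate from the demonstrated strategy.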