Humans manipulate objects using all of their senses, including sound and touch: audio can indicate whether a door has been unlocked or an egg has been properly cracked. Prior work has shown that humans can use auditory feedback alone to categorize types of events and to infer continuous properties of those events, such as the length of a wooden dowel being struck [1]. However, microphones remain underexplored in robotics, especially in their potential role as tactile vibration sensors. In this work, we investigate contact audio as an alternative tactile modality for complex manipulation tasks that are difficult to solve from vision alone. Contact microphones record the vibrations of anything they are in direct contact with at high frequency (1000 times the sampling frequency of the next most common tactile sensor [2]), which makes them well suited as tactile sensors for object manipulation. Furthermore, contact audio is immune to many of the environmental variations that plague vision, such as changes in lighting and color, making it promising for the transfer-learning and multi-task settings common in robotics.
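To illustrate how contact audio might enter a learning pipeline, the sketch below shows one plausible way to turn a contact-microphone chunk into log-mel spectrogram features that a policy could consume alongside camera images. This is not the authors' implementation: the 16 kHz sample rate, the one-second window, the spectrogram parameters, and the audio_to_features helper are all illustrative assumptions.

```python
# Minimal sketch (assumed parameters, not the paper's pipeline): encode a
# contact-microphone chunk as a log-mel spectrogram "image" for a policy.
import torch
import torchaudio

SAMPLE_RATE = 16_000     # assumed contact-mic sampling rate (Hz)
CHUNK_SECONDS = 1.0      # assumed audio window paired with each camera frame

# Mel spectrogram transform; n_fft/hop_length/n_mels are illustrative choices.
mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE,
    n_fft=512,
    hop_length=128,
    n_mels=64,
)

def audio_to_features(waveform: torch.Tensor) -> torch.Tensor:
    """Convert a (channels, samples) contact-mic chunk to log-mel features."""
    spec = mel(waveform)              # (channels, n_mels, time_frames)
    return torch.log(spec + 1e-6)     # log scale compresses the dynamic range

# Example: a random 1-second mono signal standing in for real contact audio.
chunk = torch.randn(1, int(SAMPLE_RATE * CHUNK_SECONDS))
features = audio_to_features(chunk)
print(features.shape)                 # e.g. torch.Size([1, 64, 126])
```

In a setup like this, the resulting spectrogram can be treated as an extra low-resolution image channel and stacked with visual observations before being fed to the policy network.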
Author Information
Shaden Alshammari (Massachusetts Institute of Technology)
Victoria Dean (Carnegie Mellon University)
Tess Hellebrekers (Meta AI)
Pedro Morgado (University of Wisconsin - Madison)
Abhinav Gupta (Carnegie Mellon University Robotics Institute)
More from the Same Authors
- 2021 : KitchenShift: Evaluating Zero-Shot Generalization of Imitation-Based Policy Learning Under Domain Shifts »
  Eliot Xing · Abhinav Gupta · Samantha Powers · Victoria Dean
- 2022 : Multispectral Masked Autoencoder for Remote Sensing Representation Learning »
  Yibing Wei · Zhicheng Yang · Hang Zhou · Mei Han · Pedro Morgado · Jui-Hsin Lai
- 2022 : Shared Hardware, Shared Baselines: An Offline Robotics Benchmark »
  Gaoyue Zhou · Victoria Dean
- 2022 : Train Offline, Test Online: A Real Robot Learning Benchmark »
  Gaoyue Zhou · Victoria Dean · Mohan Kumar Srirama · Aravind Rajeswaran · Jyothish Pari · Kyle Hatch · Aryan Jain · Tianhe Yu · Pieter Abbeel · Lerrel Pinto · Chelsea Finn · Abhinav Gupta
- 2022 : Real World Offline Reinforcement Learning with Realistic Data Source »
  Gaoyue Zhou · Liyiming Ke · Siddhartha Srinivasa · Abhinav Gupta · Aravind Rajeswaran · Vikash Kumar
- 2022 Poster: Learning State-Aware Visual Representations from Audible Interactions »
  Himangi Mittal · Pedro Morgado · Unnat Jain · Abhinav Gupta
- 2022 Poster: A Closer Look at Weakly-Supervised Audio-Visual Source Localization »
  Shentong Mo · Pedro Morgado
- 2021 Oral: Interesting Object, Curious Agent: Learning Task-Agnostic Exploration »
  Simone Parisi · Victoria Dean · Deepak Pathak · Abhinav Gupta
- 2021 Poster: Interesting Object, Curious Agent: Learning Task-Agnostic Exploration »
  Simone Parisi · Victoria Dean · Deepak Pathak · Abhinav Gupta
- 2020 Workshop: Differentiable computer vision, graphics, and physics in machine learning »
  Krishna Murthy Jatavallabhula · Kelsey Allen · Victoria Dean · Johanna Hansen · Shuran Song · Florian Shkurti · Liam Paull · Derek Nowrouzezahrai · Josh Tenenbaum
- 2020 : Opening remarks »
  Krishna Murthy Jatavallabhula · Kelsey Allen · Johanna Hansen · Victoria Dean