
SonicSense: Object Perception from In-Hand Acoustic Vibration

Jiaxun Liu, Boyuan Chen
Duke University

Overview

We introduce SonicSense, a holistic hardware and software design that enables rich robot object perception through in-hand acoustic vibration sensing. While previous studies have shown promising results with acoustic sensing for object perception, existing solutions are constrained to a handful of objects with simple geometries and homogeneous materials, to single-finger sensing, and to evaluation protocols that mix training and testing on the same objects. SonicSense enables container inventory status differentiation, heterogeneous material prediction, 3D shape reconstruction, and object re-identification on a diverse set of 83 real-world objects. Our system combines a simple but effective heuristic exploration policy for interacting with objects with end-to-end learning-based algorithms that fuse vibration signals to infer object properties. Our framework underscores the significance of in-hand acoustic vibration sensing in advancing robot tactile perception.
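To make the signal-fusion idea concrete, below is a minimal, illustrative sketch of pooling spectrogram embeddings from several fingers' tap recordings into a single material prediction. Everything here (the number of fingers, the feature dimensions, the model itself) is a hypothetical simplification for exposition, not our actual pipeline; see the material classification subdirectory for the real training and inference code.

```python
# Illustrative sketch only -- not the SonicSense training code.
# Assumes per-tap audio from each of four fingers, fused by mean-pooling
# spectrogram embeddings before a small classification head.
import torch
import torch.nn as nn
import torchaudio

N_FINGERS = 4       # hypothetical: one contact microphone per finger
N_MATERIALS = 9     # hypothetical number of material classes

mel = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_mels=64)

encoder = nn.Sequential(                       # embeds one tap's spectrogram
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),     # -> (n_fingers, 16)
)
head = nn.Linear(16, N_MATERIALS)

def predict_material(taps: torch.Tensor) -> torch.Tensor:
    """taps: (n_fingers, n_samples) raw audio from one exploration step."""
    spec = mel(taps).unsqueeze(1)              # (n_fingers, 1, n_mels, time)
    emb = encoder(spec)                        # (n_fingers, 16)
    fused = emb.mean(dim=0, keepdim=True)      # pool across fingers
    return head(fused)                         # (1, n_materials) logits

logits = predict_material(torch.randn(N_FINGERS, 16000))
print(logits.shape)  # torch.Size([1, 9])
```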


Code Structure

We provide detailed instructions for running our code for material classification, shape reconstruction, and object re-identification; please refer to the README file under each subdirectory.

The full CAD models and assembly instructions for our hardware design are in the Hardware_instruction subdirectory.

Citation

If you find our paper or codebase helpful, please consider citing:

```bibtex
@inproceedings{liu2024sonicsense,
  title={SonicSense: Object Perception from In-Hand Acoustic Vibration},
  author={Jiaxun Liu and Boyuan Chen},
  booktitle={8th Annual Conference on Robot Learning},
  year={2024},
  url={https://openreview.net/forum?id=CpXiqz6qf4}
}
```

License

This repository is released under the Apache License 2.0. See LICENSE for additional details.

Acknowledgement

Our codebase builds on Point Cloud Renderer and PyLX-16A.

This work is supported by the ARL STRONG program under awards W911NF2320182 and W911NF2220113, by the DARPA FoundSci program under award HR00112490372, and by the DARPA TIAMAT program under award HR00112490419.