
Acoustic Localization and Voice User Interfaces for Robotic Furniture (2025)
Recent advances in Natural Language Processing (NLP) have allowed Voice User Interfaces (VUIs) to mature to the point where their integration into everyday objects is becoming inevitable. In the coming years, people will increasingly talk to the things around them, and these everyday objects will respond in kind. Furniture in particular, and the built environment in general, are likely to become sites for inconspicuous, sound-based user interfaces. This offers an opportunity to imagine how sound might be integrated into a range of interfaces. This study demonstrates that a single sensor modality, in this case microphones, can serve multiple applications: both as a VUI and as a means of localization. These sound-based interfaces have been integrated into the TableBot prototype, which hides in plain sight as a table until called upon to perform a given task. The prototype validates a model in which the same microphones that capture voice commands also perform acoustic localization. When you speak to this table, it knows where you are and can orient itself toward the source of your command.
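The abstract does not specify the localization algorithm, but a common way to localize a sound source with a microphone pair is to estimate the time difference of arrival (TDOA) between the two channels via cross-correlation, then convert that delay into a bearing angle. The sketch below illustrates the general idea only; the function names, sampling rate, and microphone spacing are illustrative assumptions, not details from the paper.

```python
import numpy as np

def estimate_delay(sig_a, sig_b, fs):
    """Estimate how many seconds later sig_b arrives than sig_a,
    using the peak of the full cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_a) - 1)
    return lag / fs

def direction_of_arrival(tdoa, mic_spacing, c=343.0):
    """Bearing in degrees relative to broadside of a two-mic pair,
    from the far-field relation sin(theta) = c * tdoa / d."""
    s = np.clip(c * tdoa / mic_spacing, -1.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

# Synthetic check: a click reaches mic B three samples after mic A.
fs = 16_000                      # assumed sampling rate (Hz)
pulse = np.zeros(256)
pulse[100] = 1.0
mic_a = pulse
mic_b = np.roll(pulse, 3)

tdoa = estimate_delay(mic_a, mic_b, fs)
angle = direction_of_arrival(tdoa, mic_spacing=0.2)  # assumed 20 cm spacing
```

With more than two microphones, as on a table-sized array, the pairwise delays can be combined (e.g. by least squares) to localize the speaker rather than just a bearing.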
Ian Gonsher, John Finberg, Nicolas Perez, Joshua Phelps, and Siddharth Diwan (forthcoming paper accepted for publication)