Robotic Furniture 2018
Robots as furniture, integrating human-computer interfaces into the built environment
What will the universal remote control of the future look like? Will it be an object that is a conspicuous feature of the built environment, like a TV or a computer, or more like a smart speaker? Or, will it take the form of one of the ubiquitous rectangles that we carry in our pockets or that are set on our tables and desks?
What if these kinds of interfaces could hide in plain sight as the furniture around us, to be called upon when needed, and then to fade back into the built environment? This was one of the initial questions we posed that led to the development of Tbo.
Tbo is a “Situated Robot” in the sense that it is both situated and situational. It is situated in that it hides in plain sight as furniture within the built environment. But unlike most furniture, it is also mobile. Tbo helped us explore UX/UI questions that an architect, for example, might be interested in: questions of spatiality and human scale.
But Tbo is also situational, in that it affords the user the conditions for adaptive interactions. We were interested in creating a design model that allowed the user interface to disappear. This approach let us move beyond some of the limitations we had encountered with the Walkerbot, and led to robots that not only blend more seamlessly into the built environment (i.e. ubiquitous computing) but also provide a more immersive telepresence experience than most telepresence robots, which are little more than “Skype on a stick.”
Tbo is a proof of concept prototype that provided insights into four areas:
Situated Robotics: Tbo “hides in plain sight” as part of the furniture and built environment. Most of the time, Tbo is a table, which, like other tables, fits inconspicuously into the environment around it.
Mobility: Unlike most furniture, Tbo can navigate its environment using SLAM mapping. It can reposition itself in relation to the user and the environment, both locally and at a distance (i.e. telepresence). For example, it can find a wall and position itself relative to that wall to project a full-scale, standing image of a remote user, all without the need for a screen.
Telepresence: In addition to moving about the space, Tbo lets a remote user “beam into” it, providing an affordance for connections between distant locations and people. Projecting onto nearby walls and surfaces both replaces the need for a screen and creates a more immersive experience. By incorporating cameras and a projector, Tbo reshapes the user's spatial experience, creating “portals” to other, non-contiguous spaces.
Virtual Assistant/Artificial Intelligence: Tablebot is also an integrated virtual assistant platform (using the Watson API) which could be developed further into a whole family of smart, connected objects. Unlike Amazon Alexa or the Facebook Portal, which are spatially fixed, Tablebot is mobile and situated within its environment in ways that expand its functionality considerably.
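The wall-projection behavior described above reduces to simple projector geometry: once SLAM localizes a wall, the robot must stop at a standoff distance where the projected image reaches human scale. The sketch below illustrates that calculation; the function name and parameter values are our own assumptions for illustration, not part of the Tbo codebase.

```python
# Hypothetical sketch of the standoff-distance geometry for projecting
# a full-scale standing figure onto a wall. All names and numbers here
# are illustrative assumptions, not Tbo's actual implementation.

def standoff_distance(image_height_m: float, throw_ratio: float,
                      aspect_ratio: float = 16 / 9) -> float:
    """Distance from the wall at which the projected image reaches
    the desired height.

    throw_ratio = distance / image_width (a standard projector spec),
    so distance = throw_ratio * image_width, and image_width follows
    from the desired height and the projector's aspect ratio.
    """
    image_width = image_height_m * aspect_ratio
    return throw_ratio * image_width

# Example: an assumed short-throw projector (throw ratio 0.5)
# showing a life-size 1.8 m figure:
d = standoff_distance(1.8, 0.5)  # 1.6 m from the wall
```

In practice the robot would feed a target like `d` into its SLAM-based navigation goal: drive toward the detected wall, stop at the computed distance, and orient the projector perpendicular to the surface.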
Gonsher, I. and Kim, S. (2020). Robots as Furniture, Integrating Human-Computer Interfaces into the Built Environment. In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’20 Companion), March 23–26, 2020, Cambridge, United Kingdom. ACM, New York, NY, USA, 3 pages. https://doi.org/10.1145/3371382.3378235
Gonsher, I. (2018). Demo hour. ACM Interactions 25, 4 (June 2018), 8-11. DOI: https://doi.org/10.1145/3226034
Ian Gonsher (PI)
Maartje de Graaf