My Caregiver the Cobot: Comparing Visualization Techniques to Effectively Communicate Cobot Perception to People with Physical Impairments
Abstract
Nowadays, robots are found in a growing number of areas where they collaborate closely with humans. Enabled by lightweight materials and safety sensors, these cobots are gaining increasing popularity in domestic care, where they support people with physical impairments in their everyday lives. However, when cobots perform actions autonomously, it remains challenging for human collaborators to understand and predict their behavior, which is crucial for achieving trust and user acceptance. One significant aspect of predicting cobot behavior is understanding their perception and comprehending how they “see” the world. To tackle this challenge, we compared three different visualization techniques for Spatial Augmented Reality. All of these communicate cobot perception by visually indicating which objects in the cobot’s surroundings have been identified by its sensors. We compared the well-established visualizations <i>Wedge</i> and <i>Halo</i> against our proposed visualization <i>Line</i> in a remote user experiment with participants with physical impairments. In a second remote experiment, we validated these findings with a broader, non-specific user base. Our findings show that <i>Line</i>, a lower-complexity visualization, results in significantly faster reaction times compared to <i>Halo</i>, and lower task load compared to both <i>Wedge</i> and <i>Halo</i>. Overall, users prefer <i>Line</i> as a more straightforward visualization. In Spatial Augmented Reality, with its known disadvantage of limited projection area size, established off-screen visualizations are not effective in communicating cobot perception, and <i>Line</i> presents an easy-to-understand alternative.
Topics & Keywords
Authors (6)
Max Pascher
Kirill Kronhardt
Til Franzen
Uwe Gruenefeld
Stefan Schneegass
Jens Gerken
Quick Access
- Publication Year
- 2022
- Database Source
- DOAJ
- DOI
- 10.3390/s22030755
- Access
- Open Access ✓