Abstract
In recent years, assistive technology and digital accessibility for blind and visually impaired people (BVIP) have improved significantly. Yet group discussions, especially in a business context, remain challenging, as non-verbal communication (NVC) is often depicted on digital whiteboards, including deictic gestures paired with visual artifacts. However, since NVC relies heavily on visual perception, which carries a large amount of detail, an adaptive approach is required that identifies the most relevant information for BVIP. Additionally, visual artifacts usually rely on spatial properties such as position, orientation, and dimensions to convey essential information such as hierarchy, cohesion, and importance, which is often not accessible to BVIP. In this paper, we investigate the requirements of BVIP during brainstorming sessions and, based on our findings, provide an accessible multimodal tool that uses non-verbal and spatial cues as an additional layer of information. Further, we contribute by presenting a set of input and output modalities that encode and decode information with respect to the individual demands of BVIP and the requirements of different use cases.
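As a purely illustrative sketch of how spatial properties of whiteboard artifacts could be encoded into a non-visual output channel, consider the following Python fragment; the Artifact class, the 3x3 region grid, and the size threshold are hypothetical assumptions and are not taken from the paper.

from dataclasses import dataclass

@dataclass
class Artifact:
    label: str    # text of a sticky note or card on the whiteboard
    x: float      # horizontal position, normalized 0.0 (left) to 1.0 (right)
    y: float      # vertical position, normalized 0.0 (top) to 1.0 (bottom)
    width: float  # size relative to the whiteboard, 0.0 to 1.0
    height: float

def region(x: float, y: float) -> str:
    """Map normalized coordinates onto a coarse 3x3 grid of board regions."""
    col = ["left", "center", "right"][min(int(x * 3), 2)]
    row = ["top", "middle", "bottom"][min(int(y * 3), 2)]
    return f"{row} {col}"

def describe(a: Artifact) -> str:
    """Turn spatial cues (position, size) into a short verbal description."""
    size = "large" if a.width * a.height > 0.05 else "small"
    return f"{size} note '{a.label}' in the {region(a.x, a.y)} of the board"

if __name__ == "__main__":
    note = Artifact(label="Budget", x=0.1, y=0.15, width=0.3, height=0.2)
    print(describe(note))  # -> "large note 'Budget' in the top left of the board"

A description like this could then be routed to a screen reader or other auditory or haptic output, in line with the multimodal approach the abstract describes.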
Permanent link
https://doi.org/10.3929/ethz-b-000440125
Publication status
published
Book title
Computers Helping People with Special Needs, 17th International Conference, ICCHP 2020, Lecco, Italy, September 9–11, 2020, Proceedings, Part II
Journal / series
Lecture Notes in Computer Science
Publisher
Springer International Publishing
Subject
Virtual Reality; BLIND + VISUALLY IMPAIRED (PERSONS)
Organisational unit
08844 - Kunz, Andreas (Tit.-Prof.) / Kunz, Andreas (Tit.-Prof.)
Notes
Conference lecture held on September 11, 2020
ETH Bibliography
yes