

Contributions in conference proceedings:

M. Mirzaei, P. Kán, H. Kaufmann:
"Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People";
in: "IEEE Virtual Reality and 3D User Interfaces (VR)", IEEE Computer Society, 2021, pp. 582-587.



Abstract (English):
This paper presents a novel method for the visualization of 3D spatial sounds in Virtual Reality (VR) for Deaf and Hard-of-Hearing (DHH) people. Our method enhances traditional VR devices with additional haptic and visual feedback, which aids spatial sound localization. The proposed system automatically analyzes the 3D sound of a VR application and indicates the direction of sound sources to the user via two vibro-motors and two Light-Emitting Diodes (LEDs). The benefit of automatic sound analysis is that our method can be used with any VR application without modifying the application itself. We evaluated the proposed method for 3D spatial sound visualization in a user study. Additionally, the user study investigated which condition (corresponding to different senses) leads to faster performance in a 3D sound localization task. For this purpose, we compared three conditions: haptic feedback only, LED feedback only, and combined haptic and LED feedback. Our results suggest that DHH participants completed sound-related VR tasks significantly faster under the LED and haptic+LED conditions than with haptic feedback only. The presented method for spatial sound visualization can be used directly to make VR applications accessible to DHH persons, and the results of our user study can serve as guidelines for the future design of accessible VR systems.
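The core idea described above, mapping the direction of a detected sound source to left/right haptic and LED feedback, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the linear intensity scaling, and the left/right-only mapping are assumptions for the sake of example.

```python
import math

def direction_feedback(source_pos, head_pos, head_yaw_deg):
    """Map a 3D sound source to (left, right) actuator intensities in [0, 1].

    Hypothetical helper: a source to the listener's left drives the left
    vibro-motor/LED, and vice versa; intensity grows with the azimuth angle.
    The paper does not publish its exact mapping.
    """
    dx = source_pos[0] - head_pos[0]
    dz = source_pos[2] - head_pos[2]
    # Azimuth of the source relative to the head's facing direction (+z forward).
    azimuth = math.degrees(math.atan2(dx, dz)) - head_yaw_deg
    azimuth = (azimuth + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    # Scale linearly up to 90 degrees off-center; positive azimuth = right side.
    strength = min(abs(azimuth) / 90.0, 1.0)
    if azimuth >= 0:
        return (0.0, strength)
    return (strength, 0.0)
```

In a real system these intensities would be written to the PWM channels driving the vibro-motors and the LED brightness on each side of the headset.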

Keywords:
Virtual Reality, Haptic, Vision, Sound Localization, Deaf, Hard-of-Hearing


"Official" electronic version of the publication (according to its Digital Object Identifier, DOI)
http://dx.doi.org/10.1109/VR50410.2021.00083


Generated from the publication database of TU Wien.