Haptic and Auditory Feedback on Immersive Media in Virtual Reality | Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology

Contents

In Virtual Reality (VR), visual and auditory sensations are effectively leveraged to create immersive experiences. However, touch remains significantly underutilized in immersive media. We enhance the VR image viewing experience by integrating haptic and auditory feedback into 3D environments constructed from immersive media, and we address the challenges of using depth maps from various image formats to create interactive environments. The VR experience is enhanced with vibrohaptic feedback and audio cues triggered by controller collisions with haptic materials.
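
The page gives no implementation details for the reconstruction step, but the core idea (a monocular depth map displacing image pixels into 3D geometry that controllers can collide with) can be illustrated with a short sketch. The following Python/NumPy example is a hedged illustration only: the pinhole back-projection, the `focal_px` parameter, and the toy depth map are assumptions made for this sketch, not details taken from the paper, and real immersive formats such as equirectangular or VR180 images would need their own projection models.

```python
import numpy as np

def depth_map_to_vertices(depth: np.ndarray, focal_px: float = 500.0) -> np.ndarray:
    """Back-project an (H, W) depth map into an (H*W, 3) array of 3D points.

    Assumes a simple pinhole model with the principal point at the image
    centre; this is an illustrative assumption, not the paper's method.
    """
    h, w = depth.shape
    # Pixel grid centred on the principal point.
    u, v = np.meshgrid(np.arange(w) - w / 2.0, np.arange(h) - h / 2.0)
    # Pinhole back-projection: x = u * z / f, y = v * z / f, z = depth.
    z = depth
    x = u * z / focal_px
    y = -v * z / focal_px  # flip so +y points up in the 3D scene
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

if __name__ == "__main__":
    # Toy depth map: a flat backdrop 3 m away with a nearer bump in the middle.
    depth = np.full((64, 64), 3.0)
    depth[24:40, 24:40] = 1.5
    verts = depth_map_to_vertices(depth)
    print(verts.shape)  # (4096, 3)
```

The resulting vertex grid could be triangulated into a relief mesh and given a collider so that controller contact can drive the feedback described above.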

2023. Touchly. https://touchly.app

2024. Haptics Studio. https://developer.oculus.com/resources/haptics-studio/

2024. immerGallery. https://immervr.com

2024. Immersity AI. https://www.immersity.ai/

Shariq Farooq Bhat, Reiner Birkl, Diana Wofk, Peter Wonka, and Matthias Müller. 2023. ZoeDepth: Zero-shot Transfer by Combining Relative and Metric Depth. arXiv:2302.12288

Frank Biocca, Yasuhiro Inoue, Andy Lee, and Heather Polinsky. 2002. Visual cues and virtual touch: Role of visual stimuli and intersensory integration in cross-modal haptic illusions and the sense of presence. (01 2002).

Yue Ming, Xuyang Meng, Chunxiao Fan, and Hui Yu. 2021. Deep learning for monocular depth estimation: A review. Neurocomputing 438 (2021), 14–33. https://doi.org/10.1016/j.neucom.2020.12.089

Tomas Novacek and Marcel Jirina. 2020. Overview of Controllers of User Interface for Virtual Reality. PRESENCE: Virtual and Augmented Reality 29 (12 2020), 37–90. https://doi.org/10.1162/pres_a_00356

Manuel Rey-Area, Mingze Yuan, and Christian Richardt. 2022. 360MonoDepth: High-Resolution 360° Monocular Depth Estimation. In CVPR.

Antony Tang, Mark Billinghurst, Samuel Rosset, and Iain Anderson. 2023. Enhancing Virtual Material Perception with Vibrotactile and Visual Cues. In 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). 1011–1012. https://doi.org/10.1109/VRW58643.2023.00351

Summary
The article discusses the underutilization of touch in Virtual Reality (VR) and proposes enhancements to the VR image viewing experience by integrating haptic and auditory feedback within 3D environments. It addresses the challenges of using depth maps from various image formats to create interactive environments. The authors enhance the VR experience through vibrohaptic feedback and audio cues that are activated when controllers interact with haptic materials. This integration aims to create a more immersive experience by leveraging multiple sensory inputs, particularly focusing on the tactile aspect, which has been less explored compared to visual and auditory elements. The article references various studies and resources related to depth estimation, haptic feedback, and user interface controllers in VR, highlighting the ongoing research and development in this field. The authors emphasize the importance of cross-modal integration to improve the sense of presence in virtual environments, ultimately aiming to create a more engaging and realistic VR experience for users.
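
Neither the abstract nor the summary specifies how collisions are mapped to actuator commands. As a purely hypothetical, engine-agnostic illustration, the Python sketch below models one plausible mapping from a per-material parameter set and a controller impact speed to a vibration amplitude, pulse duration, and audio cue; the `HapticMaterial` fields, the linear velocity-to-amplitude curve, and all constants are assumptions for this example rather than the authors' design.

```python
from dataclasses import dataclass

@dataclass
class HapticMaterial:
    """Hypothetical per-material feedback parameters."""
    name: str
    base_amplitude: float    # vibration strength at the reference speed, 0..1
    pulse_duration_s: float  # length of one vibration burst
    audio_clip: str          # identifier of the contact sound to play

def feedback_for_collision(material: HapticMaterial,
                           impact_speed: float,
                           reference_speed: float = 1.0) -> tuple[float, float, str]:
    """Map a controller collision to (amplitude, duration, audio clip).

    Amplitude scales linearly with impact speed up to the reference speed
    and is clamped to the actuator range [0, 1]; this curve is an assumption
    chosen for simplicity.
    """
    scale = min(impact_speed / reference_speed, 1.0)
    amplitude = min(material.base_amplitude * scale, 1.0)
    return amplitude, material.pulse_duration_s, material.audio_clip

if __name__ == "__main__":
    wood = HapticMaterial("wood", base_amplitude=0.6, pulse_duration_s=0.05,
                          audio_clip="tap_wood")
    print(feedback_for_collision(wood, impact_speed=0.8))
    # -> amplitude ≈ 0.48, duration 0.05 s, clip 'tap_wood'
```

In an actual VR runtime, the returned amplitude and duration would be passed to the controller's vibration API and the clip played at the contact point, so that the tactile and auditory cues reinforce each other as the cross-modal literature cited above suggests.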