Perception Deception: Audio-Visual Mismatch in Virtual Reality Using The McGurk Effect.

AbuBakr Siddig, Pheobe Wenyi Sun, Matthew Parker, Andrew Hines

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Audio-visual synchronisation is a major challenge in the Virtual Reality (VR) industry. Studies investigating the effect of incongruent multisensory stimuli have a direct impact on the design of immersive experiences. In this paper, we explored the effect of audio-visual mismatch on sensory integration in a VR context. Inspired by the McGurk effect, we designed an experiment addressing several critical concerns in today's VR content production, including sound spatialisation and unisensory signal quality. The results confirm previous studies using 2D videos, in which audio spatial separation had no significant impact on the McGurk effect, yet the findings raise new questions about future compression and multisensory signal design strategies for optimising perceptual immersion in a 3D context.
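The keyword list below mentions Ambisonics, the spatial audio format commonly used for sound spatialisation in VR. As a purely illustrative, minimal sketch (not the authors' implementation), the snippet below shows how a mono source can be encoded to first-order Ambisonics at an arbitrary azimuth and elevation, i.e. the kind of spatial offset between the sound and the on-screen talker that such an experiment manipulates. The function name and parameters are hypothetical.

```python
import numpy as np

def encode_foa(mono, azimuth_deg, elevation_deg):
    """Hypothetical helper: encode a mono signal to first-order Ambisonics
    (AmbiX convention: ACN channel order, SN3D normalisation), placing the
    source at the given azimuth/elevation relative to the listener."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono                              # omnidirectional component
    y = mono * np.sin(az) * np.cos(el)    # left-right component
    z = mono * np.sin(el)                 # up-down component
    x = mono * np.cos(az) * np.cos(el)    # front-back component
    return np.stack([w, y, z, x])         # shape: (4, n_samples)

# Example: a 1 kHz tone placed 30 degrees to the left of the talker, at ear level.
fs = 48000
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
bformat = encode_foa(tone, azimuth_deg=30, elevation_deg=0)
```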
Original language: English
Title of host publication: Proceedings of the 27th Irish Conference on Artificial Intelligence and Cognitive Science
Publisher: CEUR-WS
Volume: 2563
Publication status: Published - 2019

Publication series

Name: Proceedings of the 27th Irish Conference on Artificial Intelligence and Cognitive Science

Keywords

  • Ambisonics
  • Virtual Reality
  • McGurk effect
