
Perception of reverberation with congruent and incongruent visual representation of the scene

Day / Time: 21.03.2018, 09:20-09:40
Room: Interim 2
Type: Talk (structured session)
Abstract: In everyday environments, sound does not only travel directly from a source to a listener but also reflects off walls and objects. Humans have evolved mechanisms to suppress distracting early reflections, summarized as the precedence effect. Visual information about an acoustic scene influences the effectiveness of this mechanism. It has also been shown that similar compensation effects can occur for later, more acoustically complex echoes (i.e., reverberation), and that humans can estimate some important acoustic properties of a room when they see it. Taking these findings together, we hypothesize that the visual impression of a room leads to a reduction in its perceived reverberation. We test this hypothesis in a highly immersive audio-visual virtual reality environment built upon a ring of loudspeakers and a head-mounted display. In a magnitude estimation paradigm, subjects are asked to judge the perceived degree of reverberation in conditions where the simultaneously presented acoustic and visual stimuli either match with respect to room environment, sound source azimuth, and sound source distance, or diverge in one of these aspects. Audio-only control conditions serve as a baseline. Results from twenty normal-hearing subjects will be analyzed to assess whether perceived reverberation differs between audio-visually congruent and incongruent conditions.
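
To make the condition structure concrete, the following minimal Python sketch enumerates the design implied by the abstract: an audio-only baseline, a fully congruent audio-visual condition, and conditions that diverge in exactly one of room environment, source azimuth, or source distance. All identifiers are hypothetical illustrations and do not reflect the authors' actual stimulus or analysis code.

from dataclasses import dataclass
from typing import Optional

# Aspects in which the visual scene can diverge from the acoustic scene
ASPECTS = ("room", "azimuth", "distance")

@dataclass(frozen=True)
class Condition:
    visual: bool                              # False -> audio-only control
    mismatched_aspect: Optional[str] = None   # None -> fully congruent visuals

def build_conditions():
    """Enumerate the condition set implied by the abstract's design (hypothetical sketch)."""
    conditions = [Condition(visual=False)]            # audio-only baseline
    conditions.append(Condition(visual=True))         # congruent audio-visual condition
    for aspect in ASPECTS:                            # incongruent in exactly one aspect
        conditions.append(Condition(visual=True, mismatched_aspect=aspect))
    return conditions

if __name__ == "__main__":
    for condition in build_conditions():
        print(condition)

Running the sketch prints five conditions per stimulus, matching the design described in the abstract; perceived-reverberation ratings would then be compared between the congruent and incongruent cases against the audio-only baseline.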