Audio Communication Group

Perceptual learning and neural plasticity in synthetic worlds. The case of distance perception

Our auditory system, with its ability to orient itself in natural acoustic environments, can today be considered well researched in many respects. As virtual and augmented realities become part of our everyday experience, however, an important question arises: to what extent can our perceptual system orient itself in a reality that is not subject to physical laws? Virtual environments can deviate from the rules of the physical world because of the limited performance of the underlying numerical simulations, because of the limited possibilities for a user to interact with the virtual environment, or because the quality of experience of the virtual environment may consist precisely in crossing the physical boundaries of the real world.

The successful learning of non-physical cues can be considered a form of neuroplasticity, i.e., an adaptation of the cortical organization of the perceptual apparatus to new experiences and demands. The plasticity of the human auditory system has been demonstrated for source localization in azimuth and elevation: subjects are able to adapt to modified localization cues within approximately one week when training with sensory-motor feedback is performed.

Far less well researched is the question to what extent the perception of distance can be learned in physical environments and can be learned or remapped in non-physical worlds. In addition, the field of audio-visual spatial perception raises the question of the relative weighting and interaction of auditory and visual information. We will therefore combine the production of acoustic stimuli, using advanced methods of room acoustical simulation, binaural synthesis, and visual 3-D rendering, with psychoacoustic investigations on the learning of physical and remapped, non-physical distance cues, and with an investigation of the neural mechanisms of distance perception and plasticity.

Project Team


DFG WE 4057/21-1


Arend, J. M., Amengual Garí, S. V., Schissler, C., Klein, F. & Robinson, P. W. (2021). Six-Degrees-of-Freedom Parametric Spatial Audio Based on One Monaural Room Impulse Response. J. Audio Eng. Soc., 69(7/8), 557–575.

Arend, J. M., Brinkmann, F. & Pörschmann, C. (2021). Assessing Spherical Harmonics Interpolation of Time-Aligned Head-Related Transfer Functions. J. Audio Eng. Soc., 69(1/2), 104–117.

Arend*, J. M., Ramírez*, M., Liesefeld, H. R. & Pörschmann, C. (2021). Do near-field cues enhance the plausibility of non-individual binaural rendering in a dynamic multimodal virtual acoustic scene? Acta Acust., 5(55), 1–14, (*equal contributions).

Brinkmann, F., Aspöck, L., Ackermann, D., Steffen, H., Vorländer, M. & Weinzierl, S. (2019). A round robin on room acoustical simulation and auralization. J. Acoust. Soc. Am., 145(4), 2746–2760.

Brinkmann, F., Lindau, A. & Weinzierl, S. (2017). On the authenticity of individual dynamic binaural synthesis. J. Acoust. Soc. Am., 142(4), 1784–1795.

Brinkmann, F., Lindau, A., Weinzierl, S., van Par, S. de, Müller-Trapet, M., Opdam, R. & Vorländer, M. (2017). A High Resolution and Full-Spherical Head-Related Transfer Function Database for Different Head-Above-Torso Orientations. J. Audio Eng. Soc, 65(10), 841–848.