Paper: PS-2A.51
Session: Poster Session 2A
Location: H Lichthof
Session Time: Sunday, September 15, 17:15 - 20:15
Presentation Time: Sunday, September 15, 17:15 - 20:15
Presentation: Poster
Publication: 2019 Conference on Cognitive Computational Neuroscience, 13-16 September 2019, Berlin, Germany
Paper Title: Computing Sound Space: World-centered Sound Localization in Ferrets
License: This work is licensed under a Creative Commons Attribution 3.0 Unported License.
DOI: https://doi.org/10.32470/CCN.2019.1113-0
Authors: Stephen Town, Jennifer Bizley, University College London, United Kingdom
Abstract: The ability to localize sounds is central to healthy hearing. We can perceive sound location in multiple coordinate systems, including those defined by the observer (e.g. "the phone is on my right") or by the environment (e.g. "the phone is in the office"). Although we can describe sound locations in multiple spaces, the coordinate frames in which non-human animals can perceive sounds remain unclear. Here, we designed a task that required subjects (ferrets) to report the location of sounds in the world across changes in head pose. We developed simulations of the task using world-centered (allocentric) or head-centered (egocentric) models of spatial processing, and compared model predictions to animal behavior. We found that observed behavior most closely matched the performance of allocentric models, indicating that subjects solved the task using a world-centered strategy. Our findings indicate that ferrets, like humans, can perceive allocentric sound space and thus abstract sound location beyond momentary head-centered acoustic cues.
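The abstract's core logic (that allocentric and egocentric models make diverging predictions once head pose varies) can be illustrated with a toy simulation. Everything below is a hypothetical sketch, not the paper's actual task or model: the speaker angles, head poses, and two-alternative report are invented for illustration. The point is only that a head-centered classifier mislabels fixed world locations when the head turns, while a world-centered classifier does not.

```python
import random

# Illustrative toy setup (not from the paper): two speakers at fixed
# world-centered angles; the subject must report which world location
# produced the sound, under varying head orientations.
SPEAKERS = {"left": -15.0, "right": 15.0}   # world angles (degrees)
HEAD_POSES = [-45.0, 0.0, 45.0]             # head orientation in world (degrees)

def allocentric_response(world_angle, head_pose):
    """World-centered model: classify by world coordinates, so the
    response is invariant to head pose."""
    return "left" if world_angle < 0 else "right"

def egocentric_response(world_angle, head_pose):
    """Head-centered model: classify by the angle relative to the head,
    so the same world location can be mislabeled when the head turns."""
    head_relative = world_angle - head_pose
    return "left" if head_relative < 0 else "right"

def accuracy(model, n_trials=1000, seed=0):
    """Fraction of simulated trials on which the model reports the
    correct world location."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        target = rng.choice(sorted(SPEAKERS))
        pose = rng.choice(HEAD_POSES)
        if model(SPEAKERS[target], pose) == target:
            correct += 1
    return correct / n_trials

if __name__ == "__main__":
    print("allocentric accuracy:", accuracy(allocentric_response))
    print("egocentric accuracy: ", accuracy(egocentric_response))
```

With these invented numbers the allocentric model is always correct, while the egocentric model errs whenever a head turn carries a speaker across the head's midline (roughly one third of trials here), mirroring the kind of behavioral signature the authors compare against.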