You can’t find your phone. Fortunately, even though you left it on silent, you know that if you call it, it should vibrate loudly enough for you to track it down. You stand in the doorway of the living room using your flatmate’s phone to ring yours while listening intently for the tell-tale buzzing sound… which you eventually realise is coming from under the Saturday paper that’s been dumped on top of the coffee table. As you move around the sofa and across the room to retrieve your phone, the sound changes position with respect to you, moving from your right to in front of you, but all the time remaining static in the world. Of course it remains static in the world, you say; phones are pretty clever but they haven’t yet got legs! But the only information your brain has about where a sound is comes from “cues” extracted by comparing the sound waves arriving at the two ears. A sound on your right will hit the right ear sooner than the left ear, because it’s closer, and will also be louder there, because the head blocks some of the sound from reaching the far ear. So, whether you move or the sound source moves, the pattern of cues at the ears will change. How does the brain work out which of these possibilities is the case? And how are you able to tell where a sound comes from in the world when all you have available is the direction of the sound relative to yourself?
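The timing cue described above can be sketched numerically. A common back-of-the-envelope model is the Woodworth spherical-head approximation, ITD ≈ (a/c)(θ + sin θ), where a is the head radius, c the speed of sound, and θ the source angle; the values below are illustrative assumptions, not figures from any study:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.09, speed_of_sound=343.0):
    """Approximate interaural time difference (Woodworth spherical-head model).

    azimuth_deg: source angle relative to straight ahead (positive = right).
    Returns the extra travel time (s) to the far ear; positive means the
    sound reaches the right ear first.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A sound straight ahead produces no time difference...
print(itd_seconds(0))
# ...while one directly to the right arrives earliest at the right ear
print(itd_seconds(90) * 1e6, "microseconds")
```

For a 9 cm head radius this gives roughly 675 µs at 90°, in the same ballpark as the commonly quoted maximum human interaural time difference of around 600–700 µs.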
Our recent paper casts some light on these problems. Most studies of sound localisation – whether testing human listeners or auditory neurons in the brain – have measured responses to sounds presented from a ring of speakers surrounding a stationary listener at the centre of the speaker ring. With this experimental design it’s not possible to tell whether a neuron is tuned to a position in space defined relative to the listener (to the right of the head) or whether it fires in response to sounds that come from the right of the speaker ring. These two possibilities – named ‘head-centered’ and ‘world-centered’ respectively – can be disambiguated if you allow the listener to move around. If a listener moves, a sound coming from the right of the room will, at different points in time, arrive from a variety of locations relative to the head as the listener turns towards or away from the source. We took advantage of this by recording neural activity in freely moving ferrets foraging in an arena surrounded by a ring of speakers emitting click sounds. By precisely tracking the position of the head we were able to map out neural responses as a function of where the sound was relative to the head, and where the sound source was in the world. Our hypothesis was that if neurons in auditory cortex represent sound location in head-centered coordinates, we would only see spatial tuning when we considered where the sound was relative to the head. If, however, a neuron represented the location of the sound in the world, its firing should be modulated by where the sound actually came from rather than its position with respect to the head.
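The relationship between the two coordinate frames can be sketched in a few lines: a head-centered angle is just the world-centered source angle minus the animal’s current head direction, wrapped into the range [−180°, 180°). This is a hypothetical illustration of that geometry, not the analysis code from the paper:

```python
def world_to_head(source_world_deg, head_direction_deg):
    """Angle of the source relative to the head, given both angles in
    world coordinates, wrapped into [-180, 180) degrees."""
    return ((source_world_deg - head_direction_deg + 180.0) % 360.0) - 180.0

# A speaker fixed on the right-hand side of the room (world azimuth 90 deg):
print(world_to_head(90, 0))    # facing 'north': sound is 90 deg to the right
print(world_to_head(90, 90))   # facing the speaker: sound is straight ahead
print(world_to_head(90, 180))  # facing 'south': sound is now 90 deg to the left
```

A head-centered neuron’s firing should follow the first value as the animal turns; a world-centered neuron’s firing should follow the fixed 90° world angle, whichever way the head points.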
We found many neurons in auditory cortex that represent sound in head-centered coordinates: this confirms what scientists previously believed but hadn’t tested. Sound localisation cues are defined relative to the head, so it’s perhaps unsurprising that this is the coordinate frame in which they are represented in auditory cortex. However, we also found a smaller population of neurons that responded to a sound’s actual position in the world. To do this, these neurons must have integrated information about the animal’s own movements with visual cues that provide a map of the environment. Unlike head-centered cells, whose responses change as head position changes even when the sound source remains still, world-centered cells produce stable responses across self-movement. They could therefore help solve the problem of whether we moved or the sound source moved, and contribute to our stable perception of sound source location as we move through the environment.
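One way to see why world-centered responses imply stability across self-movement is a toy tuning model: give both cell types the same Gaussian tuning curve, but feed one the head-relative angle and the other the world angle. The tuning width and preferred angles here are made up purely for illustration:

```python
import math

def wrap_deg(angle):
    """Wrap an angle in degrees into [-180, 180)."""
    return ((angle + 180.0) % 360.0) - 180.0

def tuning(azimuth_deg, preferred_deg, width_deg=30.0):
    """Gaussian tuning curve: response peaks when the azimuth
    matches the cell's preferred angle."""
    return math.exp(-0.5 * (wrap_deg(azimuth_deg - preferred_deg) / width_deg) ** 2)

sound_world = 45.0  # speaker fixed at 45 deg in the room
for head_dir in (0.0, 45.0, 90.0):
    head_rel = wrap_deg(sound_world - head_dir)
    head_cell = tuning(head_rel, preferred_deg=0.0)      # tuned to "straight ahead"
    world_cell = tuning(sound_world, preferred_deg=45.0)  # tuned to 45 deg in the room
    print(head_dir, round(head_cell, 2), round(world_cell, 2))
```

As the simulated animal turns, the head-centered cell’s response rises and falls while the world-centered cell’s response stays constant: the kind of signature the recordings can be tested against.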
We’re super-excited to have recently been awarded a BBSRC grant to take these findings further: our original experimental design was optimised for measuring head-centered responses and may therefore have underestimated the prevalence of world-centered tuning. In our new experiments we’ll design an arena in which, instead of being enclosed within a circle of sound sources, the ferrets can move amongst the speakers. We will look at how spatial coding changes across different auditory cortical subfields, and how sound source distance is represented in the brain.