Last week was spent in Dalian, China, at the World Economic Forum’s Annual Meeting of the New Champions, or ‘summer Davos’ as the AMNC is known. I was nominated by the Wellcome Trust as one of the 55 Young Scientists selected from around the globe to participate in the 2017 meeting. To be honest, I had no idea why I was going, or what I was going to do when I got there! Apparently the role of the Young Scientists is to bring scientific input to the meeting both formally, by participating in various panels and discussion groups, and less formally, by attending the dizzying programme of sessions. Sessions ranged from discussions for groups of 10-20 people, to Ideas Labs for more like 40 people, to sessions in the Arena – a circular venue seating several hundred people. Many sessions were conducted in both English and Mandarin with simultaneous translation. Themes ranged from energy to neuroscience, robotics, material sciences and countless other issues of societal importance. I particularly enjoyed the Ideas Labs, which lasted 75 minutes and featured three extremely slick and polished five-minute talks from scientists, interspersed with discussion at tables of about 10 people focussed on what we liked (on green paper), were worried about (red paper) or wanted to know (yellow paper). As each session evolved, an artist made a massive mind map encompassing the themes, while moderators – science communicators and NPR correspondents – skilfully led and guided the discussion. Since each table contained a mixture of businesspeople, scientists, entrepreneurs, media, engineers and almost anything else you can imagine, it was a great way to mix skills and experiences. Several sessions focussed on the opportunities and risks posed by virtual reality, augmented reality and artificial intelligence across a number of settings, from the creative economy to therapy, gaming and brain-computer interfaces.
Another key theme, both within and beyond the Young Scientist community, was trust in scientists – and in experts more generally. Something I’ve been puzzling about lately, and which we spent a lot of time discussing, is how to reach those who aren’t traditionally reached by science communication, and how to solve the conundrum that while scientists are trusted (at least compared to politicians, bankers and estate agents), that trust doesn’t necessarily mean people change their behaviour on issues like climate change or vaccination. Imran Khan (Wellcome Trust) made a pretty convincing argument that communicating with those who already have a passion for science is equally important: fostering the science-loving non-scientists allows them to be our ambassadors. Watch this space for a more official Young Scientist line encouraging our generation of scientists to reach out to the public as never before – personally, informally, locally.
You can’t find your phone. Fortunately, even though you left it on silent, you know that if you call it, it should vibrate loudly enough for you to track it down. You stand in the doorway of the living room using your flatmate’s phone to ring yours while listening intently for the tell-tale buzzing sound… which you eventually realise is coming from under the Saturday paper that’s been dumped on top of the coffee table. As you move around the sofa and across the room to retrieve your phone, the sound changes its position with respect to you, moving from your right to in front of you, while all the time remaining static in the world. Of course it remains static in the world, you say; phones are pretty clever, but they haven’t yet got legs! But the only information your brain has for figuring out where a sound is comes from “cues” that the brain extracts by comparing the sound waves arriving at the two ears. A sound on your right will hit the right ear sooner than the left ear, because it’s closer, and will also be louder there, because the head blocks some of the sound from reaching the far ear. So, whether you move or the sound source moves, the pattern of cues at the ears will change. How does the brain work out which of these possibilities is the case? And how are you able to tell where a sound comes from in the world when all you have available is the direction of the sound relative to yourself?
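To make the timing cue concrete, here is a toy calculation of the interaural time difference using the standard Woodworth approximation for a spherical head. The head radius (8.75 cm) and speed of sound (343 m/s) are conventional textbook values assumed for illustration, not figures from our study:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Interaural time difference via the Woodworth approximation:
    ITD = (a / c) * (theta + sin(theta)), for frontal azimuth theta in radians."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A sound straight ahead produces no time difference; one at 90 degrees
# to the side leads the near ear by roughly two-thirds of a millisecond.
print(f"{itd_seconds(0) * 1e3:.2f} ms")   # 0.00 ms
print(f"{itd_seconds(90) * 1e3:.2f} ms")  # ~0.66 ms
```

Sub-millisecond differences like these, together with the level difference created by the head’s acoustic shadow, are the raw material the brain has to work with.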
Our recent paper casts some light on these problems. Most studies of sound localisation – whether testing human listeners or auditory neurons in the brain – have measured responses to sounds presented from a ring of speakers surrounding a stationary listener at the centre of the ring. With this experimental design it’s not possible to tell whether a neuron is tuned to a position in space defined relative to the listener (to the right of the head), or whether it fires in response to sounds that come from the right-hand side of the speaker ring. These two possibilities – named ‘head-centered’ and ‘world-centered’ respectively – can be disambiguated if you allow the listener to move around: a sound coming from the right of the room will, at different points in time, arrive from a variety of locations relative to the head as the listener turns towards or away from the source. We took advantage of this by recording neural activity in freely moving ferrets foraging in an arena surrounded by a ring of speakers emitting click sounds. By precisely tracking the position of the head we were able to map out neural responses both as a function of where the sound was relative to the head and as a function of where the sound source was in the world. Our hypothesis was that if neurons in auditory cortex represent sound location in head-centered coordinates, we would only see spatial tuning when we considered where the sound was relative to the head. If, however, a neuron represented the location of the sound in the world, its firing should be modulated by where the sound actually came from, rather than by its position with respect to the head.
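The distinction between the two coordinate frames boils down to a simple geometric transform. A minimal sketch (hypothetical function name, 2-D positions in arbitrary units – not our analysis code) shows how the same fixed sound source yields different head-centred azimuths as the head turns, which is exactly the dissociation the freely moving design exploits:

```python
import math

def head_centred_azimuth(source_xy, head_xy, head_dir_deg):
    """Angle of a sound source relative to the head's facing direction,
    wrapped to (-180, 180]: 0 = straight ahead."""
    dx = source_xy[0] - head_xy[0]
    dy = source_xy[1] - head_xy[1]
    world_az = math.degrees(math.atan2(dy, dx))  # world-centred angle
    return (world_az - head_dir_deg + 180) % 360 - 180

# The source stays fixed at (1, 0), but as the head rotates the
# head-centred azimuth changes even though nothing in the world moved.
for head_dir in (0, 45, 90):
    print(head_dir, head_centred_azimuth((1, 0), (0, 0), head_dir))
```

A purely head-centred neuron would be tuned to the output of this function, while a world-centred neuron would be tuned to the source position itself, regardless of head direction; with a stationary listener the two quantities are confounded.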
We found many neurons in auditory cortex that represent sound in head-centered coordinates. This confirms what scientists had previously believed but never tested: sound localisation cues are defined relative to the head, so it’s perhaps unsurprising that this is the coordinate frame in which they are represented in auditory cortex. However, we also found a smaller population of neurons that responded to a sound’s actual position in the world. To do this, these neurons must integrate information about the animal’s own movements with visual cues that provide a map of the environment. Unlike head-centered cells, whose responses change as head position changes even when the sound source remains still, such world-centered cells produce stable responses across self-movement, potentially providing us with the ability to tell whether we moved or a sound source moved, and contributing to our stable perception of sound source location as we move through the environment.
We’re super-excited to have recently been awarded a BBSRC grant to take these findings further: our original experimental design was optimised for measuring head-centered responses and may therefore have underestimated the prevalence of world-centered tuning. In our new experiments we’ll design an arena in which, instead of being enclosed within a circle of sound sources, the ferrets can move amongst them. We will also look at how spatial coding changes across different auditory cortical subfields, and at how sound source distance is represented in the brain.