Bizley Lab
  • Home
  • Projects
  • Publications
  • Funding
  • People
  • Outreach
  • SOUNDSCENE
  • Participate
  • MSc projects '25

Masters projects 2025-26

Projects are grouped loosely by theme; each description is a starting point for what a project might look like. Methods include behaviour (ferret and human), neurophysiology and computational analysis.

Auditory Space
We have recently designed a large (4 m x 2 m) arena. Underneath an acoustically transparent floor are 40 speakers that can be used to generate immersive auditory scenes. Traditional studies of sound localisation typically fix subjects in place in the middle of a speaker ring. Here, by allowing subjects to move freely, and by combining behavioural experiments with high-speed videography (to track 3D head movements) and implanted Neuropixels probes, we can study how active sensation supports sound localisation. Masters projects could include analysing behaviour, head movement data and/or neural data, as well as contributing to all phases of data collection. Questions we would like to address include: How are motor signals integrated with sensory coding? In what reference frames do auditory cortical neurons encode space? How do head movements support sound perception? Experience in coding would be a strong advantage.
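The reference-frame question can be illustrated with a toy coordinate transform. The sketch below is purely illustrative (the function name and angle conventions are our own, not the lab's analysis code): a speaker at a fixed room (allocentric) position has an egocentric bearing that changes as the head turns.

```python
import math

def egocentric_angle(head_xy, head_heading_deg, source_xy):
    """Bearing of a sound source relative to the head (degrees), given
    head position and heading in room (allocentric) coordinates.
    0 deg = straight ahead; angles wrap to [-180, 180)."""
    dx = source_xy[0] - head_xy[0]
    dy = source_xy[1] - head_xy[1]
    allocentric = math.degrees(math.atan2(dy, dx))  # bearing in the room frame
    rel = allocentric - head_heading_deg            # rotate into the head frame
    return (rel + 180.0) % 360.0 - 180.0

# The same fixed speaker, before and after a 90-degree head turn:
print(egocentric_angle((0, 0), 0, (1, 1)))   # ~45: ahead and to one side
print(egocentric_angle((0, 0), 90, (1, 1)))  # ~-45: now on the other side
```

An allocentric (world-centred) neuron would keep responding to the same speaker across the turn; an egocentric (head-centred) neuron would not.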

Town, Brimijoin and Bizley, Egocentric and Allocentric Representations in Auditory Cortex https://doi.org/10.1371/journal.pbio.2001878
Wallace et al., Eye saccades align optic flow with retinal specializations during object pursuit in freely moving ferrets, Current Biology 2025 10.1016/j.cub.2024.12.032

Ferret hippocampus
The ferret is a key animal model for auditory and visual research. It is also an excellent animal for hippocampal and sleep research, thanks to its use of distal sensing strategies and high proportion of REM sleep. However, very little is known about the hippocampus in this species; while we have explored theta rhythms in the ferret, the nature of hippocampal spatial response properties - such as place cells and grid cells - is yet to be determined. We have a dataset of ferret hippocampal recordings, made from chronically implanted Neuropixels probes during combinations of sleep, navigation on a linear track and exploration of the open field, designed to enable exploration of spatial representations. This project could be entirely computational analysis, or could include additional data collection from currently implanted ferrets.
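To give a flavour of the computational side, the standard first step in screening for place cells is an occupancy-normalised firing-rate map. The sketch below is illustrative only - the function name and toy data are ours, not the lab's pipeline:

```python
def rate_map(positions, spike_positions, track_len, n_bins, dt):
    """positions: animal position (cm) at each video frame, dt seconds apart.
    spike_positions: animal position at the time of each spike.
    Returns firing rate (Hz) per spatial bin (None where never visited)."""
    width = track_len / n_bins
    occupancy = [0.0] * n_bins     # seconds spent in each bin
    spikes = [0] * n_bins          # spikes fired in each bin
    for p in positions:
        occupancy[min(int(p // width), n_bins - 1)] += dt
    for p in spike_positions:
        spikes[min(int(p // width), n_bins - 1)] += 1
    return [s / o if o > 0 else None for s, o in zip(spikes, occupancy)]

# Toy run: uniform coverage of a 100 cm track, a cell firing only at 50 cm
positions = [i % 100 for i in range(1000)]        # 100 s of frames at 10 Hz
rates = rate_map(positions, [50.0] * 20, track_len=100, n_bins=10, dt=0.1)
# rates shows a single ~2 Hz peak in the bin containing 50 cm
```

Normalising spike counts by occupancy matters because an animal that lingers in one spot will fire more spikes there even if the cell has no spatial tuning.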

Norris and Bizley, Ferret contributions to sensory neurobiology, Current Opinion in Neurobiology 2025, https://pubmed.ncbi.nlm.nih.gov/39488005/
Dunn et al., Behaviourally modulated hippocampal theta oscillations in the ferret persist during both locomotion and immobility, Nature Communications, 2022 https://www.nature.com/articles/s41467-022-33507-2

Audiovisual integration and auditory selective attention
Seeing a talker's face offers a substantial advantage to listeners in noisy situations. While much of this advantage undoubtedly arises from the phonetic information conveyed by mouth movements, through a series of studies in humans and animals we have explored how vision supports listening in a more fundamental way. We have demonstrated that visual information can support auditory scene analysis: in humans, in single neurons in auditory cortex, and in trained ferrets. In this project we have trained animals to perform an auditory selective attention task and are recording neural activity in auditory cortex. Our goal is to understand how selective attention and audiovisual integration interact within auditory cortex. A systems neuroscience project would allow you to participate in all elements of data collection and neural analysis; experience in coding would be an advantage. A psychophysics project would involve collecting data from human listeners to better understand whether temporal coherence supports auditory scene analysis when discriminating speech in noise.
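As a toy illustration of the temporal-coherence idea, one simple proxy is the correlation between an auditory amplitude envelope and a visual signal such as mouth aperture: a visual stream that tracks the envelope correlates strongly with it, while an independently modulated stream does not. All signal names here are illustrative assumptions, not the lab's stimuli:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

t = [i / 100 for i in range(200)]                          # 2 s at 100 Hz
envelope = [1 + math.sin(2 * math.pi * 3 * s) for s in t]  # 3 Hz modulation
coherent = [0.5 * e for e in envelope]          # visual tracks the envelope
independent = [1 + math.sin(2 * math.pi * 7 * s) for s in t]

print(pearson(envelope, coherent))     # ~1: temporally coherent pair
print(pearson(envelope, independent))  # ~0: independently modulated streams
```

The hypothesis under test is that the coherent pairing, but not the independent one, helps the brain bind the auditory stream to the visual object.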

Atilgan et al., Neuron, 2018 https://pubmed.ncbi.nlm.nih.gov/29395914/
Maddox et al., eLife, 2015 10.7554/eLife.04995.
Alampounti et al., bioRxiv https://www.biorxiv.org/content/10.1101/2025.09.26.678636v1

Tinnitus
Surprisingly little is known about whether listeners with tinnitus show normal abilities to focus on one sound in a mixture. As part of a bigger research project that aims to explore the changes in neural circuits that tinnitus evokes, we would like to determine whether human listeners with tinnitus differ in their auditory selective attention abilities. This project will involve testing normal-hearing listeners and those with tinnitus on both speech- and non-speech-based auditory selective attention tasks, alongside taking measures of their tinnitus (via the tinnitus inventory) and hearing thresholds (via an audiogram). We hypothesise that tinnitus may predict how well a listener can engage auditory selective attention.

Roberts, L. E., Husain, F. T. & Eggermont, J. J., Role of attention in the generation and modulation of tinnitus, Neurosci Biobehav Rev 37, 1754–1773 (2013)
Sedley, W., "Does gain explain?", Neuroscience, 2019, 10.1016/j.neuroscience.2019.01.027

Brain networks supporting auditory scene analysis
This project uses a task outlined in Griffiths et al., 2024, in which ferrets are trained to discriminate continuous speech sounds and to report the presence of a target word ('instruments'). In this task, speech sounds on each trial originate from the same talker and are presented from a single speaker to the left or right of the ferret. Here, the task is adapted so that the discrimination is made in the presence of a noise stream originating from the opposite side of space to the target speech (counterbalanced across trials). Our goal is to understand how auditory cortex supports listening in noise, and to test the hypothesis that prefrontal cortex (PFC) plays a key role in attention-dependent shaping of activity in auditory cortex. To do this we have implanted Neuropixels 2.0 probes in auditory cortex (AC) and PFC and are recording neural activity during task performance. This project would be ideal for someone with coding experience who wants to explore functional connectivity measures across brain regions during listening in noise.
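For a flavour of where a functional-connectivity analysis might start, the sketch below computes a spike-train cross-correlogram, one of the simplest measures of interaction between units recorded simultaneously in two areas. The function and toy data are illustrative assumptions, not the lab's pipeline:

```python
def cross_correlogram(spikes_a, spikes_b, max_lag, bin_size):
    """Count spike pairs at each lag (tb - ta), in bins spanning
    [-max_lag, max_lag). spikes_a, spikes_b: spike times in seconds."""
    n_bins = int(round(2 * max_lag / bin_size))
    counts = [0] * n_bins
    for ta in spikes_a:
        for tb in spikes_b:
            lag = tb - ta
            if -max_lag <= lag < max_lag:
                counts[int((lag + max_lag) / bin_size)] += 1
    return counts

# Toy data: unit "b" fires 5 ms after every spike of unit "a"
spikes_a = [0.1 * i for i in range(1, 11)]
spikes_b = [t + 0.005 for t in spikes_a]
ccg = cross_correlogram(spikes_a, spikes_b, max_lag=0.02, bin_size=0.01)
# all mass lands in the positive-lag bin just after zero: "a" leads "b"
```

An asymmetric peak at short positive lags is consistent with (though not proof of) a directed interaction from the first area to the second; richer measures such as coherence or Granger causality build on the same raw ingredients.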

Griffiths et al., Plos Comp Bio, 2024 https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1011985

