Auditory neural circuits in the fly brain
Professor Azusa Kamikouchi
How does the brain process acoustic information? Mapping the auditory neural circuits is indispensable to answering this question. The fruit fly is ideally suited to this task: its brain is small, a rich repertoire of genetic tools is available, and flies use acoustic signals to communicate with one another. Toward a comprehensive identification of auditory neural circuits in the fly brain, this study systematically identified the auditory sensory neurons and their downstream partners. Anatomical and functional analyses revealed frequency segregation at the first layer of the auditory pathway and convergence of frequency information in the subsequent downstream pathways. Second-order auditory neurons show extensive binaural interactions, raising the possibility that the fly can compare acoustic signals detected at the left and right ears. Based on these analyses, this research established the first comprehensive map of primary and secondary auditory neurons in the fly brain, characterized by frequency segregation and convergence, binaural interaction, and multimodal pathways.
Neural mechanisms for dynamic acoustic communication in flies
Social interactions require continually adjusting behavior in response to sensory feedback. For example, when having a conversation, sensory cues from a partner (e.g., sounds or facial expressions) affect speech patterns in real time. Human speech signals, in turn, are the sensory cues that modify a partner’s actions. What are the underlying computations and neural mechanisms that govern these interactions? To address these questions, the lab studies the acoustic communication system of Drosophila. The fly nervous system is relatively simple, and a wealth of genetic tools exists to interrogate it. Importantly, Drosophila acoustic behaviors are highly quantifiable and robust. During courtship, males produce time-varying songs via wing vibration, while females arbitrate mating decisions. This study discovered that, rather than being a stereotyped fixed action sequence, male song structure and intensity are continually sculpted by interactions with the female over timescales ranging from tens of milliseconds to minutes, and this research is mapping the underlying circuits and computations. The research has also developed methods to relate song representations in the female brain to changes in her behavior, across multiple timescales. The focus on natural acoustic signals, either as the output of the male nervous system or as the input to the female nervous system, provides a powerful, quantitative handle for studying the basic building blocks of communication.
Reconciling perceptual and physiological measures of frequency selectivity in the mammalian auditory system
Dr Christian Sumner, University of Nottingham, UK
Neural codes for communication signals and sequences in the primate brain
Professor Christopher Petkov
Unlike songbirds, humans and a few other species, many animals are not thought to combine their vocalizations into structured sequences. Nonetheless, it remains possible that these animals can recognize ordering relationships in sequences generated by ‘artificial grammars’. This talk will explore how understanding the extent of these hidden receptive learning abilities could clarify the neurobiological origins of language. First, an overview of behavioral results on structured sequence learning in three primate species (marmosets, macaques and humans) is presented. Then the talk focuses on brain imaging results identifying evolutionarily conserved frontal brain regions in macaques and humans involved in predicting upcoming events in a sequence. Finally, results are presented from a new study involving comparative intracranial recordings in humans and monkeys processing these sequences. Overall, the findings indicate that human and non-human primates possess an evolutionarily conserved neural network for processing structured auditory input, and they provide hints on how the human brain differentiated for language.
Adaptive coding in the central auditory system
Professor Andrew King
If we are to understand how activity in the brain gives rise to auditory perception and guides behaviour, it is essential to consider the way in which neural processing is shaped both by the sensory and behavioural context in which sounds occur and by lifelong changes in experience that refine or degrade perceptual abilities as a result of learning or hearing loss. This talk will consider the neural circuits and strategies that enable the auditory system to adjust to the statistics of the auditory scene, as well as to longer-lasting changes in inputs that result from hearing impairments. In addition to providing insights into the adaptive capabilities of the auditory system, the findings indicate that different forms of plasticity may represent therapeutic targets for restoring perceptual abilities following hearing loss.
Plenary and final comments