Sounds in our everyday environment play a crucial role in guiding our perception and behavior. The ability to process these sounds effectively—to perceive them, contextualize their meaning, and harness this information to adapt our behavior—is intricate yet fundamental.
Consider, for instance, the familiar experience of recognizing a friend's voice in a bustling crowd, or the moment when a sudden car horn prompts a swift response to ensure safety. These everyday scenarios underscore the significance of auditory perception, which is known to rely on the auditory cortex.
The auditory cortex, a higher-level brain region in the auditory pathway, performs a multifaceted role in sound processing that extends beyond the mere analysis of acoustic features. It is essential for processing spectrotemporally rich, ethologically relevant sounds such as human speech and animal vocalizations, and it contributes to our ability to engage in sound-guided behavior and decision-making. However, how the auditory cortex integrates the many moving parts of our acoustic environment to support stable auditory perception across contexts and over time remains an enigma. This dissertation tackles this challenge by examining neural mechanisms in the auditory cortex at two distinct temporal scales, both under baseline conditions and in behavioral contexts, providing comprehensive insights into how the auditory cortex operates in real-world settings.
The first study in this dissertation addresses the long-term stability of auditory cortical sound representations, comparing the processing of complex sounds, such as animal vocalizations, with that of simple sounds, such as pure tones. By recording sound-evoked neural responses in the auditory cortex with two-photon calcium imaging, this study provides evidence that longitudinal sound representations in the auditory cortex differ according to the acoustic structure and salience of the auditory input. The second study investigates auditory cortical mechanisms in a behavioral context.
In this study, I adapted the classical appetitive trace conditioning paradigm to train mice to predict the time to reward using a sound cue. By combining electrophysiology with chemogenetic and pharmacological interventions, this study establishes the causal and functional role of the auditory cortex, and of its downstream connection to the posterior striatum, in sound-triggered interval timekeeping at a one-second temporal resolution. Collectively, these studies offer insights into how neural representations in the auditory cortex can simultaneously encode auditory information and relevant non-auditory information, such as timing, that shapes our subsequent actions and behaviors. This work not only makes an essential contribution to the literature on the auditory cortical mechanisms that aid auditory perception but also underscores the importance of recognizing the auditory cortex as a region with broader functions beyond primary auditory processing.