- Frontiers in Neural Circuits, 2021
Movement has a prominent impact on activity in sensory cortex, but has opposing effects on visual and auditory cortex. Both cortical areas feature a vasoactive intestinal peptide-expressing (VIP) disinhibitory circuit, which in visual cortex contributes to the effect of running. In auditory cortex, however, the role of VIP circuitry in running effects remains poorly understood. Running and optogenetic VIP activation are known to differentially modulate sound-evoked activity in auditory cortex, but it is unknown how these effects vary across cortical layers, and whether laminar differences in the roles of VIP circuitry could contribute to the substantial diversity that has been observed in the effects of both movement and VIP activation. Here we asked whether VIP neurons contribute to the effects of running across the layers of auditory cortex. We found that both running and optogenetic activation of VIP neurons produced diverse changes in the firing rates of auditory cortical neurons, but with distinct effects on spontaneous and evoked activity and with different patterns across cortical layers. On average, running increased spontaneous firing rates but decreased evoked firing rates, resulting in a reduction of the neuronal encoding of sound. This reduction in sound encoding was observed in all cortical layers, but was most pronounced in layer 2/3. In contrast, VIP activation increased both spontaneous and evoked firing rates, and had no net population-wide effect on sound encoding, but strongly suppressed sound encoding in layer 4 narrow-spiking neurons. These results suggested that VIP activation and running act independently, a hypothesis we tested by comparing the arithmetic sum of the two effects, measured separately, against the actual combined effect of running and VIP activation; the two closely matched. We conclude that the effects of locomotion in auditory cortex are not mediated by the VIP network.
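The additivity comparison described in this abstract can be illustrated with a small numerical sketch. This is not the authors' analysis; all data, effect sizes, and variable names below are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-neuron firing-rate changes (Hz) relative to baseline;
# every value and name here is illustrative, not data from the study.
delta_running = rng.normal(0.5, 1.0, size=200)   # effect of running alone
delta_vip = rng.normal(1.0, 1.0, size=200)       # effect of VIP activation alone
measurement_noise = rng.normal(0.0, 0.2, size=200)

# Simulated combined condition: the two effects add, plus measurement noise.
delta_both = delta_running + delta_vip + measurement_noise

# Arithmetic-sum prediction from the two effects measured separately.
predicted = delta_running + delta_vip

# If the effects are independent (additive), the prediction should track
# the measured combined effect closely across neurons.
r = np.corrcoef(predicted, delta_both)[0, 1]
print(f"correlation between predicted and measured combined effect: r = {r:.2f}")
```

A systematic deviation of the combined effect from the arithmetic-sum prediction would instead indicate an interaction between the two manipulations.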
Topics: Animals; Auditory Cortex; Interneurons; Locomotion; Mice; Neural Inhibition; Neurons; Optogenetics; Vasoactive Intestinal Peptide; Visual Cortex
PubMed: 33897378
DOI: 10.3389/fncir.2021.618881

- Cerebral Cortex (New York, N.Y.: 1991), May 2020
Auditory cortex (AC) is necessary for the detection of brief gaps in ongoing sounds, but not for the detection of longer gaps or other stimuli such as tones or noise. It remains unclear why this is so, and what is special about brief gaps in particular. Here, we used both optogenetic suppression and conventional lesions to show that the cortical dependence of brief gap detection hinges specifically on gap termination. We then identified a cortico-collicular gap detection circuit that amplifies cortical gap termination responses before projecting to inferior colliculus (IC) to impact behavior. We found that gaps evoked off-responses and on-responses in cortical neurons, which temporally overlapped for brief gaps, but not long gaps. This overlap specifically enhanced cortical responses to brief gaps, whereas IC neurons preferred longer gaps. Optogenetic suppression of AC reduced collicular responses specifically to brief gaps, indicating that under normal conditions, the enhanced cortical representation of brief gaps amplifies collicular gap responses. Together these mechanisms explain how and why AC contributes to the behavioral detection of brief gaps, which are critical cues for speech perception, perceptual grouping, and auditory scene analysis.
Topics: Acoustic Stimulation; Animals; Auditory Cortex; Auditory Pathways; Auditory Perception; Inferior Colliculi; Mice; Neural Pathways; Neurons; Optogenetics; Signal Detection, Psychological; Time Perception
PubMed: 32055848
DOI: 10.1093/cercor/bhz328

- Molecular Brain, Jun 2023
Neuronal tuning for spectral and temporal features has been studied extensively in the auditory system. In the auditory cortex, diverse combinations of spectral and temporal tuning have been found, but how specific feature tuning contributes to the perception of complex sounds remains unclear. Neurons in the avian auditory cortex are spatially organized in terms of spectral or temporal tuning widths, providing an opportunity for investigating the link between auditory tuning and perception. Here, using naturalistic conspecific vocalizations, we asked whether subregions of the auditory cortex that are tuned for broadband sounds are more important for discriminating tempo than pitch, due to their lower frequency selectivity. We found that bilateral inactivation of the broadband region impairs performance on both tempo and pitch discrimination. Our results do not support the hypothesis that the lateral, more broadband subregion of the songbird auditory cortex contributes more to processing temporal than spectral information.
Topics: Animals; Auditory Cortex; Songbirds; Auditory Perception; Pitch Discrimination; Acoustic Stimulation; Vocalization, Animal
PubMed: 37270583
DOI: 10.1186/s13041-023-01039-5

- Neuron, Mar 2021
Human brain pathways supporting language and declarative memory are thought to have differentiated substantially during evolution. However, cross-species comparisons are missing on site-specific effective connectivity between regions important for cognition. We harnessed functional imaging to visualize the effects of direct electrical brain stimulation in macaque monkeys and human neurosurgery patients. We discovered comparable effective connectivity between caudal auditory cortex and both ventro-lateral prefrontal cortex (VLPFC, including area 44) and parahippocampal cortex in both species. Human-specific differences were clearest in the form of stronger hemispheric lateralization effects. In humans, electrical tractography revealed remarkably rapid evoked potentials in VLPFC following auditory cortex stimulation and speech sounds drove VLPFC, consistent with prior evidence in monkeys of direct auditory cortex projections to homologous vocalization-responsive regions. The results identify a common effective connectivity signature in human and nonhuman primates, which from auditory cortex appears equally direct to VLPFC and indirect to the hippocampus.
Topics: Adolescent; Adult; Animals; Auditory Cortex; Brain Mapping; Electric Stimulation; Female; Frontal Lobe; Humans; Macaca mulatta; Magnetic Resonance Imaging; Male; Middle Aged; Neural Pathways; Parahippocampal Gyrus; Prefrontal Cortex; Species Specificity; Temporal Lobe; Young Adult
PubMed: 33482086
DOI: 10.1016/j.neuron.2020.12.026

- PLoS Biology, Mar 2020
Speech perception is mediated by both left and right auditory cortices but with differential sensitivity to specific acoustic information contained in the speech signal. A detailed description of this functional asymmetry is missing, and the underlying models are widely debated. We analyzed cortical responses from 96 epilepsy patients with electrode implantation in left or right primary, secondary, and/or association auditory cortex (AAC). We presented short acoustic transients to noninvasively estimate the dynamical properties of multiple functional regions along the auditory cortical hierarchy. We show remarkably similar bimodal spectral response profiles in left and right primary and secondary regions, with evoked activity composed of dynamics in the theta (around 4-8 Hz) and beta-gamma (around 15-40 Hz) ranges. Beyond these first cortical levels of auditory processing, a hemispheric asymmetry emerged, with delta and beta band (3/15 Hz) responsivity prevailing in the right hemisphere and theta and gamma band (6/40 Hz) activity prevailing in the left. This asymmetry is also present during syllable presentation, but the evoked responses in AAC are more heterogeneous, with the co-occurrence of alpha (around 10 Hz) and gamma (>25 Hz) activity bilaterally. These intracranial data provide a more fine-grained and nuanced characterization of cortical auditory processing in the two hemispheres, shedding light on the neural dynamics that potentially shape auditory and speech processing at different levels of the cortical hierarchy.
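The bimodal spectral response profile described above (a theta component plus a beta-gamma component) can be mimicked with a minimal spectral-peak sketch. The sampling rate, component frequencies, and amplitudes below are illustrative choices, not values from the study:

```python
import numpy as np

fs = 500.0                        # illustrative sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)       # 2 s window -> 0.5 Hz frequency resolution

# Synthetic evoked response with a slow theta (~6 Hz) component plus a
# faster beta-gamma (~30 Hz) component, mimicking a bimodal profile.
signal = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 30 * t)

# Amplitude spectrum; the two largest peaks should sit at the two components.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]]
print(f"dominant components: {min(peaks):.0f} Hz and {max(peaks):.0f} Hz")
```

Real intracranial analyses would of course use trial averaging and more robust spectral estimators, but the principle of reading component frequencies off the response spectrum is the same.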
Topics: Acoustic Stimulation; Auditory Cortex; Electrodes, Implanted; Electroencephalography; Epilepsy; Evoked Potentials, Auditory; Female; Functional Laterality; Humans; Male; Speech Perception
PubMed: 32119667
DOI: 10.1371/journal.pbio.3000207

- NeuroImage, Nov 2023
Loud acoustic noise from the scanner during functional magnetic resonance imaging (fMRI) can affect functional connectivity (FC) observed in the resting state, but the exact effect of the MRI acoustic noise on resting state FC is not well understood. Functional ultrasound (fUS) is a neuroimaging method that visualizes brain activity based on relative cerebral blood volume (rCBV), a similar neurovascular coupling response to that measured by fMRI, but without the audible acoustic noise. In this study, we investigated the effects of different acoustic noise levels (silent, 80 dB, and 110 dB) on FC by measuring resting state fUS (rsfUS) in awake mice in an environment similar to fMRI measurement. Then, we compared the results to those of resting state fMRI (rsfMRI) conducted using an 11.7 Tesla scanner. RsfUS experiments revealed a significant reduction in FC between the retrosplenial dysgranular and auditory cortices (0.56 ± 0.07 at silence vs 0.05 ± 0.05 at 110 dB, p=.01) and a significant increase in FC anticorrelation between the infralimbic and motor cortices (-0.21 ± 0.08 at silence vs -0.47 ± 0.04 at 110 dB, p=.017) as acoustic noise increased from silence to 80 dB and 110 dB, with increased consistency of FC patterns between rsfUS and rsfMRI being found with the louder noise conditions. Event-related auditory stimulation experiments using fUS showed strong positive rCBV changes (16.5% ± 2.9% at 110 dB) in the auditory cortex, and negative rCBV changes (-6.7% ± 0.8% at 110 dB) in the motor cortex, both being constituents of the brain network that was altered by the presence of acoustic noise in the resting state experiments. Anticorrelation between constituent brain regions of the default mode network (such as the infralimbic cortex) and those of task-positive sensorimotor networks (such as the motor cortex) is known to be an important feature of brain network antagonism, and has been studied as a biological marker of brain dysfunction and disease.
This study suggests that attention should be paid to the acoustic noise level when using rsfMRI to evaluate the anticorrelation between the default mode network and task-positive sensorimotor network.
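As a toy illustration of the anticorrelation measure discussed above, the sketch below computes functional connectivity as the Pearson correlation between two synthetic rCBV traces. The time series are fabricated and only demonstrate how a negative FC value arises; the region names are borrowed from the abstract purely as labels:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fabricated rCBV time series (arbitrary units, 600 samples) for two regions.
# The "motor" trace is constructed to anticorrelate with the "infralimbic"
# trace, mimicking default-mode vs task-positive network antagonism.
t = np.arange(600)
infralimbic = np.sin(2 * np.pi * t / 100) + 0.3 * rng.normal(size=t.size)
motor = -np.sin(2 * np.pi * t / 100) + 0.3 * rng.normal(size=t.size)

# Functional connectivity here is simply the Pearson correlation of the
# two traces; a strongly negative value indicates anticorrelation.
fc = np.corrcoef(infralimbic, motor)[0, 1]
print(f"FC (Pearson r): {fc:.2f}")
```

In the study's framing, a shift of this value toward more negative correlations under louder scanner noise would be exactly the kind of confound the authors caution against.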
Topics: Animals; Mice; Brain Mapping; Brain; Magnetic Resonance Imaging; Auditory Cortex; Noise
PubMed: 37734475
DOI: 10.1016/j.neuroimage.2023.120382

- Nature Communications, Jul 2021
The capacity of the brain to encode multiple types of sensory input is key to survival. Yet, how neurons integrate information from multiple sensory pathways and to what extent this influences behavior is largely unknown. Using two-photon Ca2+ imaging, optogenetics and electrophysiology in vivo and in vitro, we report the influence of auditory input on sensory encoding in the somatosensory cortex and show its impact on goal-directed behavior. Monosynaptic input from the auditory cortex enhanced dendritic and somatic encoding of tactile stimulation in layer 2/3 (L2/3), but not layer 5 (L5), pyramidal neurons in forepaw somatosensory cortex (S1). During a tactile-based goal-directed task, auditory input increased dendritic activity and reduced reaction time, which was abolished by photoinhibition of auditory cortex projections to forepaw S1. Taken together, these results indicate that dendrites of L2/3 pyramidal neurons encode multisensory information, leading to enhanced neuronal output and reduced response latency during goal-directed behavior.
Topics: Action Potentials; Animals; Auditory Cortex; Dendrites; Electric Stimulation; Electromyography; Goals; Mice, Inbred C57BL; Mice, Transgenic; Optogenetics; Patch-Clamp Techniques; Pyramidal Cells; Somatosensory Cortex; Touch; Mice
PubMed: 34301949
DOI: 10.1038/s41467-021-24754-w

- NeuroImage, Oct 2021
Neural oscillations are fundamental mechanisms of the human brain that enable coordinated activity of different brain regions during perceptual and cognitive processes. A frontotemporal network generated by means of gamma oscillations and comprising the auditory cortex (AC) and the anterior cingulate cortex (ACC) has been shown to be involved in cognitively demanding auditory information processing. This study aims to reveal patterns of functional and effective connectivity within this network in healthy subjects by means of simultaneously recorded electroencephalography (EEG) and functional magnetic resonance imaging (fMRI). We simultaneously recorded EEG and fMRI in 28 healthy subjects during the performance of a cognitively demanding auditory choice reaction task. Connectivity between the ACC and AC was analysed employing EEG and fMRI connectivity measures. We found a significant BOLD signal correlation between the ACC and AC, a significant task-dependent increase of fMRI connectivity (gPPI), and a significant increase in functional coupling in the gamma frequency range between these regions (LPS), which was stronger in the top-down direction (Granger analysis). EEG and fMRI connectivity measures were positively correlated. The results of this study point to a top-down influence of the ACC on the AC exerted by means of gamma synchronisation. The replication of fMRI connectivity patterns in simultaneously recorded EEG data and the correlation between connectivity measures from both domains show that brain connectivity based on the synchronisation of gamma oscillations is mirrored in fMRI connectivity patterns.
Topics: Adult; Auditory Cortex; Auditory Perception; Connectome; Electroencephalography; Electroencephalography Phase Synchronization; Female; Frontal Lobe; Gamma Rays; Gyrus Cinguli; Humans; Magnetic Resonance Imaging; Male; Nerve Net; Thalamus; Young Adult
PubMed: 34174389
DOI: 10.1016/j.neuroimage.2021.118307

- Hearing Research, Mar 2024 (Review)
Afferent inputs from the cochlea transmit auditory information to the central nervous system, where information is processed and passed up the hierarchy, ending in the auditory cortex. Through these brain pathways, spectral and temporal features of sounds are processed and sent to the cortex for perception. There are also many mechanisms in place for modulation of these inputs, with a major source of modulation being based in the medial prefrontal cortex (mPFC). Neurons of the rodent mPFC receive input from the auditory cortex and other regions such as thalamus, hippocampus and basal forebrain, allowing them to encode high-order information about sounds such as context, predictability and valence. The mPFC then exerts control over auditory perception via top-down modulation of the central auditory pathway, altering perception of and responses to sounds. The result is a higher-order control of auditory processing that produces such characteristics as deviance detection, attention, avoidance and fear conditioning. This review summarises connections between mPFC and the primary auditory pathway, responses of mPFC neurons to auditory stimuli, how mPFC outputs shape the perception of sounds, and how changes to these systems during hearing loss and tinnitus may contribute to these conditions.
Topics: Animals; Rodentia; Auditory Perception; Prefrontal Cortex; Auditory Cortex; Auditory Pathways
PubMed: 38271895
DOI: 10.1016/j.heares.2024.108954

- NeuroImage, Sep 2022
Which processes in the human brain lead to the categorical perception of speech sounds? Investigation of this question is hampered by the fact that categorical speech perception is normally confounded by acoustic differences in the stimulus. By using ambiguous sounds, however, it is possible to dissociate acoustic from perceptual stimulus representations. Twenty-seven normally hearing individuals took part in an fMRI study in which they were presented with an ambiguous syllable (intermediate between /da/ and /ga/) in one ear and with a disambiguating acoustic feature (third formant, F3) in the other ear. Multi-voxel pattern searchlight analysis was used to identify brain areas that consistently differentiated between response patterns associated with different syllable reports. By comparing responses to different stimuli with identical syllable reports and identical stimuli with different syllable reports, we disambiguated whether these regions primarily differentiated the acoustics of the stimuli or the syllable report. We found that BOLD activity patterns in left perisylvian regions (STG, SMG), left inferior frontal regions (vMC, IFG, AI), left supplementary motor cortex (SMA/pre-SMA), and right motor and somatosensory regions (M1/S1) represent listeners' syllable report irrespective of stimulus acoustics. Most of these regions are outside of what is traditionally regarded as auditory or phonological processing areas. Our results indicate that the process of speech sound categorization implicates decision-making mechanisms and auditory-motor transformations.
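The comparison logic in this abstract (patterns that track the syllable report rather than the acoustics) can be caricatured with a pattern-similarity sketch. This is not the authors' searchlight analysis; the voxel patterns, noise level, and templates below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Fabricated voxel patterns for an ambiguous stimulus. In this toy model the
# pattern is driven entirely by the listener's report, not the acoustics.
n_vox = 50
report_da = rng.normal(0, 1, n_vox)   # illustrative "da"-report template
report_ga = rng.normal(0, 1, n_vox)   # illustrative "ga"-report template

def trial(template):
    """Simulate one trial's voxel pattern: a report template plus noise."""
    return template + 0.5 * rng.normal(size=n_vox)

# Identical report (even across different stimuli) -> similar patterns;
# identical stimulus but different reports -> dissimilar patterns.
same_report = np.corrcoef(trial(report_da), trial(report_da))[0, 1]
diff_report = np.corrcoef(trial(report_da), trial(report_ga))[0, 1]
print(f"same-report similarity: {same_report:.2f}, "
      f"different-report similarity: {diff_report:.2f}")
```

A region whose pattern similarity behaves this way would be classified as representing the percept rather than the stimulus, which is the dissociation the searchlight analysis tests voxel-neighborhood by voxel-neighborhood.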
Topics: Acoustic Stimulation; Auditory Cortex; Auditory Perception; Hearing; Humans; Phonetics; Speech; Speech Perception
PubMed: 35700949
DOI: 10.1016/j.neuroimage.2022.119375