NeuroImage Jul 2023
The infant auditory system rapidly matures across the first years of life, with a primary goal of obtaining ever-more-accurate real-time representations of the external world. Our understanding of how left and right auditory cortex neural processes develop during infancy, however, is meager, with few studies having the statistical power to detect potential hemisphere and sex differences in primary/secondary auditory cortex maturation. Using infant magnetoencephalography (MEG) and a cross-sectional study design, left and right auditory cortex P2m responses to pure tones were examined in 114 typically developing infants and toddlers (66 males, 2 to 24 months). Non-linear maturation of P2m latency was observed, with P2m latencies decreasing rapidly as a function of age during the first year of life, followed by slower changes between 12 and 24 months. Whereas in younger infants auditory tones were encoded more slowly in the left than the right hemisphere, similar left and right P2m latencies were observed by ∼21 months of age due to a faster maturation rate in the left than the right hemisphere. No sex differences in the maturation of the P2m responses were observed. Finally, an earlier left than right hemisphere P2m latency predicted better language performance in older infants (12 to 24 months). Findings indicate the need to consider hemisphere when examining the maturation of auditory cortex neural activity in infants and toddlers and show that the pattern of left-right hemisphere P2m maturation is associated with language performance.
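The non-linear latency trajectory described above is the kind of pattern typically captured with a saturating-exponential fit: a rapid decrease early in life that levels off toward a floor. A minimal sketch of such a fit on synthetic data (the model form, parameter values, and noise level are illustrative assumptions, not the study's):

```python
import numpy as np
from scipy.optimize import curve_fit

def p2m_latency(age_months, a, tau, c):
    """Saturating-exponential model: latency falls quickly during the
    first year, then plateaus (parameters are hypothetical, not the paper's)."""
    return a * np.exp(-age_months / tau) + c

rng = np.random.default_rng(0)
ages = rng.uniform(2, 24, 114)              # 2-24 months, n = 114 as in the study
true_params = (150.0, 6.0, 250.0)           # hypothetical a (ms), tau (months), c (ms)
latencies = p2m_latency(ages, *true_params) + rng.normal(0, 5, ages.size)

popt, _ = curve_fit(p2m_latency, ages, latencies, p0=(100, 5, 200))
print(popt)  # fitted (a, tau, c), close to the hypothetical truth
```

The time constant `tau` then summarizes how quickly latency matures, and fitting each hemisphere separately would expose the left-right difference in maturation rate the abstract describes.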
Topics: Male; Humans; Infant; Aged; Auditory Cortex; Evoked Potentials, Auditory; Cross-Sectional Studies; Magnetoencephalography; Acoustic Stimulation
PubMed: 37178820
DOI: 10.1016/j.neuroimage.2023.120163
NeuroImage Aug 2020
Keeping time is fundamental for our everyday existence. Various isochronous activities, such as locomotion, require us to use internal timekeeping. This phenomenon comes into play also in other human pursuits such as dance and music. When listening to music, we spontaneously perceive and predict its beat. The process of beat perception comprises both beat inference and beat maintenance, their relative importance depending on the salience of the beat in the music. To study functional connectivity associated with these processes in a naturalistic situation, we used functional magnetic resonance imaging to measure brain responses of participants while they were listening to a piece of music containing strong contrasts in beat salience. Subsequently, we utilized dynamic graph analysis and psychophysiological interactions (PPI) analysis in connection with computational modelling of beat salience to investigate how functional connectivity manifests these processes. As the main effect, correlation analyses between the obtained dynamic graph measures and the beat salience measure revealed increased centrality in auditory-motor cortices, cerebellum, and extrastriate visual areas during low beat salience, whereas regions of the default mode and central executive networks displayed high centrality during high beat salience. PPI analyses revealed partial dissociation of functional networks belonging to this pathway, indicating complementary neural mechanisms crucial in beat inference and maintenance, processes pivotal for extracting and predicting temporal regularities in our environment.
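Dynamic graph analysis of the kind described above typically correlates regional time series within sliding windows, builds a graph per window, and tracks node centrality over time. A minimal numpy sketch under those assumptions (the window length, step, and correlation threshold are arbitrary placeholders, not the study's pipeline):

```python
import numpy as np

def sliding_degree_centrality(ts, win, step):
    """Correlate ROI time series in sliding windows, threshold to an
    adjacency matrix, and return each ROI's degree per window.
    ts: time x ROI array. Returns a windows x ROI array of degrees."""
    n_t, n_roi = ts.shape
    degrees = []
    for start in range(0, n_t - win + 1, step):
        window = ts[start:start + win]
        corr = np.corrcoef(window, rowvar=False)   # ROI x ROI correlation
        adj = (np.abs(corr) > 0.3).astype(int)     # arbitrary illustrative threshold
        np.fill_diagonal(adj, 0)                   # no self-connections
        degrees.append(adj.sum(axis=0))            # degree centrality per ROI
    return np.array(degrees)

rng = np.random.default_rng(1)
ts = rng.standard_normal((300, 10))               # e.g. 300 volumes, 10 ROIs
dc = sliding_degree_centrality(ts, win=40, step=20)
print(dc.shape)  # (14, 10): one degree value per window per ROI
```

Correlating each ROI's centrality time course against a beat-salience regressor would then mirror the main analysis the abstract reports.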
Topics: Acoustic Stimulation; Adult; Auditory Cortex; Auditory Perception; Cerebellum; Connectome; Female; Humans; Magnetic Resonance Imaging; Male; Motor Cortex; Music; Periodicity; Young Adult
PubMed: 31525500
DOI: 10.1016/j.neuroimage.2019.116191
Current Biology Oct 2021
Vocal communication signals can provide listeners with information about the signaler and elicit motivated responses. Auditory cortical and mesolimbic reward circuits are often considered to have distinct roles in these processes, with auditory cortical circuits responsible for detecting and discriminating sounds and mesolimbic circuits responsible for ascribing salience and modulating preference for those sounds. Here, we investigated whether dopamine within auditory cortical circuits themselves can shape the incentive salience of a vocal signal. Female zebra finches demonstrate natural preferences for vocal signals produced by males ("songs"), and we found that brief pairing of passive song playback with pharmacological dopamine manipulations in the secondary auditory cortex significantly altered song preferences. In particular, pairing passive song playback with retrodialysis of dopamine agonists into the auditory cortex enhanced preferences for less-preferred songs. Plasticity of song preferences by dopamine persisted for at least 1 week and was mediated by D1 receptors. In contrast, song preferences were not shaped by norepinephrine. In line with this, while we found that the ventral tegmental area, substantia nigra pars compacta, and locus coeruleus all project to the secondary auditory cortex, only dopamine-producing neurons in the ventral tegmental area differentially responded to preferred versus less-preferred songs. In contrast, norepinephrine neurons in the locus coeruleus increased expression of activity-dependent neural markers for both preferred and less-preferred songs. These data suggest that dopamine acting directly in sensory-processing areas can shape the incentive salience of communication signals.
Topics: Acoustic Stimulation; Animals; Auditory Cortex; Auditory Perception; Dopamine; Female; Finches; Learning; Male; Norepinephrine; Vocalization, Animal
PubMed: 34450091
DOI: 10.1016/j.cub.2021.08.005
eLife Feb 2022
Our brains constantly generate predictions of sensory input that are compared with actual inputs, propagate the prediction-errors through a hierarchy of brain regions, and subsequently update the internal predictions of the world. However, the essential feature of predictive coding, the notion of hierarchical depth and its neural mechanisms, remains largely unexplored. Here, we investigated the hierarchical depth of predictive auditory processing by combining functional magnetic resonance imaging (fMRI) and high-density whole-brain electrocorticography (ECoG) in marmoset monkeys during an auditory local-global paradigm in which the temporal regularities of the stimuli were designed at two hierarchical levels. The prediction-errors and prediction updates were examined as neural responses to auditory mismatches and omissions. Using fMRI, we identified a hierarchical gradient along the auditory pathway: midbrain and sensory regions represented local, shorter-time-scale predictive processing followed by associative auditory regions, whereas anterior temporal and prefrontal areas represented global, longer-time-scale sequence processing. The complementary ECoG recordings confirmed the activations at cortical surface areas and further differentiated the signals of prediction-error and update, which were transmitted via putative bottom-up γ and top-down β oscillations, respectively. Furthermore, omission responses caused by absence of input, reflecting solely the two levels of prediction signals that are unique to the hierarchical predictive coding framework, demonstrated the hierarchical top-down process of predictions in the auditory, temporal, and prefrontal areas. Thus, our findings support the hierarchical predictive coding framework, and outline how neural networks and spatiotemporal dynamics are used to represent and arrange a hierarchical structure of auditory sequences in the marmoset brain.
Topics: Animals; Auditory Cortex; Auditory Perception; Brain; Callithrix; Evoked Potentials, Auditory
PubMed: 35174784
DOI: 10.7554/eLife.74653
Cell Sep 2021
Speech perception is thought to rely on a cortical feedforward serial transformation of acoustic into linguistic representations. Using intracranial recordings across the entire human auditory cortex, electrocortical stimulation, and surgical ablation, we show that cortical processing across areas is not consistent with a serial hierarchical organization. Instead, response latency and receptive field analyses demonstrate parallel and distinct information processing in the primary and nonprimary auditory cortices. This functional dissociation was also observed with stimulation: stimulating the primary auditory cortex evoked auditory hallucinations but did not distort or interfere with speech perception, whereas stimulation of nonprimary cortex in the superior temporal gyrus produced the opposite effects. Ablation of the primary auditory cortex did not affect speech perception. These results establish a distributed functional organization of parallel information processing throughout the human auditory cortex and demonstrate an essential, independent role for nonprimary auditory cortex in speech processing.
Topics: Audiometry, Pure-Tone; Auditory Cortex; Electrodes; Electronic Data Processing; Humans; Phonetics; Pitch Perception; Reaction Time; Speech; Temporal Lobe
PubMed: 34411517
DOI: 10.1016/j.cell.2021.07.019
Journal of Neurophysiology Jun 2022
Sounds enhance our ability to detect, localize, and respond to co-occurring visual targets. Research suggests that sounds improve visual processing by resetting the phase of ongoing oscillations in visual cortex. However, it remains unclear what information is relayed from the auditory system to visual areas and if sounds modulate visual activity even in the absence of visual stimuli (e.g., during passive listening). Using intracranial electroencephalography (iEEG) in humans, we examined the sensitivity of visual cortex to three forms of auditory information during a passive listening task: auditory onset responses, auditory offset responses, and rhythmic entrainment to sounds. Because some auditory neurons respond to both sound onsets and offsets, visual timing and duration processing may benefit from each. In addition, if auditory entrainment information is relayed to visual cortex, it could support the processing of complex stimulus dynamics that are aligned between auditory and visual stimuli. Results demonstrate that in visual cortex, amplitude-modulated sounds elicited transient onset and offset responses in multiple areas, but no entrainment to sound modulation frequencies. These findings suggest that activity in visual cortex (as measured with iEEG in response to auditory stimuli) may not be affected by temporally fine-grained auditory stimulus dynamics during passive listening (though it remains possible that this signal may be observable with simultaneous auditory-visual stimuli). Moreover, auditory responses were maximal in low-level visual cortex, potentially implicating a direct pathway for rapid interactions between auditory and visual cortices. This mechanism may facilitate perception by time-locking visual computations to environmental events marked by auditory discontinuities. 
Using intracranial electroencephalography (iEEG) in humans during a passive listening task, we demonstrate that sounds modulate activity in visual cortex at both the onset and offset of sounds, which likely supports visual timing and duration processing. However, more complex auditory rate information did not affect visual activity. These findings are based on one of the largest multisensory iEEG studies to date and reveal the type of information transmitted between auditory and visual regions.
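Rhythmic entrainment of the kind tested above is commonly quantified with inter-trial phase coherence (ITPC): unit-length phase vectors at the frequency of interest are averaged across trials, yielding a value near 1 when responses are phase-locked to the stimulus and near 0 otherwise. A small illustrative sketch, not the study's exact pipeline:

```python
import numpy as np

def itpc(trials, fs, freq):
    """Inter-trial phase coherence at a single frequency.
    trials: n_trials x n_samples array of single-trial responses."""
    n = trials.shape[1]
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    k = np.argmin(np.abs(freqs - freq))           # FFT bin nearest the target
    spectra = np.fft.rfft(trials, axis=1)[:, k]
    phase_vectors = spectra / np.abs(spectra)     # unit-length phase vectors
    return np.abs(phase_vectors.mean())           # 1 = perfect phase locking

fs, f_mod = 1000, 8                               # e.g. 8 Hz amplitude modulation
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(2)
# Trials sharing the same 8 Hz phase, plus independent noise per trial
locked = np.sin(2 * np.pi * f_mod * t) + rng.standard_normal((50, t.size))
coh = itpc(locked, fs, f_mod)
print(coh)  # near 1: trials are phase-locked at the modulation frequency
```

Under this measure, the abstract's finding corresponds to ITPC at the sound's modulation frequency staying near chance in visual electrodes despite robust transient onset/offset responses.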
Topics: Acoustic Stimulation; Auditory Cortex; Auditory Perception; Humans; Sound; Visual Cortex; Visual Perception
PubMed: 35507478
DOI: 10.1152/jn.00164.2021
Brain Stimulation 2022
BACKGROUND
The exact architecture of the human auditory cortex remains a subject of debate, with discrepancies between functional and microstructural studies. In a hierarchical framework for sensory perception, simple sound perception is expected to take place in the primary auditory cortex, while the processing of complex, or more integrated perceptions is proposed to rely on associative and higher-order cortices.
OBJECTIVES
We hypothesize that auditory symptoms induced by direct electrical stimulation (DES) offer a window into the architecture of the brain networks involved in auditory hallucinations and illusions. Intracranial recordings of these evoked perceptions, which span varying levels of integration, provide the evidence against which to evaluate the theoretical model.
METHODS
We analyzed SEEG recordings from 50 epileptic patients presenting auditory symptoms induced by DES. First, using the Juelich cytoarchitectonic parcellation, we quantified which regions induced auditory symptoms when stimulated (ROI approach). Then, for each evoked auditory symptom type (illusion or hallucination), we mapped the cortical networks showing concurrent high-frequency activity modulation (HFA approach).
RESULTS
Although on average, illusions were found more laterally and hallucinations more posteromedially in the temporal lobe, both perceptions were elicited in all levels of the sensory hierarchy, with mixed responses found in the overlap. The spatial range was larger for illusions, both in the ROI and HFA approaches. The limbic system was specific to the hallucinations network, and the inferior parietal lobule was specific to the illusions network.
DISCUSSION
Our results confirm a network-based organization underlying conscious sound perception, for both simple and complex components. While symptom localization is interesting from an epilepsy semiology perspective, the hallucination-specific modulation of the limbic system is particularly relevant to tinnitus and schizophrenia.
Topics: Acoustic Stimulation; Auditory Cortex; Brain Mapping; Electric Stimulation; Electroencephalography; Epilepsy; Hallucinations; Humans; Illusions
PubMed: 35952963
DOI: 10.1016/j.brs.2022.08.002
Neuron Dec 2019
Driving perception by direct activation of neural ensembles in cortex is a necessary step for achieving a causal understanding of the neural code for auditory perception and developing central sensory rehabilitation methods. Here, using optogenetic manipulations during an auditory discrimination task in mice, we show that auditory cortex can be short-circuited by coarser pathways for simple sound identification. Yet when the sensory decision becomes more complex, involving temporal integration of information, auditory cortex activity is required for sound discrimination and targeted activation of specific cortical ensembles changes perceptual decisions, as predicted by our readout of the cortical code. Hence, auditory cortex representations contribute to sound discriminations by refining decisions from parallel routes.
Topics: Animals; Auditory Cortex; Auditory Pathways; Auditory Perception; Female; Mice
PubMed: 31727548
DOI: 10.1016/j.neuron.2019.09.043
NeuroImage Apr 2023
Despite its prominence in learning and memory, hippocampal influence on early auditory processing centers remains unknown. Here, we examined how hippocampal activity modulates sound-evoked responses in the auditory midbrain and thalamus using optogenetics and functional MRI (fMRI) in rodents. Stimulation of ventral hippocampus (vHP) excitatory neurons at 5 Hz evoked robust hippocampal activity that propagated to the primary auditory cortex. We then paired 5 Hz vHP stimulation with either natural vocalizations or artificial/noise acoustic stimuli. vHP stimulation enhanced auditory responses to vocalizations (with a negative or positive valence) in the inferior colliculus, medial geniculate body, and auditory cortex, but not to their temporally reversed counterparts (artificial sounds) or broadband noise. Meanwhile, pharmacological vHP inactivation diminished response selectivity to vocalizations. These results directly reveal large-scale hippocampal participation in natural sound processing at early centers of the ascending auditory pathway and expand our present understanding of the hippocampus in global auditory networks.
Topics: Inferior Colliculi; Auditory Pathways; Auditory Cortex; Acoustic Stimulation; Auditory Perception; Geniculate Bodies; Hippocampus
PubMed: 36828157
DOI: 10.1016/j.neuroimage.2023.119943
Proceedings of the National Academy of... Jul 2021
The perception of sensory events can be enhanced or suppressed by the surrounding spatial and temporal context in ways that facilitate the detection of novel objects and contribute to the perceptual constancy of those objects under variable conditions. In the auditory system, the phenomenon known as auditory enhancement reflects a general principle of contrast enhancement, in which a target sound embedded within a background sound becomes perceptually more salient if the background is presented first by itself. This effect is highly robust, producing an effective enhancement of the target of up to 25 dB (more than two orders of magnitude in intensity), depending on the task. Despite the importance of the effect, neural correlates of auditory contrast enhancement have yet to be identified in humans. Here, we used the auditory steady-state response to probe the neural representation of a target sound under conditions of enhancement. The probe was simultaneously modulated in amplitude with two modulation frequencies to distinguish cortical from subcortical responses. We found robust correlates for neural enhancement in the auditory cortical, but not subcortical, responses. Our findings provide empirical support for a previously unverified theory of auditory enhancement based on neural adaptation of inhibition and point to approaches for improving sensory prostheses for hearing loss, such as hearing aids and cochlear implants.
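The parenthetical "more than two orders of magnitude in intensity" follows directly from the decibel definition, dB = 10·log10(I/I0): a 25 dB enhancement corresponds to an intensity ratio of 10^2.5 ≈ 316. A one-line check:

```python
def db_to_intensity_ratio(db):
    """Convert a level difference in decibels to an intensity ratio,
    using the standard definition dB = 10 * log10(I / I0)."""
    return 10 ** (db / 10)

ratio = db_to_intensity_ratio(25)
print(round(ratio, 1))  # 316.2, i.e. more than two orders of magnitude
```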
Topics: Acoustic Stimulation; Adolescent; Adult; Auditory Cortex; Auditory Perception; Auditory Threshold; Behavior; Electroencephalography; Female; Hearing; Humans; Male; Sound; Young Adult
PubMed: 34266949
DOI: 10.1073/pnas.2024794118