NeuroImage, Sep 2022
Which processes in the human brain lead to the categorical perception of speech sounds? Investigation of this question is hampered by the fact that categorical speech perception is normally confounded by acoustic differences in the stimulus. By using ambiguous sounds, however, it is possible to dissociate acoustic from perceptual stimulus representations. Twenty-seven normally hearing individuals took part in an fMRI study in which they were presented with an ambiguous syllable (intermediate between /da/ and /ga/) in one ear and with a disambiguating acoustic feature (third formant, F3) in the other ear. Multi-voxel pattern searchlight analysis was used to identify brain areas that consistently differentiated between response patterns associated with different syllable reports. By comparing responses to different stimuli with identical syllable reports and identical stimuli with different syllable reports, we disambiguated whether these regions primarily differentiated the acoustics of the stimuli or the syllable report. We found that BOLD activity patterns in left perisylvian regions (STG, SMG), left inferior frontal regions (vMC, IFG, AI), left supplementary motor cortex (SMA/pre-SMA), and right motor and somatosensory regions (M1/S1) represent listeners' syllable report irrespective of stimulus acoustics. Most of these regions are outside of what is traditionally regarded as auditory or phonological processing areas. Our results indicate that the process of speech sound categorization implicates decision-making mechanisms and auditory-motor transformations.
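A minimal sketch (not the authors' pipeline) of the cross-decoding logic behind this design: a searchlight region "represents the syllable report irrespective of stimulus acoustics" if a classifier trained to separate /da/ from /ga/ reports under one acoustic cue generalizes to the other cue. All data below are simulated, and names such as `n_voxels` are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 50   # trials per condition, voxels in one searchlight sphere

def simulate_patterns(report_axis, acoustic_axis, report, acoustic):
    """Voxel pattern = report signal + acoustic signal + noise."""
    return (report * report_axis + acoustic * acoustic_axis
            + rng.normal(0, 1.0, size=(n_trials, n_voxels)))

report_axis = rng.normal(0, 1, n_voxels)    # direction coding the percept
acoustic_axis = rng.normal(0, 1, n_voxels)  # direction coding the F3 cue

# Four conditions: syllable report (/da/=+1, /ga/=-1) x acoustic cue (+1/-1).
# Train on one acoustic cue, test on the other.
X_train = np.vstack([simulate_patterns(report_axis, acoustic_axis, r, +1) for r in (+1, -1)])
y_train = np.repeat([+1, -1], n_trials)
X_test = np.vstack([simulate_patterns(report_axis, acoustic_axis, r, -1) for r in (+1, -1)])
y_test = np.repeat([+1, -1], n_trials)

clf = LinearSVC(C=1.0).fit(X_train, y_train)
print(f"cross-acoustics report decoding accuracy: {clf.score(X_test, y_test):.2f}")
```

Above-chance generalization across the acoustic cue is what licenses the claim that a region codes the percept rather than the stimulus.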
Topics: Acoustic Stimulation; Auditory Cortex; Auditory Perception; Hearing; Humans; Phonetics; Speech; Speech Perception
PubMed: 35700949
DOI: 10.1016/j.neuroimage.2022.119375
Current Biology, Dec 2020
In many behavioral tasks, cortex enters a desynchronized state where low-frequency fluctuations in population activity are suppressed. The precise behavioral correlates of desynchronization and its global organization are unclear. One hypothesis holds that desynchronization enhances stimulus coding in the relevant sensory cortex. Another hypothesis holds that desynchronization reflects global arousal, such as task engagement. Here, we trained mice on tasks where task engagement could be distinguished from sensory accuracy. Using widefield calcium imaging, we found that performance-related desynchronization was global and correlated better with engagement than with accuracy. Consistent with this link between desynchronization and engagement, rewards had a long-lasting desynchronizing effect. To determine whether engagement-related state changes depended on the relevant sensory modality, we trained mice on visual and auditory tasks and found that in both cases desynchronization was global, including regions such as somatomotor cortex. We conclude that variations in low-frequency fluctuations are predominantly global and related to task engagement.
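A rough sketch of one common desynchronization index (an assumption on my part, not necessarily the paper's exact metric): the fraction of spectral power below a few Hz in a population-activity trace. Lower values indicate a more desynchronized state.

```python
import numpy as np
from scipy.signal import welch

def desync_index(trace, fs, low_cut=4.0):
    """Return the low-frequency (< low_cut Hz) power fraction of a 1-D signal."""
    freqs, psd = welch(trace, fs=fs, nperseg=int(4 * fs))  # ~4 s windows
    return psd[freqs < low_cut].sum() / psd.sum()

fs = 30.0                                  # widefield imaging frame rate (illustrative)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
synchronized = np.sin(2 * np.pi * 2 * t) + 0.5 * rng.normal(size=t.size)
desynchronized = rng.normal(size=t.size)

print(desync_index(synchronized, fs))      # high low-frequency power fraction
print(desync_index(desynchronized, fs))    # low fraction: desynchronized state
```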
Topics: Acoustic Stimulation; Animals; Arousal; Auditory Cortex; Cortical Synchronization; Decision Making; Electroencephalography; Female; Male; Mice; Neurons; Optical Imaging; Photic Stimulation; Reward; Stereotaxic Techniques; Visual Cortex
PubMed: 33096037
DOI: 10.1016/j.cub.2020.09.067
The Journal of Neuroscience, Jun 2022
Frequency-Dependent Plasticity in the Temporal Association Cortex Originates from the Primary Auditory Cortex, and Is Modified by the Secondary Auditory Cortex and the Medial Geniculate Body.
The brain areas that mediate the formation of auditory threat memory and perceptual decisions remain uncertain to date. Candidates include the primary (A1) and secondary (A2) auditory cortex, the medial division of the medial geniculate body (MGm), amygdala, and the temporal association cortex. We used chemogenetic and optogenetic manipulations and patch-clamp recordings to assess the roles of these brain regions in threat memory learning in female mice. We found that conditioned sound (CS) frequency-dependent plasticity resulted in the formation of auditory threat memory in the temporal association cortex. This neural correlate of auditory threat memory depended on CS frequency information from A1 glutamatergic subthreshold monosynaptic inputs, CS lateral inhibition from A2 glutamatergic disynaptic inputs, and non-frequency-specific facilitation from MGm glutamatergic monosynaptic inputs. These results indicate that the A2 and MGm work together in an inhibitory-facilitative role. The ability to recognize specific sounds to avoid predators or seek prey is a useful survival tool. Improving this ability through experiential learning is an added advantage requiring neural plasticity. As an example, humans must learn to distinguish the sound of a car horn, and thus avoid oncoming traffic. Our research discovered that the temporal association cortex can encode this kind of auditory information through tonal receptive field plasticity. In addition, the results revealed the underlying synaptic mechanisms of this process. These results extended our understanding of how meaningful auditory information is processed in an animal's brain.
Topics: Acoustic Stimulation; Amygdala; Animals; Auditory Cortex; Conditioning, Classical; Female; Geniculate Bodies; Mice; Neuronal Plasticity
PubMed: 35613891
DOI: 10.1523/JNEUROSCI.1481-21.2022
PLoS One, 2022
The experience of auditory verbal hallucinations (AVH, "hearing voices") in schizophrenia has been found to be associated with reduced auditory cortex activation during perception of real auditory stimuli like tones and speech. We re-examined this finding using 46 patients with schizophrenia (23 with frequent AVH and 23 hallucination-free), who underwent fMRI scanning while they heard words, sentences and reversed speech. Twenty-five matched healthy controls were also examined. Perception of words, sentences and reversed speech all elicited activation of the bilateral superior temporal cortex, the inferior and lateral prefrontal cortex, the inferior parietal cortex and the supplementary motor area in the patients and the healthy controls. During the sentence and reversed speech conditions, the schizophrenia patients as a group showed reduced activation in the left primary auditory cortex (Heschl's gyrus) relative to the healthy controls. No differences were found between the patients with and without hallucinations in any condition. This study therefore fails to support previous findings that experience of AVH attenuates speech-perception-related brain activations in the auditory cortex. At the same time, it suggests that schizophrenia patients, regardless of presence of AVH, show reduced activation in the primary auditory cortex during speech perception, a finding which could reflect an early information processing deficit in the disorder.
Topics: Humans; Schizophrenia; Speech Perception; Hallucinations; Brain; Temporal Lobe; Magnetic Resonance Imaging; Auditory Cortex; Auditory Perception
PubMed: 36525414
DOI: 10.1371/journal.pone.0276975
Human Brain Mapping, Dec 2021
When listening to music, pitch deviations are more salient and elicit stronger prediction error responses when the melodic context is predictable and when the listener is a musician. Yet, the neuronal dynamics and changes in connectivity underlying such effects remain unclear. Here, we employed dynamic causal modeling (DCM) to investigate whether the magnetic mismatch negativity response (MMNm) and its modulation by context predictability and musical expertise are associated with enhanced neural gain of auditory areas, as a plausible mechanism for encoding precision-weighted prediction errors. Using Bayesian model comparison, we asked whether models with intrinsic connections within primary auditory cortex (A1) and superior temporal gyrus (STG), typically related to gain control, or extrinsic connections between A1 and STG, typically related to propagation of prediction and error signals, better explained magnetoencephalography responses. We found that, compared to regular sounds, out-of-tune pitch deviations were associated with lower intrinsic (inhibitory) connectivity in A1 and STG, and lower backward (inhibitory) connectivity from STG to A1, consistent with disinhibition and enhanced neural gain in these auditory areas. More predictable melodies were associated with disinhibition in right A1, while musicianship was associated with disinhibition in left A1 and reduced connectivity from STG to left A1. These results indicate that musicianship and melodic predictability, as well as pitch deviations themselves, enhance neural gain in auditory cortex during deviance detection. Our findings are consistent with predictive processing theories suggesting that precise and informative error signals are selected by the brain for subsequent hierarchical processing.
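A minimal sketch of the fixed-effects Bayesian model comparison step used in DCM studies: convert (free-energy approximations to) log model evidences into posterior model probabilities under equal model priors. The numbers below are made up for illustration.

```python
import numpy as np

def posterior_model_probs(log_evidences):
    """Softmax over log model evidences, assuming equal prior model probabilities."""
    le = np.asarray(log_evidences, dtype=float)
    le -= le.max()          # subtract the max to stabilize the exponentials
    p = np.exp(le)
    return p / p.sum()

# Hypothetical free energies (summed over participants) for three models:
# 1) intrinsic (gain) modulation, 2) extrinsic modulation, 3) both.
log_F = [-1204.3, -1210.9, -1201.1]
print(posterior_model_probs(log_F))   # probability mass favors model 3 here
```

A difference of a few units in log evidence already translates into a strong posterior preference, which is why DCM papers report such comparisons rather than raw fits.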
Topics: Adult; Auditory Cortex; Bayes Theorem; Female; Functional Neuroimaging; Humans; Magnetoencephalography; Male; Models, Theoretical; Music; Pitch Perception; Young Adult
PubMed: 34459062
DOI: 10.1002/hbm.25638
Cerebral Cortex, Feb 2023
Voice signaling is integral to human communication, and a cortical voice area seemed to support the discrimination of voices from other auditory objects. This large cortical voice area in the auditory cortex (AC) was suggested to process voices selectively, but its functional differentiation remained elusive. We used neuroimaging while humans processed voices and nonvoice sounds, and artificial sounds that mimicked certain voice sound features. First and surprisingly, specific auditory cortical voice processing beyond basic acoustic sound analyses is only supported by a very small portion of the originally described voice area in higher-order AC located centrally in superior Te3. Second, besides this core voice processing area, large parts of the remaining voice area in low- and higher-order AC only accessorily process voices and might primarily pick up nonspecific psychoacoustic differences between voices and nonvoices. Third, a specific subfield of low-order AC seems to specifically decode acoustic sound features that are relevant but not exclusive for voice detection. Taken together, the previously defined voice area might have been overestimated since cortical support for human voice processing seems rather restricted. Cortical voice processing also seems to be functionally more diverse and embedded in broader functional principles of the human auditory system.
Topics: Humans; Auditory Cortex; Acoustic Stimulation; Auditory Perception; Voice; Sound; Magnetic Resonance Imaging
PubMed: 35348635
DOI: 10.1093/cercor/bhac128
NeuroImage, Jun 2023
When sensory input conveys rhythmic regularity, we can form predictions about the timing of upcoming events. Although rhythm processing capacities differ considerably between individuals, these differences are often obscured by participant- and trial-level data averaging procedures in M/EEG research. Here, we systematically assessed neurophysiological variability displayed by individuals listening to isochronous (1.54 Hz) equitone sequences interspersed with unexpected (amplitude-attenuated) deviant tones. Our approach aimed at revealing time-varying adaptive neural mechanisms for sampling the acoustic environment at multiple timescales. Rhythm tracking analyses confirmed that individuals encode temporal regularities and form temporal expectations, as indicated in delta-band (1.54 Hz) power and its anticipatory phase alignment to expected tone onsets. Zooming into tone- and participant-level data, we further characterized intra- and inter-individual variabilities in phase-alignment across auditory sequences. Further, individual modeling of beta-band tone-locked responses showed that a subset of auditory sequences was sampled rhythmically by superimposing binary (strong-weak; S-w), ternary (S-w-w) and mixed accentuation patterns. In these sequences, neural responses to standard and deviant tones were modulated by a binary accentuation pattern, thus pointing towards a mechanism of dynamic attending. Altogether, the current results point toward complementary roles of delta- and beta-band activity in rhythm processing and further highlight diverse and adaptive mechanisms to track and sample the acoustic environment at multiple timescales, even in the absence of task-specific instructions.
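A sketch (simplified relative to the paper's analysis) of quantifying anticipatory delta-band phase alignment: band-pass the signal around the stimulation rate (1.54 Hz), take the analytic phase at the expected tone onsets, and compute inter-trial phase coherence (ITC). Data and parameters below are simulated/illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                                 # sampling rate (illustrative)
f_stim = 1.54                              # tone presentation rate
b, a = butter(2, [f_stim - 0.5, f_stim + 0.5], btype="bandpass", fs=fs)

rng = np.random.default_rng(2)
n_trials, n_samples = 40, int(20 * fs)     # forty 20-s sequences
onsets = (np.arange(1, 20) / f_stim * fs).astype(int)   # expected tone onsets

phases = []
for _ in range(n_trials):
    t = np.arange(n_samples) / fs
    eeg = np.sin(2 * np.pi * f_stim * t) + 2.0 * rng.normal(size=n_samples)
    delta = filtfilt(b, a, eeg)            # delta-band component
    phases.append(np.angle(hilbert(delta))[onsets])

phases = np.array(phases)                  # trials x onsets
itc = np.abs(np.exp(1j * phases).mean(axis=0)).mean()
print(f"delta-band ITC at expected onsets: {itc:.2f}")   # near 1 = strong alignment
```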
Topics: Humans; Electroencephalography; Acoustic Stimulation; Auditory Cortex; Auditory Perception; Acoustics
PubMed: 37028735
DOI: 10.1016/j.neuroimage.2023.120090
The Journal of Neuroscience, Oct 2023
Comparing expectation with experience is an important neural computation performed throughout the brain and is a hallmark of predictive processing. Experiments that alter the sensory outcome of an animal's behavior reveal enhanced neural responses to unexpected self-generated stimuli, indicating that populations of neurons in sensory cortex may reflect prediction errors (PEs), mismatches between expectation and experience. However, enhanced neural responses to self-generated stimuli could also arise through nonpredictive mechanisms, such as the movement-based facilitation of a neuron's inherent sound responses. If sensory prediction error neurons exist in sensory cortex, it is unknown whether they manifest as general error responses, or respond with specificity to errors in distinct stimulus dimensions. To answer these questions, we trained mice of either sex to expect the outcome of a simple sound-generating behavior and recorded auditory cortex activity as mice heard either the expected sound or sounds that deviated from expectation in one of multiple distinct dimensions. Our data reveal that the auditory cortex learns to suppress responses to self-generated sounds along multiple acoustic dimensions simultaneously. We identify a distinct population of auditory cortex neurons that are not responsive to passive sounds or to the expected sound but that encode prediction errors. These prediction error neurons are abundant only in animals with a learned motor-sensory expectation, and encode one or two specific violations rather than a generic error signal. Together, these findings reveal that cortical predictions about self-generated sounds have specificity in multiple simultaneous dimensions and that cortical prediction error neurons encode specific violations from expectation. Audette et al. record neural activity in the auditory cortex while mice perform a sound-generating forelimb movement and measure neural responses to sounds that violate an animal's expectation in different ways. They find that predictions about self-generated sounds are highly specific across multiple stimulus dimensions and that a population of typically nonsound-responsive neurons respond to sounds that violate an animal's expectation in a specific way. These results identify specific prediction error (PE) signals in the mouse auditory cortex and suggest that errors may be calculated early in sensory processing.
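A toy sketch of the selection logic described above (the criteria and thresholds are my own illustrative assumptions, not the authors' exact definition): a "prediction error neuron" responds to deviant self-generated sounds but neither to passive sounds nor to the expected self-generated sound.

```python
import numpy as np

def is_pe_neuron(passive, expected, deviant, baseline, z_thresh=2.0):
    """Each argument: array of trial-averaged firing-rate changes (Hz)."""
    def responsive(resp):
        z = (resp.mean() - baseline.mean()) / (baseline.std() + 1e-9)
        return z > z_thresh
    # PE neuron: silent to passive and expected sounds, driven by deviants
    return (not responsive(passive)) and (not responsive(expected)) \
        and responsive(deviant)

rng = np.random.default_rng(3)
baseline = rng.normal(0.0, 1.0, 100)
print(is_pe_neuron(passive=rng.normal(0.2, 1.0, 50),
                   expected=rng.normal(0.1, 1.0, 50),
                   deviant=rng.normal(4.0, 1.0, 50),
                   baseline=baseline))      # True for this simulated cell
```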
Topics: Animals; Mice; Auditory Cortex; Auditory Perception; Acoustic Stimulation; Sensory Receptor Cells; Sound
PubMed: 37699716
DOI: 10.1523/JNEUROSCI.0512-23.2023
The Journal of Neuroscience, Apr 2023
In everyday life, we integrate visual and auditory information in routine tasks such as navigation and communication. While concurrent sound can improve visual perception, the neuronal correlates of audiovisual integration are not fully understood. Specifically, it remains unclear whether neuronal firing patterns in the primary visual cortex (V1) of awake animals demonstrate similar sound-induced improvement in visual discriminability. Furthermore, presentation of sound is associated with movement in the subjects, but little is understood about whether and how sound-associated movement affects audiovisual integration in V1. Here, we investigated how sound and movement interact to modulate V1 visual responses in awake, head-fixed mice and whether this interaction improves neuronal encoding of the visual stimulus. We presented visual drifting gratings with and without simultaneous auditory white noise to awake mice while recording mouse movement and V1 neuronal activity. Sound modulated activity of 80% of light-responsive neurons, with 95% of neurons increasing activity when the auditory stimulus was present. A generalized linear model (GLM) revealed that sound and movement had distinct and complementary effects on the neuronal visual responses. Furthermore, decoding of the visual stimulus from the neuronal activity was improved with sound, an effect that persisted even when controlling for movement. These results demonstrate that sound and movement modulate visual responses in complementary ways, improving neuronal representation of the visual stimulus. This study clarifies the role of movement as a potential confound in neuronal audiovisual responses and expands our knowledge of how multimodal processing is mediated at a neuronal level in the awake brain. Sound and movement are both known to modulate visual responses in the primary visual cortex; however, sound-induced movement has largely remained unaccounted for as a potential confound in audiovisual studies in awake animals. Here, the authors found that sound and movement both modulate visual responses in an important visual brain area, the primary visual cortex, in distinct, yet complementary ways. Furthermore, sound improved encoding of the visual stimulus even when accounting for movement. This study reconciles contrasting theories on the mechanism underlying audiovisual integration and asserts the primary visual cortex as a key brain region participating in tripartite sensory interactions.
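A minimal sketch of the GLM logic (simulated data; regressor names are illustrative, not the paper's): model single-neuron spike counts as a Poisson function of the visual stimulus, concurrent sound, and running speed, so that sound and movement contributions are estimated jointly rather than confounded.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n_trials = 400
visual = rng.integers(0, 2, n_trials)   # grating present (0/1)
sound = rng.integers(0, 2, n_trials)    # white noise present (0/1)
# Running speed is partly driven by sound, mimicking the confound
speed = rng.exponential(2.0, n_trials) * sound * 0.5 + rng.exponential(1.0, n_trials)

# Ground-truth rates with distinct sound and movement contributions
log_rate = 0.5 + 0.8 * visual + 0.4 * sound + 0.15 * speed
counts = rng.poisson(np.exp(log_rate))

X = sm.add_constant(np.column_stack([visual, sound, speed]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.params)   # recovers separate visual, sound, and movement weights
```

Because speed is included as its own regressor, the fitted sound weight reflects sound's effect at fixed movement, which is the core of the "controlling for movement" argument.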
Topics: Mice; Animals; Primary Visual Cortex; Visual Perception; Sound; Movement; Neurons; Auditory Cortex; Photic Stimulation
PubMed: 36944489
DOI: 10.1523/JNEUROSCI.2444-21.2023
Frontiers in Neural Circuits, 2021
Anatomical and physiological studies have described the cortex as a six-layer structure that receives, elaborates, and sends out information exclusively as excitatory output to cortical and subcortical regions. This concept has increasingly been challenged by several anatomical and functional studies that showed that direct inhibitory cortical outputs are also a common feature of the sensory and motor cortices. Similar to their excitatory counterparts, subsets of Somatostatin- and Parvalbumin-expressing neurons have been shown to innervate distal targets like the sensory and motor striatum and the contralateral cortex. However, no evidence of long-range VIP-expressing neurons, the third major class of GABAergic cortical inhibitory neurons, has been shown in such cortical regions. Here, using anatomical anterograde and retrograde viral tracing, we tested the hypothesis that VIP-expressing neurons of the mouse auditory and motor cortices can also send long-range projections to cortical and subcortical areas. We were able to demonstrate, for the first time, that VIP-expressing neurons of the auditory cortex can reach not only the contralateral auditory cortex and the ipsilateral striatum and amygdala, as shown for Somatostatin- and Parvalbumin-expressing long-range neurons, but also the medial geniculate body and both superior and inferior colliculus. We also demonstrate that VIP-expressing neurons of the motor cortex send long-range GABAergic projections to the dorsal striatum and contralateral cortex. The presence of such projections in two disparate cortical areas suggests that the long-range VIP projection is likely a general feature of the cortical network.
Topics: Animals; Auditory Cortex; Auditory Pathways; Female; GABAergic Neurons; Male; Mice; Mice, Transgenic; Motor Cortex; Organ Culture Techniques; Vasoactive Intestinal Peptide
PubMed: 34366798
DOI: 10.3389/fncir.2021.714780