Vision Research, Nov 2021
Saccadic eye movements can drastically affect motion perception: during saccades, the stationary surround is swept rapidly across the retina and contrast sensitivity is suppressed. However, after saccades, contrast sensitivity is enhanced for color and high-spatial frequency stimuli and reflexive tracking movements known as ocular following responses (OFR) are enhanced in response to large field motion. Additionally, OFR and postsaccadic enhancement of neural activity in primate motion processing areas are well correlated. It is not yet known how this postsaccadic enhancement arises. Therefore, we tested if the enhancement can be explained by changes in the balance of centre-surround antagonism in motion processing, where spatial summation is favoured at low contrasts and surround suppression is favoured at high contrasts. We found motion perception was selectively enhanced immediately after saccades for high spatial frequency stimuli, consistent with previously reported selective postsaccadic enhancement of contrast sensitivity for flashed high spatial frequency stimuli. The observed enhancement was also associated with changes in spatial summation and suppression, as well as contrast facilitation and inhibition, suggesting that motion processing is augmented to maximise visual perception immediately after saccades. The results highlight that spatial and contrast properties of underlying neural mechanisms for motion processing can be affected by an antecedent saccade for highly detailed stimuli and are in line with studies that show behavioural and neuronal enhancement of motion processing in non-human primates.
Topics: Animals; Motion Perception; Neurons; Photic Stimulation; Saccades; Vision, Ocular; Visual Perception
PubMed: 34280816
DOI: 10.1016/j.visres.2021.06.011
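The centre-surround trade-off described in this abstract (spatial summation dominating at low contrast, surround suppression at high contrast) is commonly modelled with a difference-of-Gaussians area-summation function in which surround gain grows with contrast. The sketch below is purely illustrative: all gains and size constants are invented, not taken from the paper.

```python
import numpy as np

def dog_area_response(radius, kc=1.0, rc=0.5, ks=0.6, rs=2.0):
    """Difference-of-Gaussians area-summation model: response to a
    stimulus patch of the given radius (center gain kc/size rc,
    surround gain ks/size rs; all values hypothetical)."""
    center = kc * (1.0 - np.exp(-(radius / rc) ** 2))
    surround = ks * (1.0 - np.exp(-(radius / rs) ** 2))
    return center - surround

radii = np.linspace(0.05, 5.0, 200)
# Assumption for illustration: surround gain ks scales with contrast,
# so a low-contrast stimulus drives a weak surround.
low_contrast = dog_area_response(radii, ks=0.1)   # summation dominates
high_contrast = dog_area_response(radii, ks=0.8)  # suppression dominates

best_low = radii[np.argmax(low_contrast)]
best_high = radii[np.argmax(high_contrast)]
# At high contrast the preferred (peak-response) size shrinks and the
# response falls off for larger stimuli: the signature of suppression.
```

With these toy parameters the peak of the size-tuning curve moves to smaller radii as the surround gain rises, which is the qualitative balance shift the abstract invokes.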
Current Biology, Jun 2022
Perception in multiple sensory modalities is an active process that involves exploratory behaviors. In humans and other primates, vision results from sensory sampling guided by saccadic eye movements. Saccades are known to modulate visual perception, and a corollary discharge signal associated with saccades appears to establish a sense of visual stability. Neural recordings have shown that saccades also modulate activity widely across the brain. To investigate the neural basis of saccadic effects on perception, simultaneous recordings from multiple neurons in area V1 were made as animals performed a contrast detection task. Perceptual and neural measures were compared when the animal made real saccades that brought a stimulus into V1 receptive fields and when simulated saccades were made (identical retinal stimulation but no eye movement). When real saccades were made and low spatial frequency stimuli were presented, we observed a reduction in both perceptual sensitivity and neural activity compared with simulated saccades; conversely, with higher spatial frequency stimuli, saccades increased visual sensitivity and neural activity. The performance of neural decoders, which used the activity of the population of simultaneously recorded neurons, showed saccade effects on sensitivity that mirrored the frequency-dependent perceptual changes, suggesting that the V1 population activity could support the perceptual effects. A minority of V1 neurons had significant choice probabilities, and the saccades decreased both average choice probability and pairwise noise correlations. Taken together, the findings suggest that a signal related to saccadic eye movements alters V1 spiking to increase the independence of spiking neurons and bias the system toward processing higher spatial frequencies, presumably to enhance object recognition. 
The effects of saccades on visual perception and noise correlations appear to parallel effects observed in other sensory modalities, suggesting a general principle of active sensory processing.
Topics: Animals; Neurons; Photic Stimulation; Reaction Time; Saccades; Vision, Ocular; Visual Perception
PubMed: 35584697
DOI: 10.1016/j.cub.2022.04.067
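The pairwise noise correlations this abstract refers to are Pearson correlations of trial-by-trial spike counts to a repeated, identical stimulus. A toy sketch with simulated Poisson neurons sharing a common gain fluctuation; every parameter here is hypothetical, not from the recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

def noise_correlation(counts_a, counts_b):
    """Pearson correlation of spike counts across repeated trials of
    the same stimulus ('noise' correlation)."""
    return np.corrcoef(counts_a, counts_b)[0, 1]

# Toy data: two neurons whose firing rates share a common gain
# fluctuation across trials (a simple source of correlated noise).
n_trials = 2000
shared = rng.normal(0.0, 1.0, n_trials)          # common drive
counts_a = rng.poisson(np.exp(1.5 + 0.3 * shared))
counts_b = rng.poisson(np.exp(1.2 + 0.3 * shared))

r_shared = noise_correlation(counts_a, counts_b)

# With independent trial-to-trial fluctuations the correlation is ~0,
# the direction in which the paper reports saccades push V1 pairs.
counts_c = rng.poisson(np.exp(1.2 + 0.3 * rng.normal(0.0, 1.0, n_trials)))
r_indep = noise_correlation(counts_a, counts_c)
```

Reducing such shared fluctuations is what makes the spiking of simultaneously recorded neurons more independent, improving population decoding.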
The European Journal of Neuroscience, Aug 2022
While atypical sensory perception is reported among individuals with autism spectrum disorder (ASD), the underlying neural mechanisms of autism that give rise to disruptions in sensory perception remain unclear. We developed a neural model with key physiological, functional and neuroanatomical parameters to investigate mechanisms underlying the range of representations of visual illusions related to orientation perception in typically developed subjects compared to individuals with ASD. Our results showed that two theorized autistic traits, excitation/inhibition imbalance and weakening of top-down modulation, could be potential candidates for reduced susceptibility to some visual illusions. Parametric correlation between cortical suppression, balance of excitation/inhibition, feedback from higher visual areas on one hand and susceptibility to a class of visual illusions related to orientation perception on the other hand provide the opportunity to investigate the contribution and complex interactions of distinct sensory processing mechanisms in ASD. The novel approach used in this study can be used to link behavioural, functional and neuropathological studies; estimate and predict perceptual and cognitive heterogeneity in ASD; and form a basis for the development of novel diagnostics and therapeutics.
Topics: Autism Spectrum Disorder; Autistic Disorder; Humans; Illusions; Visual Perception
PubMed: 35701859
DOI: 10.1111/ejn.15739
PLoS ONE, 2024
In visual perception and information processing, a cascade of associations is hypothesized to flow from the structure of the visual stimulus to neural activity along the retinogeniculostriate visual system to behavior and action. Do visual perception and information processing adhere to this cascade near the beginning of life? To date, this three-stage hypothetical cascade has not been comprehensively tested in infants. In two related experiments, we attempted to expose this cascade in 6-month-old infants. Specifically, we presented infants with two levels of visual stimulus intensity, we measured electrical activity at the infant cortex, and we assessed infants' preferential looking behavior. Chromatic saturation provided a convenient stimulus dimension to test the cascade because greater saturation is known to excite increased activity in the primate visual system and is generally hypothesized to stimulate visual preference. Experiment 1 revealed that infants prefer (look longer at) the more saturated of two colors otherwise matched in hue and brightness. Experiment 2 showed increased aggregate neural cortical excitation in infants (and adults) to the more saturated of the same pair of colors. Thus, Experiments 1 and 2 taken together confirm a cascade: visual stimulation of relatively greater intensity evokes relatively greater levels of bioelectrical cortical activity, which in turn is associated with relatively greater visual attention. As this cascade obtains near the beginning of life, it helps to account for early visual preferences and visual information processing.
Topics: Humans; Infant; Photic Stimulation; Visual Perception; Female; Male; Visual Cortex; Adult
PubMed: 38889176
DOI: 10.1371/journal.pone.0302852
PLoS ONE, 2021
Audio-visual integration relies on temporal synchrony between visual and auditory inputs. However, because visual and auditory stimuli differ in their travelling and neural transmission speeds, audio-visual synchrony perception must be flexible. The processing speed of visual stimuli affects the perception of audio-visual synchrony. The present study examined the effects of the visual field in which visual stimuli are presented on the processing of audio-visual temporal synchrony. The point of subjective simultaneity, the temporal binding window, and the rapid recalibration effect were measured using temporal order judgment, simultaneity judgment, and stream/bounce perception, because different mechanisms of temporal processing have been suggested for these three paradigms. The results of the temporal order judgment task indicate that, to perceive subjective simultaneity, auditory stimuli had to be presented earlier relative to visual stimuli in the central visual field than in the peripheral visual field. Meanwhile, the subjective simultaneity bandwidth was broader in the central visual field than in the peripheral visual field during the simultaneity judgment task. In the stream/bounce perception task, neither the point of subjective simultaneity nor the temporal binding window differed between the two types of visual fields. Moreover, rapid recalibration occurred in both visual fields during the simultaneity judgment tasks. However, during the temporal order judgment task and stream/bounce perception, rapid recalibration occurred only in the central visual field. These results suggest that differences in visual processing speed based on the visual field modulate the temporal processing of audio-visual stimuli. Furthermore, the three tasks, temporal order judgment, simultaneity judgment, and stream/bounce perception, each have distinct functional characteristics for audio-visual synchrony perception.
Future studies are necessary to confirm the effects of compensation regarding differences in the temporal resolution of the visual field in later cortical visual pathways on visual field differences in audio-visual temporal synchrony.
Topics: Acoustic Stimulation; Adaptation, Physiological; Adult; Auditory Perception; Female; Humans; Judgment; Male; Photic Stimulation; Time Perception; Visual Fields; Visual Perception; Young Adult
PubMed: 34914735
DOI: 10.1371/journal.pone.0261129
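In a simultaneity-judgment task, the point of subjective simultaneity (PSS) and temporal binding window (TBW) are typically read off a Gaussian-shaped function fit to the proportion of "simultaneous" responses across stimulus onset asynchronies (SOAs). A rough sketch with made-up response proportions and a brute-force least-squares fit; none of these numbers come from the study.

```python
import numpy as np

# Hypothetical simultaneity-judgment data: proportion of "simultaneous"
# responses at each audio-visual SOA (ms; negative = audio leads).
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], float)
p_simult = np.array([0.05, 0.20, 0.60, 0.85, 0.95, 0.90, 0.70, 0.30, 0.10])

def gaussian(soa, amp, pss, sigma):
    return amp * np.exp(-((soa - pss) ** 2) / (2.0 * sigma ** 2))

# Least-squares fit by grid search (avoids a SciPy dependency).
best = None
for mu in np.arange(-50.0, 51.0, 1.0):
    for sigma in np.arange(40.0, 301.0, 2.0):
        for amp in (0.9, 0.95, 1.0):
            err = np.sum((gaussian(soas, amp, mu, sigma) - p_simult) ** 2)
            if best is None or err < best[0]:
                best = (err, amp, mu, sigma)

_, amp, pss, sigma = best
tbw = 2.354 * sigma  # full width at half maximum as a window estimate
# pss: SOA perceived as most simultaneous; tbw: range of SOAs still
# bound into a single audio-visual event.
```

Comparing such fits between central and peripheral presentations is how a broader or narrower binding window would show up.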
The Journal of Neuroscience, Jun 2023
Does our perception of an object change once we discover what function it serves? We showed human participants (N = 48, 31 females and 17 males) pictures of unfamiliar objects either together with keywords matching their function, leading to semantically informed perception, or together with nonmatching keywords, resulting in uninformed perception. We measured event-related potentials to investigate at which stages in the visual processing hierarchy these two types of object perception differed from one another. We found that semantically informed compared with uninformed perception was associated with larger amplitudes in the N170 component (150-200 ms), reduced amplitudes in the N400 component (400-700 ms), and a late decrease in alpha/beta band power. When the same objects were presented once more without any information, the N400 and event-related power effects persisted, and we also observed enlarged amplitudes in the P1 component (100-150 ms) in response to objects for which semantically informed perception had taken place. Consistent with previous work, this suggests that obtaining semantic information about previously unfamiliar objects alters aspects of their lower-level visual perception (P1 component), higher-level visual perception (N170 component), and semantic processing (N400 component, event-related power). Our study is the first to show that such effects occur instantly after semantic information has been provided for the first time, without requiring extensive learning. There has been a long-standing debate about whether or not higher-level cognitive capacities, such as semantic knowledge, can influence lower-level perceptual processing in a top-down fashion. Here we could show, for the first time, that information about the function of previously unfamiliar objects immediately influences cortical processing in less than 200 ms. Of note, this influence does not require training or experience with the objects and related semantic information.
Therefore, our study is the first to show effects of cognition on perception while ruling out the possibility that prior knowledge merely acts by preactivating or altering stored visual representations. Instead, this knowledge seems to alter perception online, thus providing a compelling case against the impenetrability of perception by cognition.
Topics: Humans; Male; Female; Evoked Potentials; Semantics; Electroencephalography; Visual Perception; Learning
PubMed: 37286353
DOI: 10.1523/JNEUROSCI.2038-22.2023
Perception, Mar 2024
Aristotle believed that objects fell at a constant velocity. However, Galileo Galilei showed that when an object falls, gravity causes it to accelerate. Regardless, Aristotle's claim raises the possibility that people's visual perception of falling motion might be biased away from acceleration towards constant velocity. We tested this idea by requiring participants to judge whether a ball moving in a simulated naturalistic setting appeared to accelerate or decelerate as a function of its motion direction and the amount of acceleration/deceleration. We found that the point of subjective constant velocity (PSCV) differed between up and down but not between left and right motion directions. The PSCV difference between up and down indicated that more acceleration was needed for a downward-falling object to appear at constant velocity than for an upward "falling" object. We found no significant differences in sensitivity to acceleration for the different motion directions. Generalized linear mixed modeling determined that participants relied predominantly on acceleration when making these judgments. Our results support the idea that Aristotle's belief may in part be due to a bias that reduces the perceived magnitude of acceleration for falling objects, a bias not revealed in previous studies of the perception of visual motion.
Topics: Humans; Motion Perception; Acceleration; Visual Perception; Gravitation
PubMed: 38304970
DOI: 10.1177/03010066241228681
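The point of subjective constant velocity (PSCV) is the acceleration at which "accelerating" and "decelerating" responses are equally likely, i.e. the 50% point of a psychometric function over physical acceleration. A sketch with fabricated response proportions, fit by grid search; the data and the size of the bias are invented for illustration only.

```python
import numpy as np

# Hypothetical judgments for downward motion: fraction of "accelerating"
# responses at each physical acceleration (m/s^2; 0 = constant velocity).
accels = np.array([-4, -3, -2, -1, 0, 1, 2, 3, 4], float)
p_accel = np.array([0.02, 0.05, 0.12, 0.30, 0.42, 0.62, 0.85, 0.93, 0.98])

def logistic(x, mu, s):
    return 1.0 / (1.0 + np.exp(-(x - mu) / s))

# Least-squares fit by grid search; mu is the PSCV, s the (inverse)
# sensitivity to acceleration.
best = None
for mu in np.arange(-2.0, 2.01, 0.05):
    for s in np.arange(0.2, 3.01, 0.05):
        err = np.sum((logistic(accels, mu, s) - p_accel) ** 2)
        if best is None or err < best[0]:
            best = (err, mu, s)

_, pscv, slope = best
# A positive PSCV for downward motion means some real acceleration is
# needed before the fall looks like constant velocity: the direction of
# bias the abstract describes.
```

Comparing the fitted PSCV across motion directions (up vs. down, left vs. right) is the comparison the study reports.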
Journal of Vision, Sep 2021
Evidence of perceptual changes that accompany motor activity has been limited primarily to audition and somatosensation. Here we asked whether motor learning results in changes to visual motion perception. We designed a reaching task in which participants were trained to make movements along several directions, while visual feedback was provided by an intrinsically ambiguous moving stimulus directly tied to hand motion. We found that training improves coherent motion perception and that changes in movement are correlated with perceptual changes. No perceptual changes were observed in passive training, even when observers were provided with an explicit strategy to facilitate single motion perception. A Bayesian model suggests that movement training promotes the fine-tuning of the internal representation of stimulus geometry. These results emphasize the role of sensorimotor interaction in determining the persistent properties in space and time that define a percept.
Topics: Bayes Theorem; Hand; Humans; Motion; Motion Perception; Visual Perception
PubMed: 34529006
DOI: 10.1167/jov.21.10.13
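A Bayesian account of the kind mentioned in this abstract can be illustrated with Gaussian likelihood-prior multiplication: an ambiguous stimulus yields a broad likelihood over motion direction, and training sharpens the prior (the internal representation of stimulus geometry), narrowing the posterior toward a single coherent percept. All numbers below are arbitrary; this is a generic conjugate-Gaussian sketch, not the paper's model.

```python
# Posterior of a Gaussian likelihood times a Gaussian prior:
# precisions (inverse variances) add, and the posterior mean is the
# precision-weighted average of the two means.
def posterior_gaussian(mu_like, var_like, mu_prior, var_prior):
    var_post = 1.0 / (1.0 / var_like + 1.0 / var_prior)
    mu_post = var_post * (mu_like / var_like + mu_prior / var_prior)
    return mu_post, var_post

mu_like, var_like = 10.0, 100.0  # broad, ambiguous evidence (deg of direction)
before = posterior_gaussian(mu_like, var_like, 0.0, 400.0)  # vague prior
after = posterior_gaussian(mu_like, var_like, 0.0, 25.0)    # trained prior
# After training the posterior is narrower and pulled toward the prior:
# a more coherent, less ambiguous motion percept.
```

The qualitative point is only that sharpening the prior reduces posterior variance, the sense in which training can "fine-tune" the percept.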
Proceedings of the National Academy of Sciences, Jul 2021
Recurrent loops in the visual cortex play a critical role in visual perception, which is likely not mediated by purely feed-forward pathways. However, the development of recurrent loops is poorly understood. The role of recurrent processing has been studied using visual backward masking, a perceptual phenomenon in which a visual stimulus is rendered invisible by a following mask, possibly because of the disruption of recurrent processing. Anatomical studies have reported that recurrent pathways are immature in early infancy. This raises the possibility that younger infants process visual information mainly in a feed-forward manner, and thus, they might be able to perceive visual stimuli that adults cannot see because of backward masking. Here, we show that infants under 7 mo of age are immune to visual backward masking and that masked stimuli remain visible to younger infants while older infants cannot perceive them. These results suggest that recurrent processing is immature in infants under 7 mo and that they are able to perceive objects even without recurrent processing. Our findings indicate that the algorithm for visual perception drastically changes in the second half of the first year of life.
Topics: Facial Recognition; Female; Form Perception; Humans; Infant; Male; Perceptual Masking; Photic Stimulation; Reproducibility of Results; Visual Perception
PubMed: 34162737
DOI: 10.1073/pnas.2103040118
Journal of Experimental Child Psychology, Jul 2024
Perceiving motion in depth is important in everyday life, especially motion in relation to the body. Visual and auditory cues inform us about motion in space when presented in isolation from each other, but the most comprehensive information is obtained through the combination of both of these cues. We traced the development of infants' ability to discriminate between visual motion trajectories across peripersonal space and to match these with auditory cues specifying the same peripersonal motion. We measured 5-month-old (n = 20) and 9-month-old (n = 20) infants' visual preferences for visual motion toward or away from their body (presented simultaneously and side by side) across three conditions: (a) visual displays presented alone, (b) paired with a sound increasing in intensity, and (c) paired with a sound decreasing in intensity. Both groups preferred approaching motion in the visual-only condition. When the visual displays were paired with a sound increasing in intensity, neither group showed a visual preference. When a sound decreasing in intensity was played instead, the 5-month-olds preferred the receding (spatiotemporally congruent) visual stimulus, whereas the 9-month-olds preferred the approaching (spatiotemporally incongruent) visual stimulus. We speculate that in the approaching sound condition, the behavioral salience of the sound could have led infants to focus on the auditory information alone, in order to prepare a motor response, and to neglect the visual stimuli. In the receding sound condition, instead, the difference in response patterns in the two groups may have been driven by infants' emerging motor abilities and their developing predictive processing mechanisms supporting and influencing each other.
Topics: Humans; Infant; Female; Male; Motion Perception; Auditory Perception; Cues; Child Development; Visual Perception; Depth Perception; Acoustic Stimulation
PubMed: 38615600
DOI: 10.1016/j.jecp.2024.105921