Human Movement Science, Oct 2019
The perception of distance in open fields has been widely studied with static observers. However, we and the world around us are in continuous relative motion, and our perceptual experience is shaped by the complex interactions between our senses and the perception of our self-motion. This poses interesting questions about how our nervous system integrates this multisensory information to resolve specific tasks of daily life, for example, distance estimation. This study provides new evidence about how visual and motor self-motion information affects our perception of distance, together with a hypothesis about how these two sources of information can be integrated to calibrate the estimation of distance. This model accounts for the biases found when visual and proprioceptive information is inconsistent.
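A standard formal account of this kind of multisensory calibration is reliability-weighted (maximum-likelihood) cue combination. The sketch below is illustrative only; the function name and all numbers are hypothetical, and the paper's actual calibration model may differ:

```python
# Illustrative sketch of reliability-weighted (maximum-likelihood) cue
# combination, one common way to model the integration of visual and
# proprioceptive distance estimates. Values are hypothetical.

def combine_cues(mu_vis, var_vis, mu_prop, var_prop):
    """Combine two noisy estimates, weighting each by its reliability
    (inverse variance). Returns the combined mean and variance."""
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_prop)
    mu = w_vis * mu_vis + (1 - w_vis) * mu_prop
    var = 1 / (1 / var_vis + 1 / var_prop)
    return mu, var

# Conflicting cues: vision says 10 m, proprioception says 12 m, and vision
# is the more reliable cue, so the combined estimate is pulled toward it.
mu, var = combine_cues(10.0, 1.0, 12.0, 4.0)
print(round(mu, 2), round(var, 2))  # prints: 10.4 0.8
```

Cue-conflict biases of the kind the abstract describes fall out naturally from this scheme: the combined estimate sits between the two cues, closer to the more reliable one.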
Topics: Adult; Distance Perception; Female; Humans; Male; Motion Perception; Movement; Photic Stimulation; Proprioception; Visual Perception; Young Adult
PubMed: 31301557
DOI: 10.1016/j.humov.2019.102496
Attention, Perception & Psychophysics, Apr 2021
Multiple-object tracking studies consistently reveal attentive tracking limits of approximately three to five items. How do factors such as visual grouping and ensemble perception affect these capacity limits? Which heuristics lead to the perception of multiple objects as a group? This work investigates the role of grouping in multiple-object tracking ability and, more specifically, identifies the heuristics that lead to the formation and perception of ensembles in dynamic contexts. First, we show that the group tracking limit is approximately four groups of objects and is independent of the number of items that compose the groups. Further, we show that group tracking performance declines as inter-object spacing increases. We also demonstrate the role of group rigidity in tracking performance, in that disruptions to common fate negatively impact ensemble tracking ability. The findings from this work contribute to our overall understanding of the perception of dynamic groups of objects: they characterize the properties that determine the formation and perception of dynamic object ensembles, and they inform development and design decisions that must account for cognitive limits on tracking groups of objects.
Topics: Attention; Humans; Motion Perception; Perception; Space Perception; Visual Perception
PubMed: 33409901
DOI: 10.3758/s13414-020-02219-4
The Journal of Neuroscience, May 2023
From moment to moment, the visual properties of objects in the world fluctuate because of external factors like ambient lighting, occlusion and eye movements, and internal (proximal) noise. Despite this variability in the incoming information, our perception is stable. Serial dependence, the behavioral attraction of current perceptual responses toward previously seen stimuli, may reveal a mechanism underlying stability: a spatiotemporally tuned operator that smooths over spurious fluctuations. The current study examined the neural underpinnings of serial dependence by recording the electroencephalographic (EEG) brain response of female and male human observers to prototypical objects (faces, cars, and houses) and morphs that mixed properties of two prototypes. Behavior was biased toward previously seen objects. Representational similarity analysis (RSA) revealed that responses evoked by visual objects contained information about the previous stimulus. The trace of previous representations in the response to the current object occurred immediately on object appearance, suggesting that serial dependence arises from a brain state or set that precedes processing of new input. However, the brain response to current visual objects was not representationally similar to the trace they leave on subsequent object representations. These results reveal that while past stimulus history influences current representations, this influence does not imply a shared neural code between the previous trial (memory) and the current trial (perception). The perception of visual objects is pulled toward instances of that object seen in the recent past. The neural underpinnings of this serial dependence remain to be fully investigated. The present study examined electroencephalographic (EEG) responses to faces, cars, and houses, and ambiguous between-category morphs. 
With representational similarity analysis (RSA), we showed (1) object-specific neural patterns that differentiate the three categories; (2) that the response to the current object contains information about the previous object, mirroring behavioral serial dependence; (3) that the object-specific neural pattern about the past was different from that in the current response, revealing that while past stimulus history influences current representations, this does not imply a shared neural code between the previous trial (memory) and the current trial (perception).
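For readers unfamiliar with RSA, the analysis logic can be sketched minimally with simulated patterns; the channel count, condition labels, and model RDM below are illustrative assumptions, not the study's data:

```python
import numpy as np

# Minimal sketch of representational similarity analysis (RSA) with
# simulated data; 64 channels and the model RDM values are assumptions.
rng = np.random.default_rng(0)
conditions = ["face", "car", "house"]
# One simulated multichannel response pattern per condition
# (stand-ins for EEG topographies).
patterns = {c: rng.normal(size=64) for c in conditions}

def rdm(patterns, conditions):
    """Representational dissimilarity matrix: 1 - Pearson r between
    the response patterns of every pair of conditions."""
    n = len(conditions)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            r = np.corrcoef(patterns[conditions[i]],
                            patterns[conditions[j]])[0, 1]
            d[i, j] = 1.0 - r
    return d

neural_rdm = rdm(patterns, conditions)

# A hypothetical graded model RDM (here predicting that faces and houses
# are the most dissimilar pair). RSA evaluates a model by correlating its
# off-diagonal cells with those of the neural RDM.
model_rdm = np.array([[0.0, 1.0, 2.0],
                      [1.0, 0.0, 1.0],
                      [2.0, 1.0, 0.0]])
off = ~np.eye(3, dtype=bool)
fit = np.corrcoef(neural_rdm[off], model_rdm[off])[0, 1]
```

The same machinery extends to serial dependence: correlating the current trial's patterns with an RDM built from previous-trial labels tests whether the current response carries information about the past stimulus.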
Topics: Male; Humans; Female; Visual Perception; Brain; Eye Movements; Brain Mapping; Electroencephalography; Pattern Recognition, Visual; Photic Stimulation
PubMed: 36944487
DOI: 10.1523/JNEUROSCI.2068-22.2023
Vision Research, Aug 2019
Topics: Humans; Psychophysics; Reading; Visual Perception
PubMed: 31194983
DOI: 10.1016/j.visres.2019.06.002
Neuron, Oct 2022
Review
Substantial experimental, theoretical, and computational insights into sensory processing have been derived from the phenomenon of perceptual multistability: when two or more percepts alternate or switch in response to a single sensory input. Here, we review a range of findings suggesting that alternations can be seen as internal choices by the brain responding to values. We discuss how elements of external, experimenter-controlled values and internal, uncertainty- and aesthetics-dependent values influence multistability. We then consider the implications for the involvement in switching of regions, such as the anterior cingulate cortex, that are more conventionally tied to value-dependent operations such as cognitive control and foraging.
Topics: Brain; Uncertainty; Vision, Binocular; Visual Perception
PubMed: 36041434
DOI: 10.1016/j.neuron.2022.07.024
NeuroImage, Oct 2022
Research on face perception has revealed highly specialized visual mechanisms such as configural processing, and provided markers of interindividual differences (including disease risks and alterations) in visuo-perceptual abilities that traffic in social cognition. Is face perception unique in the degree or kind of its mechanisms, and in its relevance for social cognition? Combining functional MRI and behavioral methods, we address the processing of an uncharted class of socially relevant stimuli: minimal social scenes involving configurations of two bodies spatially close and face-to-face as if interacting (hereafter, facing dyads). We report category-specific activity for facing (vs. non-facing) dyads in visual cortex. That activity shows face-like signatures of configural processing (i.e., a stronger response to facing than to non-facing dyads, and greater susceptibility to stimulus inversion for facing dyads), and is predicted by performance-based measures of configural processing in visual perception of body dyads. Moreover, we observe that individual performance in body-dyad perception is reliable, stable over time, and correlated with individual social sensitivity, coarsely captured by the Autism-Spectrum Quotient. Further analyses clarify the relationship between single-body and body-dyad perception. We propose that facing dyads are processed through highly specialized mechanisms (and brain areas), analogously to other biologically and socially relevant stimuli such as faces. Like face perception, facing-dyad perception can reveal basic (visual) processes that lay the foundations for understanding others, their relationships, and their interactions.
Topics: Brain; Facial Recognition; Humans; Pattern Recognition, Visual; Social Perception; Visual Cortex; Visual Perception
PubMed: 35878724
DOI: 10.1016/j.neuroimage.2022.119506
Cortex, Nov 2020
Human vision serves the social function of detecting and discriminating conspecifics and other animals with high efficiency. The social world is made of social entities as much as of the relations between those entities. Recent work demonstrates that vision encodes visuo-spatial relations between bodies with the same efficiency and high specialization that characterize face/body perception. Specifically, perception of face-to-face (vs. non-facing) bodies evokes effects compatible with the most robust markers of face-specificity, such as the behavioral inversion effect and increased activity in selective visual areas. Another set of results suggests that face-to-face bodies are processed as a grouped unit, analogously to facial features in a face. The facing dyad in the visual cortex may be the earliest rudimentary representation of social interaction.
Topics: Animals; Facial Recognition; Humans; Pattern Recognition, Visual; Social Perception; Visual Cortex; Visual Perception
PubMed: 32698947
DOI: 10.1016/j.cortex.2020.06.005
Philosophical Transactions of the Royal..., Sep 2023
Review
To navigate and guide adaptive behaviour in a dynamic environment, animals must accurately estimate their own motion relative to the external world. This is a fundamentally multisensory process involving the integration of visual, vestibular and kinesthetic inputs. Ideal observer models, paired with careful neurophysiological investigation, have helped to reveal how visual and vestibular signals are combined to support perception of linear self-motion direction, or heading. Recent work has extended these findings by emphasizing the dimension of time, both with regard to stimulus dynamics and to the trade-off between speed and accuracy. Both time and certainty (i.e. the degree of confidence in a multisensory decision) are essential to the ecological goals of the system: terminating a decision process is necessary for timely action, and predicting one's accuracy is critical for making multiple decisions in a sequence, as in navigation. Here, we summarize a leading model for multisensory decision-making, then show how the model can be extended to study confidence in heading discrimination. Lastly, we preview ongoing efforts to bridge self-motion perception and navigation, including closed-loop virtual reality and active self-motion. The design of unconstrained, ethologically inspired tasks, accompanied by large-scale neural recordings, holds promise for a deeper understanding of spatial perception and decision-making in the behaving animal. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
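The bound-crossing account of decision termination and the speed-accuracy trade-off described above can be sketched with a minimal drift-diffusion simulation; the drift, noise, and bound values below are arbitrary illustrations, not parameters fitted to any dataset:

```python
import numpy as np

def simulate_ddm(drift, bound, noise=1.0, dt=0.002, max_t=10.0, rng=None):
    """Accumulate noisy evidence until it crosses +bound or -bound;
    return (choice, reaction time). With drift > 0, crossing +bound
    is the correct choice."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return (1 if x > 0 else 0), t

# Raising the decision bound trades speed for accuracy: decisions become
# slower but more often correct.
rng = np.random.default_rng(1)
results = {}
for bound in (0.5, 1.5):
    trials = [simulate_ddm(1.0, bound, rng=rng) for _ in range(300)]
    accuracy = sum(c for c, _ in trials) / len(trials)
    mean_rt = sum(t for _, t in trials) / len(trials)
    results[bound] = (accuracy, mean_rt)
```

Confidence can be read out from the same machinery, for example from the balance of evidence or the elapsed time at the moment the bound is hit.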
Topics: Animals; Motion Perception; Space Perception; Vestibule, Labyrinth; Movement; Adaptation, Psychological; Visual Perception; Photic Stimulation
PubMed: 37545301
DOI: 10.1098/rstb.2022.0333
Current Opinion in Neurology, Feb 2015
Review
PURPOSE OF REVIEW
It is increasingly recognized that affective values associated with visual stimuli can influence visual perception, attention, and eye movements. Recent research has begun to uncover the brain mechanisms mediating these phenomena. The present review summarizes the main paradigms and findings demonstrating emotional and motivational influences on visual processing.
RECENT FINDINGS
Several pathways have been identified for enhancing neural responses of cortical visual areas to stimuli with intrinsic emotional value (e.g., facial expressions, social scenes, and others), including projections from the amygdala and ascending modulatory neurotransmitter systems from the brainstem. These pathways can guide attention and gaze to emotionally salient information with either negative (threatening) or positive (rewarding) associations. In addition, abundant research in recent years suggests that probabilistic reward learning can lead to powerful biases in visual attention and saccade control through subcortical pathways connecting visual areas with basal ganglia and superior colliculus. Time-resolved neuroimaging using electroencephalography or magnetoencephalography has begun to tackle the time course of these effects, and can now be complemented by neuroimaging and neurophysiology recordings in monkey.
SUMMARY
These findings have implications for understanding and assessing affective biases in perception and attention in patients with psychiatric disorders, such as phobias, depression, and addiction, but also open new avenues for rehabilitation in neurological patients with attention disorders.
Topics: Affect; Attention; Humans; Motivation; Reward; Vision, Binocular; Visual Perception
PubMed: 25490197
DOI: 10.1097/WCO.0000000000000159
Visual Neuroscience, Jan 2015
Review
A basic principle in visual neuroscience is the retinotopic organization of neural receptive fields. Here, we review behavioral, neurophysiological, and neuroimaging evidence for nonretinotopic processing of visual stimuli. A number of behavioral studies have shown that perception can depend on object-based or external-space coordinate systems, in addition to retinal coordinates. Both single-cell neurophysiology and neuroimaging have provided evidence for the modulation of neural firing by gaze position and processing of visual information based on craniotopic or spatiotopic coordinates. Transient remapping of the spatial and temporal properties of neurons contingent on saccadic eye movements has been demonstrated in visual cortex, as well as in frontal and parietal areas involved in saliency/priority maps, and is a good candidate to mediate some of the spatial invariance demonstrated by perception. Recent studies suggest that spatiotopic selectivity depends on a low-spatial-resolution system of maps that operates over a longer time frame than retinotopic processing and is strongly modulated by high-level cognitive factors such as attention. The interaction of an initial and rapid retinotopic processing stage, tied to new fixations, and a longer-lasting but less precise nonretinotopic level of visual representation could underlie the perception of both a detailed and a stable visual world across saccadic eye movements.
Topics: Animals; Brain; Brain Mapping; Humans; Visual Fields; Visual Pathways; Visual Perception
PubMed: 26423219
DOI: 10.1017/S095252381500019X