Attention, Perception & Psychophysics, May 2023
Optic flow, the pattern of light generated in the visual field by motion of objects and the observer's body, serves as information that underwrites perception of events, actions, and affordances. This visual pattern informs observers about their own actions in relation to their surroundings, as well as the actions of others. This study explored the limits of detecting others' actions and the role of optic flow in doing so. First-person videos were created from camera recordings of an actor's perspective as they performed various movements (jumping jacks, jumping, squatting, sitting, etc.). In three experiments, participants attempted to identify the action from this first-person footage using open-ended responses (Experiment 1), forced-choice responses (Experiment 2), and a match-to-sample paradigm (Experiment 3). Some actions proved more difficult to detect than others. When the task was challenging (Experiment 1), athletes were more accurate, but this advantage disappeared in Experiments 2 and 3. All actions were identified above chance across viewpoints, suggesting that invariant information was detected and used to perform the task.
Topics: Humans; Visual Perception; Optic Flow; Movement; Posture; Visual Fields; Motion Perception
PubMed: 36918506
DOI: 10.3758/s13414-023-02674-9
Proceedings. Biological Sciences, May 2002
We sought to determine the extent to which red-green, colour-opponent mechanisms in the human visual system play a role in the perception of drifting luminance-modulated targets. Contrast sensitivity for the directional discrimination of drifting luminance-modulated (yellow-black) test sinusoids was measured following adaptation to isoluminant red-green sinusoids drifting in either the same or opposite direction. When the test and adapt stimuli drifted in the same direction, large sensitivity losses were evident at all test temporal frequencies employed (1-16 Hz). The magnitude of the loss was independent of temporal frequency. When adapt and test stimuli drifted in opposing directions, large sensitivity losses were evident at lower temporal frequencies (1-4 Hz) and declined with increasing temporal frequency. Control studies showed that this temporal-frequency-dependent effect could not reflect the activity of achromatic units. Our results provide evidence that chromatic mechanisms contribute to the perception of luminance-modulated motion targets drifting at speeds of up to at least 32° s⁻¹. We argue that such mechanisms most probably lie within a parvocellular-dominated cortical visual pathway, sensitive to both chromatic and luminance modulation, but only weakly selective for the direction of stimulus motion.
Topics: Adaptation, Ocular; Color Perception; Contrast Sensitivity; Female; Humans; Light; Male; Motion Perception; Photic Stimulation; Visual Perception
PubMed: 12028757
DOI: 10.1098/rspb.2002.1985
BMC Biology, May 2024
BACKGROUND
Threat and individual differences in threat-processing bias perception of stimuli in the environment. Yet, their effect on perception of one's own (body-based) self-motion in space is unknown. Here, we tested the effects of threat on self-motion perception using a multisensory motion simulator with concurrent threatening or neutral auditory stimuli.
RESULTS
Strikingly, threat had opposite effects on vestibular and visual self-motion perception, leading to overestimation of vestibular, but underestimation of visual self-motions. Trait anxiety tended to be associated with an enhanced effect of threat on estimates of self-motion for both modalities.
CONCLUSIONS
Enhanced vestibular perception under threat might stem from shared neural substrates with emotional processing, whereas diminished visual self-motion perception may indicate that a threatening stimulus diverts attention away from optic flow integration. Thus, threat induces modality-specific biases in everyday experiences of self-motion.
Topics: Humans; Motion Perception; Male; Female; Adult; Young Adult; Visual Perception; Fear; Anxiety; Acoustic Stimulation
PubMed: 38783286
DOI: 10.1186/s12915-024-01911-3
The Journal of Neuroscience, Nov 2014
We use visual information to determine our dynamic relationship with other objects in a three-dimensional (3D) world. Despite decades of work on visual motion processing, it remains unclear how 3D directions-trajectories that include motion toward or away from the observer-are represented and processed in visual cortex. Area MT is heavily implicated in processing visual motion and depth, yet previous work has found little evidence for 3D direction sensitivity per se. Here we use a rich ensemble of binocular motion stimuli to reveal that most neurons in area MT of the anesthetized macaque encode 3D motion information. This tuning for 3D motion arises from multiple mechanisms, including different motion preferences in the two eyes and a nonlinear interaction of these signals when both eyes are stimulated. Using a novel method for functional binocular alignment, we were able to rule out contributions of static disparity tuning to the 3D motion tuning we observed. We propose that a primary function of MT is to encode 3D motion, critical for judging the movement of objects in dynamic real-world environments.
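The "different motion preferences in the two eyes" mentioned above echoes the classic interocular-velocity-difference idea for motion in depth. The sketch below is a purely illustrative decomposition, not the paper's neural model, and the velocity values are assumptions:

```python
# Hedged illustration: the average of left/right-eye image velocities
# tracks lateral (frontoparallel) motion, while their difference
# signals motion toward or away from the observer.
def decompose_binocular_motion(v_left, v_right):
    lateral = (v_left + v_right) / 2.0   # frontoparallel component
    in_depth = v_left - v_right          # motion-in-depth component
    return lateral, in_depth

# Opposite drift in the two eyes -> pure motion in depth:
print(decompose_binocular_motion(2.0, -2.0))  # (0.0, 4.0)
```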
Topics: Animals; Macaca fascicularis; Male; Motion Perception; Photic Stimulation; Vision, Binocular; Vision, Monocular; Visual Cortex; Visual Pathways; Visual Perception
PubMed: 25411482
DOI: 10.1523/JNEUROSCI.1081-14.2014
Journal of Vision, Dec 2021
Visual perception is the result of a highly complex process depending on both stimulus and observer characteristics and, importantly, their interactions. Generating robust theories and making precise predictions in light of this complexity can be challenging, and the interaction of stimulus- and observer-related effects is often neglected or understated. In the current study, we examined inter- and intra-individual differences and the effects of a wide range of values on three stimulus characteristics (i.e., spatial distance, temporal distance, and spatial location). Our results indicate that not all individuals show the same group-average stimulus-driven effects on the perception of a motion quartet and that these effects are not always equal across the entire stimulus range. Moreover, we observed that there are clear individual differences in spontaneous perceptual dynamics and that these can be overridden by some but not all stimulus manipulations. We conclude that considering different stimulus manipulations, different observers, and their interactions can provide a more nuanced and informative view on the processes governing visual perception. This study examines the effect of spatial distance, temporal distance, spatial location, and individual differences on the perception of the ambiguous motion quartet.
Topics: Humans; Motion; Motion Perception; Photic Stimulation; Visual Perception
PubMed: 34964859
DOI: 10.1167/jov.21.13.12
Journal of Vision, Apr 2023
Humans can use visual motion to estimate the distance they have traveled. In static environments, optic flow generated by self-motion provides a pattern of expanding motion that is used for the estimation of travel distance. When the environment is populated by other people, their biological motion destroys the one-to-one correspondence between optic flow and travel distance. We investigated how observers estimate travel distance in a crowded environment. In three conditions, we simulated self-motion through a crowd of standing, approaching, or leading point-light walkers. For a standing crowd, optic flow is a veridical signal for distance perception. For an approaching crowd, the visual motion is the sum of the self-motion-induced optic flow and the optic flow produced by the approaching walkers. If only optic flow were used, travel distance estimates would be too high, because the crowd approaches the observer. If, on the other hand, cues from biological motion could be used to estimate the speed of the crowd, then the excess optic flow from the approaching crowd might be compensated for. In the leading crowd condition, in which the walkers keep their distance from the observer as they walk along with the observer, no optic flow is produced. In this condition, travel distance estimation would have to rely solely on biological motion information. We found that distance estimation was quite similar across the three conditions. This suggests that biological motion information can be used (a) to compensate for excess optic flow in the approaching crowd condition and (b) to generate distance information in the leading crowd condition.
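The compensation argument above reduces to simple arithmetic on speeds. A minimal sketch, with the subtraction rule and all numbers as illustrative assumptions rather than the authors' model:

```python
# Hypothetical sketch: travel distance from integrating image speed,
# plus a biological-motion correction for an approaching crowd.
def estimate_distance(observer_speed, crowd_speed, duration, compensate):
    # Retinal expansion reflects relative speed: self-motion plus the
    # crowd's approach toward the observer.
    flow_speed = observer_speed + crowd_speed
    raw_estimate = flow_speed * duration  # optic flow alone: too high
    if compensate:
        # Biological-motion cues reveal the crowd's own speed, which
        # can be subtracted to recover veridical travel distance.
        return raw_estimate - crowd_speed * duration
    return raw_estimate

# Observer walks 1.5 m/s for 10 s through a crowd approaching at 1.0 m/s.
print(estimate_distance(1.5, 1.0, 10.0, compensate=False))  # 25.0 (overestimate)
print(estimate_distance(1.5, 1.0, 10.0, compensate=True))   # 15.0 (veridical)
```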
Topics: Humans; Motion Perception; Visual Perception; Distance Perception; Walking; Optic Flow
PubMed: 37099279
DOI: 10.1167/jov.23.4.7
PLoS One, 2018
In 1957, Craig Mooney published a set of human face stimuli to study perceptual closure: the formation of a coherent percept on the basis of minimal visual information. Images of this type, now known as "Mooney faces", are widely used in cognitive psychology and neuroscience because they offer a means of inducing variable perception with constant visuo-spatial characteristics (they are often not perceived as faces if viewed upside down). Mooney's original set of 40 stimuli has been employed in several studies. However, it is often necessary to use a much larger stimulus set. We created a new set of over 500 Mooney faces and tested them on a cohort of human observers. We present the results of our tests here, and make the stimuli freely available via the internet. Our test results can be used to select subsets of the stimuli that are most suited for a given experimental purpose.
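The core of a Mooney-style image is a two-tone (black/white) reduction of a photograph. The binarization below is a minimal sketch of that idea; the median-split threshold is an illustrative choice, not Mooney's procedure or the authors' stimulus-generation pipeline:

```python
import numpy as np

def mooneyize(gray, threshold=None):
    """Binarize a grayscale array (0-255) into a two-tone image."""
    if threshold is None:
        threshold = np.median(gray)  # median split keeps both tones present
    return np.where(gray > threshold, 255, 0).astype(np.uint8)

img = np.array([[10, 200], [120, 130]], dtype=np.uint8)
print(mooneyize(img))  # [[  0 255] [  0 255]]
```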
Topics: Adolescent; Adult; Cohort Studies; Facial Recognition; Female; Humans; Male; Middle Aged; Pattern Recognition, Visual; Photic Stimulation; Reaction Time; Visual Perception; Young Adult
PubMed: 29979727
DOI: 10.1371/journal.pone.0200106
Sensors (Basel, Switzerland), Nov 2022
A high dynamic range (HDR) stereoscopic omnidirectional vision system can provide users with more realistic binocular and immersive perception, but the HDR stereoscopic omnidirectional image (HSOI) suffers distortions during its encoding and visualization, making its quality evaluation more challenging. To address this problem, this paper proposes a client-oriented blind HSOI quality metric based on visual perception. The proposed metric mainly consists of a monocular perception module (MPM) and a binocular perception module (BPM), which combine monocular/binocular, omnidirectional and HDR/tone-mapping perception. The MPM extracts features from three aspects: global color distortion, symmetric/asymmetric distortion and scene distortion. In the BPM, the binocular fusion map and binocular difference map are generated by joint image filtering. Then, brightness segmentation is performed on the binocular fusion image, and distinctive features are extracted on the segmented high/low/middle brightness regions. For the binocular difference map, natural scene statistical features are extracted by multi-coefficient derivative maps. Finally, feature screening is used to remove the redundancy between the extracted features. Experimental results on the HSOID database show that the proposed metric generally outperforms representative existing quality metrics and is more consistent with subjective perception.
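The final step described above, feature screening to remove redundancy, can be illustrated generically. The correlation-threshold rule below is an assumed stand-in for whatever screening the authors used, not their method, and the data are synthetic:

```python
# Illustrative sketch of the generic blind-metric recipe (feature
# extraction -> redundancy screening -> regression onto quality scores).
import numpy as np

def screen_redundant_features(X, threshold=0.95):
    """Drop features that correlate above `threshold` with an earlier kept one."""
    corr = np.abs(np.corrcoef(X, rowvar=False))  # feature-by-feature correlation
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return X[:, keep], keep

rng = rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
X = np.hstack([base,                                    # feature 0
               base * 2.0 + 1e-6 * rng.normal(size=(100, 1)),  # near-duplicate
               rng.normal(size=(100, 1))])              # independent feature
Xs, kept = screen_redundant_features(X)
print(kept)  # the near-duplicate second column is dropped
```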
Topics: Humans; Vision, Binocular; Depth Perception; Visual Perception
PubMed: 36366211
DOI: 10.3390/s22218513
Current Biology, Mar 2022
Recent research has uncovered a surprising new role of colour in the perception of three-dimensional shape. The brain is exquisitely sensitive to visual patterns emerging from the way different wavelengths interact with surfaces.
Topics: Brain; Color Perception; Visual Perception
PubMed: 35349812
DOI: 10.1016/j.cub.2022.01.077
Current Biology, Dec 2022
Eye movements cause rapid motion of the retinal image, potentially confusable with external motion. A recent study shows that neurons in mouse primary visual cortex distinguish self-generated from external motion by combining sensory input with saccade-related signals from the thalamic pulvinar nucleus.
Topics: Animals; Mice; Eye Movements; Saccades; Neurons; Perception; Motion Perception; Photic Stimulation; Visual Perception
PubMed: 36538882
DOI: 10.1016/j.cub.2022.11.003