Attention, Perception & Psychophysics, Apr 2022
Stimulus statistics can induce expectations that in turn can influence multisensory perception. In three experiments, we manipulated perceptual history by biasing stimulus statistics and examined the effect of implicit expectations on the perceptual resolution of a bistable visual stimulus that is modulated by sound. First, we found a general effect of expectation such that responses were biased in line with the manipulated statistics; we interpret this as a bias towards an implicitly expected outcome. Second, expectation did not influence the perception of all types of stimuli. In both Experiment 1 and Experiment 2, integrated audio-visual stimuli were affected by expectation but visual-only and unintegrated audio-visual stimuli were not. In Experiment 3, we examined the sensory versus interpretational effects of expectation and found that, contrary to our predictions, an expectation of audio-visually integrated stimuli was associated with impaired multisensory integration compared to visual-only or unintegrated audio-visual stimuli. Our findings suggest that perceptual experience implicitly creates expectations that influence multisensory perception, and that these expectations appear to be about perceptual outcomes rather than sensory stimuli. Finally, in the case of resolving perceptual ambiguity, the expectation effect is an effect on cognitive rather than sensory processes.
Topics: Acoustic Stimulation; Auditory Perception; Humans; Motivation; Photic Stimulation; Sound; Visual Perception
PubMed: 35233744
DOI: 10.3758/s13414-022-02460-z
Journal of Experimental Psychology...., Jun 2023
We investigated temporal properties of visual perception as a function of eccentricity, that is, spatial position relative to the fovea. Our experiments were motivated by well-characterized non-uniformities in neuron distribution in the human eye and early visual pathways. These non-uniformities have been extensively studied in the context of spatial perception, while largely neglected in relation to temporal perception. In Experiment 1, participants fixated a rapid serial visual presentation letter stream and were instructed to report the letter which appeared simultaneously with a brief cue presented at different locations along the horizontal meridian. Participants exhibited a tendency to report earlier letters with more peripheral cues as compared to central ones, indicating that they misperceived differently located stimuli as simultaneous even though they were never presented together. Experiment 2 conceptually replicated the findings of Experiment 1. Experiment 3 further demonstrated that the effect is specifically due to eccentricity, and not the relative distance between the stimuli. We argue that such location-based misperceptions of simultaneity arise because transient stimuli at more eccentric locations advance to perception faster than stimuli at or near the fovea. Collectively, these experiments show, for the first time, how processing speed differences across the visual field translate into differences in perceived simultaneity. They also demonstrate, for the first time, location-based misperceptions of simultaneity for stimuli never presented together. Finally, Experiment 4 showed that greater eccentricity also increased the perceived duration of a stimulus compared to the fovea. These results reveal the breadth of perceptual effects driven by temporal processing differences across the visual field.
Topics: Humans; Visual Perception; Vision, Ocular; Visual Fields; Space Perception; Time Perception
PubMed: 36701526
DOI: 10.1037/xge0001352
Annual Review of Neuroscience, Jul 2023
Review
Despite increasing evidence of its involvement in several key functions of the cerebral cortex, the vestibular sense rarely enters our consciousness. Indeed, the extent to which these internal signals are incorporated within cortical sensory representation and how they might be relied upon for sensory-driven decision-making, during, for example, spatial navigation, is yet to be understood. Recent novel experimental approaches in rodents have probed both the physiological and behavioral significance of vestibular signals and indicate that their widespread integration with vision improves both the cortical representation and perceptual accuracy of self-motion and orientation. Here, we summarize these recent findings with a focus on cortical circuits involved in visual perception and spatial navigation and highlight the major remaining knowledge gaps. We suggest that vestibulo-visual integration reflects a process of constant updating regarding the status of self-motion, and access to such information by the cortex is used for sensory perception and predictions that may be implemented for rapid, navigation-related decision-making.
Topics: Motion Perception; Cues; Visual Perception; Vestibule, Labyrinth; Cerebral Cortex
PubMed: 37428601
DOI: 10.1146/annurev-neuro-120722-100503
PloS One, 2020
Social cognition is dependent on the ability to extract information from human stimuli. Of those, patterns of biological motion (BM), and in particular the walking patterns of other humans, are prime examples. Although most often tested in isolation, BM outside the laboratory is often associated with multisensory cues (i.e., we often hear and see someone walking), and there is evidence that vision-based judgments of BM stimuli are systematically influenced by motor signals. Furthermore, cross-modal visuo-tactile mechanisms have been shown to influence perception of bodily stimuli. Based on these observations, we here investigated whether somatosensory inputs would affect visual BM perception. In two experiments, we asked healthy participants to perform a speed discrimination task on two point light walkers (PLW) presented one after the other. In the first experiment, we quantified somatosensory-visual interactions by presenting PLW together with tactile stimuli either on the participants' forearms or feet soles. In the second experiment, we assessed the specificity of these interactions by presenting tactile stimuli either synchronously or asynchronously with upright or inverted PLW. Our results confirm that somatosensory input in the form of tactile foot stimulation influences visual BM perception. When presented with a seen walker's footsteps, additional tactile cues enhanced sensitivity on a speed discrimination task, but only if the tactile stimuli were presented on the relevant body part (under the feet) and when the tactile stimuli were presented synchronously with the seen footsteps of the PLW, whether upright or inverted. Based on these findings, we discuss potential mechanisms of somatosensory-visual interactions in BM perception.
Topics: Adult; Female; Humans; Judgment; Male; Motion Perception; Photic Stimulation; Physical Stimulation; Touch Perception; Visual Perception; Young Adult
PubMed: 32525897
DOI: 10.1371/journal.pone.0234026
Vision Research, Aug 2023
The visual system involves various orientation and visual field anisotropies, one of which is a preference for radial orientations and motion directions. By radial, we mean those directions coursing symmetrically outward from the fovea into the periphery. This bias stems from anatomical and physiological substrates in the early visual system. We recently reported that this low-level visual anisotropy can alter perceived object orientation. Here, we report that radial bias can also alter another higher-level system, the perceived direction of apparent motion. We presented a bistable apparent motion quartet in the center of the screen while participants fixated on various locations around the quartet. Participants (N = 22) were strongly biased to see the motion direction that was radial with respect to their fixation, controlling for any biases with center fixation. This was observed using a vertical-horizontal quartet as well as an oblique quartet (45° rotated quartet). The latter allowed us to rule out the contribution of the hemisphere effect where motion across the midline is perceived less often. These results extend our earlier findings on perceived object orientation, showing that low-level structural aspects of the visual system alter yet another higher-level visual process, that of apparent motion perception.
Topics: Humans; Motion Perception; Bias; Visual Fields; Anisotropy; Motion; Visual Perception
PubMed: 37149959
DOI: 10.1016/j.visres.2023.108246
Nature Human Behaviour, Sep 2021
Review
Human visual perception carves a scene at its physical joints, decomposing the world into objects, which are selectively attended, tracked and predicted as we engage our surroundings. Object representations emancipate perception from the sensory input, enabling us to keep in mind that which is out of sight and to use perceptual content as a basis for action and symbolic cognition. Human behavioural studies have documented how object representations emerge through grouping, amodal completion, proto-objects and object files. By contrast, deep neural network models of visual object recognition remain largely tethered to sensory input, despite achieving human-level performance at labelling objects. Here, we review related work in both fields and examine how these fields can help each other. The cognitive literature provides a starting point for the development of new experimental tasks that reveal mechanisms of human object perception and serve as benchmarks driving the development of deep neural network models that will put the object into object recognition.
Topics: Humans; Neural Networks, Computer; Pattern Recognition, Visual; Recognition, Psychology; Visual Pathways; Visual Perception
PubMed: 34545237
DOI: 10.1038/s41562-021-01194-6
Vision Research, Mar 2023
Optic flow is an important visual cue for human perception and locomotion and naturally triggers eye movements. Here we investigate whether the perception of optic flow direction is limited or enhanced by eye movements. In Exp. 1, 23 human observers localized the focus of expansion (FOE) of an optic flow pattern; in Exp. 2, 18 observers had to detect brief visual changes at the FOE. Both tasks were completed during free viewing and fixation conditions while eye movements were recorded. Task difficulty was varied by manipulating the coherence of radial motion from the FOE (4%-90%). During free viewing, observers tracked the optic flow pattern with a combination of saccades and smooth eye movements. During fixation, observers nevertheless made small-scale eye movements. Despite differences in spatial scale, eye movements during free viewing and fixation were similarly directed toward the FOE (saccades) and away from the FOE (smooth tracking). Whereas FOE localization sensitivity was not affected by eye movement instructions (Exp. 1), observers' sensitivity to detect brief changes at the FOE was 27% higher (p < .001) during free viewing compared to fixation (Exp. 2). This performance benefit was linked to reduced saccade endpoint errors, indicating the direct beneficial impact of foveating eye movements on performance in a fine-grain perceptual task, but not during coarse perceptual localization.
Topics: Humans; Eye Movements; Optic Flow; Saccades; Motion; Photic Stimulation; Fixation, Ocular; Visual Perception; Motion Perception
PubMed: 36566560
DOI: 10.1016/j.visres.2022.108164
Cortex; a Journal Devoted To the Study..., Jul 2021
Memory research has identified many strategies to enhance memory. However, natural foundations of enhanced memory are vastly underexplored. Interestingly, numerous studies show that synesthesia is associated with enhanced memory performance. Although it has been hypothesized for years that wider changes in visual perception are closely linked with enhanced memory functions in synesthesia, the hypothesis has never been directly put to the test. Here, we investigated whether visual perceptual abilities in synesthesia are linked with higher memory performance by comparing synesthetes who experience colors for letters with non-synesthetic color experts and non-synesthetic individuals from the more general population. Our results showed that synesthesia and expertise share a common profile of enhanced visual perceptual ability and memory in contrast to non-synesthetic individuals from the more general population. Overall, our findings suggest that visual perception and visual memory are more closely connected than previously thought.
Topics: Color Perception; Humans; Memory, Short-Term; Pattern Recognition, Visual; Perceptual Disorders; Photic Stimulation; Synesthesia; Visual Perception
PubMed: 33905967
DOI: 10.1016/j.cortex.2021.01.024
Sensors (Basel, Switzerland), Aug 2022
Review
Visual prostheses, used to help restore functional vision to the visually impaired, convert captured external images into corresponding electrical stimulation patterns delivered through implanted microelectrodes to induce phosphenes and, eventually, visual perception. Detecting and providing useful visual information to the prosthesis wearer under limited artificial vision has been an important concern in the field of visual prostheses. Along with the development of prosthetic device design and stimulus encoding methods, researchers have explored the application of computer vision by simulating visual perception under prosthetic vision. Effective image processing in computer vision is performed to optimize artificial visual information and improve the ability to restore various important visual functions in implant recipients, allowing them to better meet their daily demands. This paper first reviews the recent clinical implantation of different types of visual prostheses, summarizes the artificial visual perception of implant recipients, and especially focuses on its irregularities, such as dropout and distorted phosphenes. Then, the important aspects of computer vision in the optimization of visual information processing are reviewed, and the possibilities and shortcomings of these solutions are discussed. Ultimately, the development directions and key issues for improving the performance of visual prosthesis devices are summarized.
Topics: Image Processing, Computer-Assisted; Phosphenes; Vision, Ocular; Visual Perception; Visual Prosthesis
PubMed: 36081002
DOI: 10.3390/s22176544
Journal of Vision, Apr 2024
We obtain large amounts of external information through our eyes, a process often considered analogous to picture mapping onto a camera lens. However, our eyes are never as still as a camera lens, with saccades occurring between fixations and microsaccades occurring within a fixation. Although saccades are agreed to be functional for information sampling in visual perception, it remains unknown whether microsaccades have a similar function when eye movement is restricted. Here, we demonstrated that saccades and microsaccades share common spatiotemporal structures in viewing visual objects. Twenty-seven adults viewed faces and houses in free-viewing and fixation-controlled conditions. Both saccades and microsaccades showed distinctive spatiotemporal patterns between face and house viewing that could be discriminated by pattern classification. The classifications based on saccades and microsaccades could also be mutually generalized. Importantly, individuals who showed more distinctive saccadic patterns between faces and houses also showed more distinctive microsaccadic patterns. Moreover, saccades and microsaccades showed a higher structural similarity for face viewing than house viewing and a common orienting preference for the eye region over the mouth region. These findings suggest a common oculomotor program that is used to optimize information sampling during visual object perception.
Topics: Humans; Saccades; Male; Female; Adult; Fixation, Ocular; Young Adult; Visual Perception; Photic Stimulation; Pattern Recognition, Visual
PubMed: 38656530
DOI: 10.1167/jov.24.4.20