Experimental Brain Research, Oct 2022
Various studies have demonstrated a role for cognition in self-motion perception. Those studies all concerned modulations of the perception of a physical or visual motion stimulus. In our study, however, we investigated whether cognitive cues could elicit a percept of oscillatory self-motion in the absence of sensory motion. If so, we could use this percept to investigate whether the resulting mismatch between estimated self-motion and the lack of corresponding sensory signals induces motion sickness. To that end, we seated blindfolded participants on a swing that remained motionless during two conditions, apart from a deliberate perturbation at the start of each condition. The conditions differed only in their instructions, a secondary task and a demonstration, which suggested either a quick halt ("Distraction") or continuing oscillations of the swing ("Focus"). Participants reported that the swing oscillated with larger peak-to-peak displacements and for a longer period of time in the Focus condition. That increase was not reflected in the reported motion sickness scores, which did not differ between the two conditions. As the reported motion was rather small, the lack of an effect on the motion sickness response can be explained by assuming a subthreshold neural conflict. Our results support the existence of internal models relevant to sensorimotor processing and the potential of cognitive (behavioral) therapies to alleviate undesirable perceptual issues to some extent. We conclude that oscillatory self-motion can be perceived in the absence of related sensory stimulation, which argues for the acknowledgement of cognitive cues in studies on self-motion perception.
Topics: Cues; Humans; Motion; Motion Perception; Motion Sickness; Self Concept; Visual Perception
PubMed: 35986767
DOI: 10.1007/s00221-022-06442-3
Journal of Neurophysiology, Nov 2022
Self-motion through an environment induces various sensory signals, i.e., visual, vestibular, auditory, or tactile. Numerous studies have investigated the role of visual and vestibular stimulation for the perception of self-motion direction (heading). Here, we investigated the rarely considered interaction of visual and tactile stimuli in heading perception. Participants were presented optic flow simulating forward self-motion across a horizontal ground plane (visual), airflow toward the participants' forehead (tactile), or both. In separate blocks of trials, participants indicated perceived heading from unimodal visual or tactile or bimodal sensory signals. In bimodal trials, presented headings were either spatially congruent or incongruent with a maximum offset between visual and tactile heading of 30°. To investigate the reference frame in which visuo-tactile heading is encoded, we varied head and eye orientation during presentation of the stimuli. Visual and tactile stimuli were designed to achieve comparable precision of heading reports between modalities. Nevertheless, in bimodal trials heading perception was dominated by the visual stimulus. A change of head orientation had no significant effect on perceived heading, whereas, surprisingly, a change in eye orientation affected tactile heading perception. Overall, we conclude that tactile flow is more important to heading perception than previously thought. We investigated heading perception from visual-only (optic flow), tactile-only (tactile flow), or bimodal self-motion stimuli in different conditions varying in head and eye position. Overall, heading perception was body or world centered and non-Bayes optimal and revealed a centripetal bias. Although being visually dominated, tactile flow revealed a significant influence during bimodal heading perception.
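The statistically optimal (reliability-weighted) benchmark against which such bimodal heading reports are typically compared can be sketched as follows; the function name and example values here are illustrative, not taken from the study:

```python
import math

def combine_cues(mu_v, sigma_v, mu_t, sigma_t):
    """Reliability-weighted (Bayes-optimal) fusion of two heading
    estimates: each cue is weighted by its inverse variance."""
    w_v, w_t = 1.0 / sigma_v**2, 1.0 / sigma_t**2
    mu = (w_v * mu_v + w_t * mu_t) / (w_v + w_t)
    # The fused estimate is never noisier than either single cue.
    sigma = math.sqrt(1.0 / (w_v + w_t))
    return mu, sigma

# Equally reliable cues with a 30-degree visuo-tactile conflict:
# the optimal estimate falls midway between them, whereas the study
# reports visually dominated (non-optimal) behaviour.
mu, sigma = combine_cues(0.0, 5.0, 30.0, 5.0)  # mu = 15.0
```

Deviations from this midway prediction, such as the visual dominance reported above, are what mark the integration as non-Bayes-optimal.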
Topics: Humans; Motion Perception; Optic Flow; Vestibule, Labyrinth; Touch Perception; Touch; Photic Stimulation; Visual Perception
PubMed: 36259667
DOI: 10.1152/jn.00231.2022
Annual Review of Neuroscience, Jul 2023 (Review)
How neurons detect the direction of motion is a prime example of neural computation: Motion vision is found in the visual systems of virtually all sighted animals, it is important for survival, and it requires interesting computations with well-defined linear and nonlinear processing steps, yet the whole process is of moderate complexity. The genetic methods available in the fruit fly and the charting of a connectome of its visual system have led to rapid progress and unprecedented detail in our understanding of how neurons compute the direction of motion in this organism. The picture that emerged incorporates not only the identity, morphology, and synaptic connectivity of each neuron involved but also its neurotransmitters, its receptors, and their subcellular localization. Together with the neurons' membrane potential responses to visual stimulation, this information provides the basis for a biophysically realistic model of the circuit that computes the direction of visual motion.
Topics: Animals; Motion Perception; Visual Pathways; Drosophila; Vision, Ocular; Neurons; Photic Stimulation
PubMed: 37428604
DOI: 10.1146/annurev-neuro-080422-111929
Philosophical Transactions of the Royal..., Sep 2023 (Review)
To navigate and guide adaptive behaviour in a dynamic environment, animals must accurately estimate their own motion relative to the external world. This is a fundamentally multisensory process involving integration of visual, vestibular and kinesthetic inputs. Ideal observer models, paired with careful neurophysiological investigation, helped to reveal how visual and vestibular signals are combined to support perception of linear self-motion direction, or heading. Recent work has extended these findings by emphasizing the dimension of time, both with regard to stimulus dynamics and the trade-off between speed and accuracy. Both time and certainty, i.e. the degree of confidence in a multisensory decision, are essential to the ecological goals of the system: terminating a decision process is necessary for timely action, and predicting one's accuracy is critical for making multiple decisions in a sequence, as in navigation. Here, we summarize a leading model for multisensory decision-making, then show how the model can be extended to study confidence in heading discrimination. Lastly, we preview ongoing efforts to bridge self-motion perception and navigation, including closed-loop virtual reality and active self-motion. The design of unconstrained, ethologically inspired tasks, accompanied by large-scale neural recordings, holds promise for a deeper understanding of spatial perception and decision-making in the behaving animal. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
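The idea that terminating a decision at a bound yields both a choice and a decision time can be sketched as a minimal bounded evidence-accumulation simulation; all parameter values are illustrative, and this is not the authors' full multisensory model:

```python
import math
import random

def diffusion_trial(drift, bound=1.0, dt=0.001, noise=1.0, rng=random):
    """Accumulate noisy momentary evidence until it reaches +bound or
    -bound. Hitting a bound terminates the decision, jointly producing a
    choice and a decision time; confidence models additionally read out
    the accumulator state."""
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    return (x > 0), t  # (chose the drift-consistent direction, seconds)

rng = random.Random(1)
trials = [diffusion_trial(2.0, rng=rng) for _ in range(200)]
accuracy = sum(choice for choice, _ in trials) / len(trials)
```

Raising the bound trades speed for accuracy: decisions take longer but fewer trials end at the wrong boundary.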
Topics: Animals; Motion Perception; Space Perception; Vestibule, Labyrinth; Movement; Adaptation, Psychological; Visual Perception; Photic Stimulation
PubMed: 37545301
DOI: 10.1098/rstb.2022.0333
Frontiers in Neural Circuits, 2018 (Review)
The ability of animals to detect motion is critical for survival, and errors or even delays in motion perception may prove costly. In the natural world, moving objects in the visual field often produce concurrent sounds. Thus, it can be highly advantageous to detect motion elicited from sensory signals of either modality, and to integrate them to produce more reliable motion perception. A great deal of progress has been made in understanding how visual motion perception is governed by the activity of single neurons in the primate cerebral cortex, but far less progress has been made in understanding both auditory motion and audiovisual motion integration. Here, we review the key cortical regions for motion processing, focussing on translational motion. We compare the representations of space and motion in the visual and auditory systems, and examine how single neurons in these two sensory systems encode the direction of motion. We also discuss the way in which humans integrate audio and visual motion cues, and the regions of the cortex that may mediate this process.
Topics: Acoustic Stimulation; Animals; Auditory Perception; Cerebral Cortex; Humans; Motion Perception; Photic Stimulation; Primates
PubMed: 30416431
DOI: 10.3389/fncir.2018.00093
Experimental Psychology, Mar 2022
Representational Momentum and Representational Gravity describe systematic perceptual biases occurring in the localization of the final location of a moving stimulus. While Representational Momentum describes a systematic overestimation along the motion trajectory (forward shift), Representational Gravity refers to a systematic localization bias in line with gravitational force (downward shift). These phenomena are typically investigated in a laboratory setting, and while previous research has shown that online studies perform well for various tasks, motion perception outside the laboratory has received little attention to date. Therefore, one experiment was conducted in two different settings: a typical, highly controlled laboratory setting and an online setting of the participants' choosing. In both settings, the two most common trial types, implied motion stimuli and continuously moving stimuli, were used, and the influence of classical velocity manipulations (varying stimulus timing and distance) was assessed. The data pattern across both settings was very similar, indicating that both phenomena are robust and that motion perception can very well be studied outside the classical laboratory setting, opening a feasible way to diversify access to motion perception experiments.
Topics: Bias; Gravitation; Humans; Motion Perception
PubMed: 35726674
DOI: 10.1027/1618-3169/a000545
Journal of Physiological Anthropology, Apr 2018 (Review)
BACKGROUND
Falls are the leading cause of accidental injury and death among older adults. One in three adults over the age of 65 years falls annually. As the size of the elderly population increases, falls become a major concern for public health, and there is a pressing need to understand the causes of falls thoroughly. While it is well documented that visual functions such as visual acuity, contrast sensitivity, and stereo acuity are correlated with fall risk, little attention has been paid to the relationship between falls and the ability of the visual system to perceive motion in the environment. The omission of visual motion perception in the literature is a critical gap because it is an essential function in maintaining balance. In the present article, we first review existing studies regarding visual risk factors for falls and the effect of ageing vision on falls. We then present a group of phenomena, such as vection and sensory reweighting, that provide information on how visual motion signals are used to maintain balance.
CONCLUSION
We suggest that the current list of visual risk factors for falls should be elaborated by taking into account the relationship between visual motion perception and balance control.
Topics: Accidental Falls; Aged; Aged, 80 and over; Aging; Humans; Motion Perception; Postural Balance; Risk; Visual Perception
PubMed: 29685171
DOI: 10.1186/s40101-018-0170-1
Journal of Experimental Psychology..., Jan 2022
Perceiving the motion of an object is thought to involve two stages: Local motion energy is measured at each point in space, and these signals are then pooled across space to build coherent global motion. There are several theories of how local-to-global pooling occurs, but they all predict that global motion perception is a continuous process, such that increasing the strength of motion energy should gradually increase the precision of perceived motion directions. We test this prediction against the alternative that global motion perception is discrete: Motion is either perceived with high precision or fails to be perceived altogether. Data from human observers provide clear evidence that, whereas pooling local motion energy is continuous, the segmentation of local signals into coherent global motion patterns is a discrete process. This result adds motion perception to the growing list of processes that exhibit evidence of all-or-none visual awareness.
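The discrete (all-or-none) account described above is commonly formalized as a mixture model: with some probability the global direction is encoded with high precision, otherwise the observer guesses at random. A minimal sketch of that generative model, with illustrative names and parameter values:

```python
import random

def report_direction(true_dir, p_perceive=0.7, sd=5.0, rng=random):
    """All-or-none generative model: the global motion direction is
    either perceived precisely (Gaussian error with the given sd, in
    degrees) or not at all, in which case the report is a uniform
    random guess over the circle."""
    if rng.random() < p_perceive:
        return (true_dir + rng.gauss(0.0, sd)) % 360.0
    return rng.uniform(0.0, 360.0)

def angular_error(a, b):
    """Smallest absolute angular difference between two directions."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

rng = random.Random(0)
reports = [report_direction(90.0, rng=rng) for _ in range(2000)]
near = sum(angular_error(r, 90.0) <= 15.0 for r in reports) / len(reports)
```

Under this model the fraction of near-correct reports tracks `p_perceive` (plus a small guessing contribution) while response precision stays fixed, whereas a continuous account predicts precision itself varying smoothly with motion-energy strength.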
Topics: Humans; Motion Perception; Photic Stimulation
PubMed: 35073143
DOI: 10.1037/xhp0000971
Annual Review of Neuroscience, Jul 2017 (Review)
Images projected onto the retina of an animal eye are rarely still. Instead, they usually contain motion signals originating either from moving objects or from retinal slip caused by self-motion. Accordingly, motion signals tell the animal in which direction a predator, prey, or the animal itself is moving. At the neural level, visual motion detection has been proposed to extract directional information by a delay-and-compare mechanism, representing a classic example of neural computation. Neurons responding selectively to motion in one but not in the other direction have been identified in many systems, most prominently in the mammalian retina and the fly optic lobe. Technological advances have now allowed researchers to characterize these neurons' upstream circuits in exquisite detail. Focusing on these upstream circuits, we review and compare recent progress in understanding the mechanisms that generate direction selectivity in the early visual system of mammals and flies.
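The delay-and-compare computation mentioned above is classically captured by the Hassenstein–Reichardt correlator; a minimal discrete-time sketch (the signal values are illustrative):

```python
def reichardt_response(left, right, delay=1):
    """Delay-and-compare: correlate each input's delayed copy with its
    neighbour's current signal, then subtract the mirror-symmetric term.
    The net output is signed by motion direction (positive here for
    left-to-right motion)."""
    return sum(left[t - delay] * right[t] - right[t - delay] * left[t]
               for t in range(delay, len(left)))

# A bright bar sweeping left-to-right reaches the left input one
# time step before the right input.
left  = [0, 1, 1, 0, 0]
right = [0, 0, 1, 1, 0]
rightward = reichardt_response(left, right)  # positive
leftward  = reichardt_response(right, left)  # negative (reversed motion)
```

The subtraction of the two mirror-symmetric half-detectors is what makes the output direction-selective: a stationary or counterphase stimulus drives both halves equally and cancels.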
Topics: Animals; Humans; Motion; Motion Perception; Neurons; Retina; Visual Pathways
PubMed: 28418757
DOI: 10.1146/annurev-neuro-072116-031335
Nature Reviews. Neuroscience, Jun 2019 (Review)
How the brain computes accurate estimates of our self-motion relative to the world and our orientation relative to gravity in order to ensure accurate perception and motor control is a fundamental neuroscientific question. Recent experiments have revealed that the vestibular system encodes this information during everyday activities using pathway-specific neural representations. Furthermore, new findings have established that vestibular signals are selectively combined with extravestibular information at the earliest stages of central vestibular processing in a manner that depends on the current behavioural goal. These findings have important implications for our understanding of the brain mechanisms that ensure accurate perception and behaviour during everyday activities and for our understanding of disorders of vestibular processing.
Topics: Animals; Humans; Motion Perception; Movement; Neural Pathways; Space Perception; Vestibule, Labyrinth
PubMed: 30914780
DOI: 10.1038/s41583-019-0153-1