Attending to auditory signals slows visual alternations in binocular rivalry.
Journal - Vision research
A previous study has shown that diverting attention from binocular rivalry to a visual distractor task results in a slowing of rivalry alternation rate between simple orthogonal orientations. Here, we investigate whether the slowing of visual perceptual alternations will occur when attention is diverted to an auditory distractor task, and we extend the investigation by testing this for two kinds of binocular rivalry stimuli and for the Necker cube. Our results show that doing the auditory attention task does indeed slow visual perceptual alternations, that the slowing effect is a graded function of attentional load, and that the attentional slowing effect is less pronounced for grating rivalry than for house/face rivalry and for the Necker cube. These results are explained in terms of supramodal attentional resources modulating a high-level interpretative process in perceptual ambiguity, together with a role for feedback to early visual processes in the case of binocular rivalry.
Multisensory processing in review: from physiology to behaviour.
Journal - Seeing and perceiving (Netherlands)
Research in multisensory processes has exploded over the last decade. Tremendous advances have been made in a variety of fields, from single-unit neural recordings and functional brain imaging through to behaviour, perception and cognition. These diverse approaches have highlighted how the senses work together to produce a coherent multimodal representation of the external world that enables us to function better by exploiting the redundancies and complementarities provided by multiple sensory modalities. With large numbers of new students and researchers being attracted to multisensory research, and given the multi-disciplinary nature of the work, our aim is to provide an overview of multisensory processing that brings all of these fields together in a single review. Our intention is to provide a comprehensive source for those interested in learning about multisensory processes, covering a variety of sensory combinations and methodologies, and tracing the path from single-unit neurophysiology through to perception and cognitive functions such as attention and speech.
Multisensory perceptual learning of temporal order: audiovisual learning transfers to vision but not audition.
Journal - PloS one (United States)
BACKGROUND: An outstanding question in sensory neuroscience is whether the perceived timing of events is mediated by a central supra-modal timing mechanism, or by multiple modality-specific systems. We use a perceptual learning paradigm to address this question. METHODOLOGY/PRINCIPAL FINDINGS: Three groups were trained daily for 10 sessions on an auditory, a visual or a combined audiovisual temporal order judgment (TOJ). Groups were pre-tested on a range of TOJ tasks within and between their group modality prior to learning, so that transfer of any learning from the trained task could be measured by post-testing the other tasks. Robust TOJ learning (reduced temporal order discrimination thresholds) occurred for all groups, although auditory learning (dichotic 500/2000 Hz tones) was slightly weaker than visual learning (lateralised grating patches). Crossmodal TOJs also displayed robust learning. Post-testing revealed that improvements in temporal resolution acquired during visual learning transferred within modality to other retinotopic locations and orientations, but not to auditory or crossmodal tasks. Auditory learning did not transfer to visual or crossmodal tasks, and neither did it transfer within audition to another frequency pair. In an interesting asymmetry, crossmodal learning transferred to all visual tasks but not to auditory tasks. Finally, in all conditions, learning to make TOJs for stimulus onsets did not transfer at all to discriminating temporal offsets. These data present a complex picture of timing processes. CONCLUSIONS/SIGNIFICANCE: The lack of transfer between unimodal groups indicates no central supramodal timing process for this task; however, the audiovisual-to-visual transfer cannot be explained without some form of sensory interaction. We propose that auditory learning occurred in frequency-tuned processes in the periphery, precluding interactions with more central visual and audiovisual timing processes.
Functionally the patterns of featural transfer suggest that perceptual learning of temporal order may be optimised to object-centered rather than viewer-centered constraints.
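The temporal order discrimination thresholds in the abstract above are typically estimated by fitting a psychometric function to the proportion of one response (e.g. "visual first") as a function of stimulus onset asynchrony (SOA). A minimal sketch of that standard analysis follows; the observer model, SOA values and trial counts are hypothetical, not taken from the study.

```python
# Illustrative sketch: estimating a temporal order discrimination
# threshold (JND) by fitting a cumulative Gaussian to simulated TOJ
# responses. All stimulus values and the observer model are hypothetical.
import math
import random

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def simulate_toj(soas, n_trials, true_mu=0.0, true_sigma=40.0, seed=1):
    """Proportion of 'visual first' responses at each SOA (ms)."""
    rng = random.Random(seed)
    return {soa: sum(rng.random() < cum_gauss(soa, true_mu, true_sigma)
                     for _ in range(n_trials)) / n_trials
            for soa in soas}

def fit_sigma(data, mu=0.0):
    """Grid-search the sigma that minimises squared error against the
    observed proportions; the JND is often taken as ~0.675 * sigma
    (half the 25-75% interval of the fitted function)."""
    best, best_err = None, float("inf")
    for sigma in [s / 2 for s in range(2, 400)]:
        err = sum((p - cum_gauss(soa, mu, sigma)) ** 2
                  for soa, p in data.items())
        if err < best_err:
            best, best_err = sigma, err
    return best

soas = [-120, -80, -40, -20, 0, 20, 40, 80, 120]
data = simulate_toj(soas, n_trials=200)
sigma_hat = fit_sigma(data)
print(f"estimated sigma: {sigma_hat:.1f} ms, JND ~ {0.675 * sigma_hat:.1f} ms")
```

Learning in this framework shows up as a decrease in the fitted sigma (a steeper psychometric function) across training sessions.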
Helping the visual system find its target: Comment on "Crossmodal influences on visual perception" by L. Shams & R. Kim.
Journal - Physics of life reviews
Strength and coherence of binocular rivalry depends on shared stimulus complexity.
Journal - Vision research (England)
Presenting incompatible images to the eyes results in alternations of conscious perception, a phenomenon known as binocular rivalry. We examined rivalry using either simple stimuli (oriented gratings) or coherent visual objects (faces, houses, etc.). Two rivalry characteristics were measured: depth of rivalry suppression and coherence of alternations. Rivalry between coherent visual objects exhibits deep suppression and coherent rivalry, whereas rivalry between gratings exhibits shallow suppression and piecemeal rivalry. Interestingly, rivalry between a simple and a complex stimulus displays the same characteristics (shallow and piecemeal) as rivalry between two simple stimuli. Thus, complex stimuli fail to rival globally unless the fellow stimulus is also global. We also conducted a face adaptation experiment. Adaptation to rivaling faces improved subsequent face discrimination (as expected), but adaptation to a rivaling face/grating pair did not. To explain this, we suggest rivalry must be an early and local process (at least initially), instigated by the failure of binocular fusion, which can then become globally organized by feedback from higher-level areas when both rivalry stimuli are global, so that rivalry tends to oscillate coherently. These globally assembled images then flow through object processing areas, with the dominant image gaining in relative strength in a form of 'biased competition', therefore accounting for the deeper suppression of global images. In contrast, when only one eye receives a global image, local piecemeal suppression from the fellow eye overrides the organizing effects of global feedback to prevent coherent image formation. This indicates the primacy of local over global processes in rivalry.
ISSN: 0042-6989
MeSH Headings: Adaptation, Psychological; Feedback, Psychological; Humans; Photic Stimulation; Psychophysics
MeSH Headings (relevant): Discrimination (Psychology); Face; Vision Disparity
Independent binocular rivalry processes for motion and form.
Journal - Neuron (United States )
During binocular rivalry, conflicting monocular images undergo alternating suppression. This study explores rivalry suppression by probing visual sensitivity during rivalry with various probe stimuli. When two faces engage in rivalry, sensitivity to face probes is reduced 4-fold during suppression. Rivaling global-motion stimuli were likewise deeply suppressed when probed with a global motion. However, in a surprising finding, sensitivity to face probes is completely unimpaired during global motion rivalry, and motion sensitivity is unimpaired during face rivalry. This suggests that rivalry suppression is localized to the neurons representing the image conflict, so that probes of a different kind suffer no suppression. Sensibly, this would leave visual processes not involved in rivalry free to function normally.
ISSN: 0896-6273
MeSH Headings: Face; Form Perception; Functional Laterality; Humans; Motion Perception; Neurons; Photic Stimulation; Recognition (Psychology); Vision, Binocular; Vision, Monocular; Visual Pathways
Compression of auditory space during rapid head turns.
Journal - PNAS
Studies of spatial perception during visual saccades have demonstrated compressions of visual space around the saccade target. Here we psychophysically investigated perception of auditory space during rapid head turns, focusing on the "perisaccadic" interval. Using separate perceptual and behavioral response measures, we show that spatial compression also occurs for rapid head movements, with the auditory spatial representation compressing by up to 50%. Similar to observations in the visual system, this occurred only when spatial locations were measured by using a perceptual response; it was absent for the behavioral measure involving a nose-pointing task. These findings parallel those observed in vision during saccades and suggest that a common neural mechanism may subserve these distortions of space in each modality.
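Compression of this kind is commonly quantified as the shrinkage of reported locations toward a reference point, i.e. one minus the slope of reported versus actual position. A minimal sketch with hypothetical numbers (not the study's data) shows how a 50% compression would be computed:

```python
# Illustrative sketch (hypothetical data): quantifying spatial
# compression as 1 - slope of reported vs. actual azimuth.
# A slope of 0.5 corresponds to the ~50% compression described above.
def compression(actual, reported):
    """Least-squares slope of reported vs. actual azimuth (deg);
    returns 1 - slope (0 = veridical, 1 = total collapse)."""
    n = len(actual)
    mx = sum(actual) / n
    my = sum(reported) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(actual, reported))
    sxx = sum((x - mx) ** 2 for x in actual)
    return 1.0 - sxy / sxx

# Hypothetical perisaccadic reports: targets at +/-40 and +/-20 deg
# reported halfway toward the midline (50% compression).
actual = [-40, -20, 20, 40]
reported = [-20, -10, 10, 20]
print(f"compression: {compression(actual, reported):.2f}")  # 0.50
```

With veridical reports the slope is 1 and the index is 0; the perceptual-response condition in the abstract corresponds to an index of up to ~0.5.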
This article is a PNAS Direct Submission. M.G. is a guest editor invited by the Editorial Board.
Keywords: action and perception; auditory localization; head motion; saccades; spatial perception