Although much traditional sensory research has studied each sensory modality in isolation, there has been a recent explosion of interest in causal interplay between different senses. Various techniques have now identified numerous multisensory convergence zones in the brain. Some convergence may arise surprisingly close to low-level sensory-specific cortex, and some direct connections may exist even between primary sensory cortices. A variety of multisensory phenomena have now been reported in which sensory-specific brain responses and perceptual judgments concerning one sense can be affected by relations with other senses. We survey recent progress in this multisensory field, foregrounding human studies against the background of invasive animal work and highlighting possible underlying mechanisms. These include rapid feedforward integration, possible thalamic influences, and/or feedback from multisensory regions to sensory-specific brain areas. Multisensory interplay is more prevalent than classic modular approaches assumed, and new methods are now available to determine the underlying circuits.
When the apparent visual location of a body part conflicts with its veridical location, vision can dominate proprioception and kinesthesia. In this article, we show that vision can capture tactile localization. Participants discriminated the location of vibrotactile stimuli (upper, at the index finger, vs. lower, at the thumb), while ignoring distractor lights that could independently be upper or lower. Such tactile discriminations were slowed when the distractor light was incongruent with the tactile target (e.g., an upper light during lower touch) rather than congruent, especially when the lights appeared near the stimulated hand. The hands were occluded under a table, with all distractor lights above the table. The effect of the distractor lights increased when rubber hands were placed on the table, "holding" the distractor lights, but only when the rubber hands were spatially aligned with the participant's own hands. In this aligned situation, participants were more likely to report the illusion of feeling touch at the rubber hands. Such visual capture of touch appears cognitively impenetrable.
Covert orienting in hearing was examined by presenting auditory spatial cues prior to an auditory target, requiring either a choice or detection response. Targets and cues appeared on the left or right of Ss' midline. Localization of the target in orthogonal directions (up vs. down or front vs. back, independent of target side) was faster when cue and target appeared on the same rather than opposite sides. This benefit was larger and more durable when the cue predicted target side. These effects cannot reflect criterion shifts, suggesting that covert orienting enhances auditory localization. Fine frequency discriminations also benefited from predictive spatial cues, although uninformative cues only affected spatial discriminations. No cuing effects were observed in a detection task. This research was supported by grants from the Medical Research Council (England).
In 5 experiments, it was found that judging the relative location of 2 contours was more difficult when they belonged to 2 objects rather than 1. This was observed even when the 1- and 2-object displays were physically identical, with perceptual set determining how many objects they were seen to contain. Such a 2-object cost is consistent with object-based views of attention and with a hierarchical scheme for position coding, whereby object parts are located relative to the position of their parent object. In further experiments, it was shown that, in accord with this hierarchical scheme, the relative location of objects could disrupt judgments of the relative location of object parts, but the reverse did not occur. This was found even when the relative position of the parts could be judged more quickly than that of the objects.
Perceptual suppression of distractors may depend on both endogenous and exogenous factors, such as attentional load of the current task and sensory competition among simultaneous stimuli, respectively. We used functional magnetic resonance imaging (fMRI) to compare these two types of attentional effects and examine how they may interact in the human brain. We varied the attentional load of a visual monitoring task performed on a rapid stream at central fixation without altering the central stimuli themselves, while measuring the impact on fMRI responses to task-irrelevant peripheral checkerboards presented either unilaterally or bilaterally. Activations in visual cortex for irrelevant peripheral stimulation decreased with increasing attentional load at fixation. This relative decrease was present even in V1, but became larger for successive visual areas through to V4. Decreases in activation for contralateral peripheral checkerboards due to higher central load were more pronounced within retinotopic cortex corresponding to 'inner' peripheral locations relatively near the central targets than for more eccentric 'outer' locations, demonstrating a predominant suppression of the nearby surround rather than strict 'tunnel vision' during higher task load at central fixation. Contralateral activations for peripheral stimulation in one hemifield were reduced by competition with concurrent stimulation in the other hemifield only in inferior parietal cortex, not in retinotopic areas of occipital visual cortex. In addition, central attentional load interacted with competition due to bilateral versus unilateral peripheral stimuli specifically in posterior parietal and fusiform regions. These results reveal that task-dependent attentional load and interhemifield stimulus competition can produce distinct influences on the neural responses to peripheral visual stimuli within the human visual system. These distinct mechanisms in selective visual processing may be integrated within posterior parietal areas, rather than earlier occipital cortex.
There has been a recent and dramatic growth of interest in the psychological and neural mechanisms of multisensory integration between different sensory modalities. Much of this recent research has focused specifically on how multisensory representations of body parts, and of the 'peripersonal' space immediately around them, are constructed. Research has also focused on how this may lead to multisensorially determined perceptions of body parts, to action execution, and even to attributions of agency and self-ownership for the body parts in question. Converging evidence from animal and human studies suggests that the primate brain constructs various body-part-centred representations of space, based on the integration of visual, tactile and proprioceptive information. These representations can change plastically following active tool-use that extends reachable space and also modifies the representation of peripersonal space. These new results indicate that a modern cognitive neuroscience approach to the classical concept of the 'body schema' may now be within reach.