
Cross-Modal Interactions Between Vision and Touch


K Sathian and S Lacey, Emory University, Atlanta, GA, USA
© 2009 Elsevier Ltd. All rights reserved.

Introduction
Although this might seem heretical, it is now generally accepted that many extrastriate visual cortical areas are active during tactile perception, in both the sighted and the blind. Here we review the literature on such cross-modal recruitment of visual cortex during tactile perception. We consider the attempts that have been made to address a variety of issues that remain largely unresolved, including why such cross-modal recruitment of visual cortex occurs, whether the underlying reasons differ in the sighted and the blind, the nature of the underlying perceptual representations, and how they relate to performance differences between the blind and sighted.

Involvement of Visual Cortical Areas in Tactile Perception in the Sighted


Extrastriate visual cortical areas were first shown to be active during tactile perception in a positron emission tomography (PET) study conducted in our laboratory on normally sighted human volunteers. In this study, the experimental task was tactile discrimination of the orientation of gratings presented to the immobilized fingerpad. When this task was contrasted with a control task requiring tactile discrimination of grating groove width, activation was found at a focus in extrastriate visual cortex, close to the parieto-occipital fissure. This focus is in the vicinity of the human V6 complex of visual areas and had previously been found to be active during visual discrimination of grating orientation.

The possibility that this cross-modal parieto-occipital activation might be merely epiphenomenal was ruled out in a subsequent study from our laboratory, using transcranial magnetic stimulation (TMS) to test whether disrupting the function of this cortical region was detrimental to performance. TMS applied directly over the parieto-occipital focus, and at sites close to it, significantly impaired discrimination of grating orientation when the magnetic pulses were appropriately timed with respect to the stimuli. However, TMS at or near the parieto-occipital focus did not affect discrimination of grating groove width, and when applied at more distant sites did not affect performance on either task. This pattern of results permitted the inference that the
activation found in the earlier PET study was functionally meaningful. Analogous findings were reported in a study using repetitive TMS (rTMS) at 1 Hz to decrease local cortical excitability. This study took advantage of a dissociation between subjective magnitude estimates of perceived interdot distance, which increase with physical interdot distance (up to 8 mm), and those of perceived roughness, which peak at intermediate values (around 3 mm). The main finding was that rTMS over the primary somatosensory cortex (S1) affected judgments of roughness but not interdot distance, whereas rTMS over the medial occipital cortex affected distance but not roughness judgments. Further, a congenitally blind patient who suffered a stroke that damaged the occipital cortex bilaterally was impaired on judgments of interdot spacing but not roughness.

These observations, together with the studies of the grating orientation task reviewed in the previous paragraph, indicate that visual cortical involvement occurs in macrospatial (large-scale) tactile tasks but not in microspatial tasks in which the parameters of interest vary in the range of 3 mm. This conclusion is consistent with psychophysical studies suggesting that macrospatial tactile tasks are preferentially associated with visual processing and that vision seems to be better than touch for perceiving macrospatial features, the reverse being true for microspatial features.

Another example of visual cortical recruitment during tactile perception is offered by work showing that the MT complex, an area located in the dorsal (visuospatial) pathway and specifically involved in perception of visual motion, is also active during presentation of tactile motion stimuli, even in the absence of any task. Psychophysical studies suggest that both vision and touch may engage a common representation of motion; tactile motion perception can be used to disambiguate the direction of motion in an ambiguous visual motion display.
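
The dissociation exploited by the rTMS study described above can be sketched numerically. The toy model below is purely illustrative: the functional forms, parameters, and the `perceived_distance`/`perceived_roughness` helpers are assumptions chosen only to reproduce the qualitative pattern reported (distance estimates grow monotonically with spacing up to about 8 mm, while roughness estimates peak near 3 mm); they are not fits to the study's data.

```python
import math

# Toy psychophysical functions (illustrative assumptions only):
# perceived interdot distance rises monotonically with physical spacing,
# saturating around 8 mm; perceived roughness follows an inverted U
# peaking near 3 mm, matching the qualitative dissociation in the text.

def perceived_distance(spacing_mm):
    """Monotonic magnitude estimate of interdot distance (arbitrary units)."""
    return min(spacing_mm, 8.0)

def perceived_roughness(spacing_mm):
    """Inverted-U magnitude estimate of roughness, maximal near 3 mm."""
    peak_mm, width_mm = 3.0, 2.0  # assumed parameters
    return math.exp(-((spacing_mm - peak_mm) / width_mm) ** 2)

spacings = [1.0, 2.0, 3.0, 4.0, 6.0, 8.0]
distances = [perceived_distance(s) for s in spacings]
roughness = [perceived_roughness(s) for s in spacings]

# Distance estimates increase across the whole range tested...
assert all(a <= b for a, b in zip(distances, distances[1:]))
# ...whereas roughness is greatest at the intermediate 3 mm spacing.
assert max(roughness) == perceived_roughness(3.0)
```

Because the two judgments vary independently across the same stimulus dimension, selectively disrupting one but not the other (rTMS over S1 versus medial occipital cortex) is interpretable as a double dissociation.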
Within the ventral visual pathway, which is specialized for form processing, the lateral occipital complex (LOC) is a region that is object-selective for visual stimuli. The LOC is also recruited cross-modally during haptic object recognition, as revealed by numerous functional magnetic resonance imaging (fMRI) studies. Further, activation has been reported in the LOC during tactile perception of two-dimensional forms presented to the immobilized right index fingerpad, and studies of patients with lesions involving the LOC suggest that it is necessary for haptic as well as visual shape perception. Multisensory object-selective activity in the LOC is stronger for graspable visual objects than for other visual stimuli and does not appear to
extend to the auditory modality. These studies, along with cross-modal priming effects observed in psychophysical and fMRI studies, cross-modal interference effects in memory tasks, and category-specific representations that overlap between vision and touch, imply that both visual and haptic shape perception may engage a unitary neural representation, as in the case of motion perception.

Visual Imagery and Multisensory Representations


The research reviewed so far has established that visual cortical processing is routinely involved in normal tactile perception in the sighted, especially during macrospatial tasks. Such processing is quite task-specific, so the extrastriate visual cortical areas known to be specialized for particular visual tasks are involved during performance of the same tasks tactually. An obvious question is whether visual imagery underlies this cross-modal recruitment of visual cortex into nonvisual tasks. Subjects in our laboratory consistently report mental visualization of tactile stimuli during macrospatial tasks such as discrimination of grating orientation or tactile form, which are associated with visual cortical recruitment, but not during microspatial tasks such as discrimination of grating groove width or gap detection, which do not tend to involve visual cortical activity. Visual imagery might be triggered by lack of familiarity with the tactile stimuli or tasks used; indeed, such cross-modal translation may be a general phenomenon, especially during complex information processing. Left-lateralized LOC activity has been found during mental imagery of shape generated from previous visual encounters in the sighted or from haptic encounters in the blind. An fMRI study from our laboratory found that inter-individual variations in the magnitude of haptic shape-selective activity in the right LOC (ipsilateral to the hand used for haptic perception) were strongly predicted by ratings of the vividness of visual imagery. In contrast, activation strengths in the left LOC were uncorrelated with these visual imagery scores, pointing to a role for factors other than visual imagery in cross-modal visual cortical recruitment. Despite the differences between these studies in the lateralization of imagery effects, they suggest that a role for visual imagery in cross-modal visual cortical recruitment cannot be ruled out. 
However, some have argued that LOC activity during haptic perception is not due to visual imagery. Vision and touch can each encode object shape and motion, and visual cortical activation during tactile perception might reflect a multisensory representation rather than a specifically visual image, especially
given the evidence reviewed earlier for common representations of motion and shape across vision and touch. That similar illusions can be experienced with both visual and tactile inputs supports this idea. In the Müller-Lyer illusion, two lines of identical length, end-capped with fins that point either outward or inward, are seen as shorter or longer, respectively, than the true length. This visual illusion has also been demonstrated in a haptic format and appears to be independent of visual experience, because blindfolded, normally sighted individuals, those with low vision, late blind subjects, and congenitally blind individuals are all equally susceptible to the illusion. In the same vein, length distortion is an illusion that was first described haptically and subsequently found to occur in vision too. Length distortion refers to the overestimation of the distance between two points as a function of the increasing length of an indirect pathway between them. The same effect can be observed, albeit at a smaller magnitude, in the visual version of the task.

These studies suggest a common visuo-haptic representation of space that may be based on body-centered reference points (e.g., the body midline or the positions of the hands relative to one another), because instructions to use these in judging Müller-Lyer line length virtually eliminated the illusion in both modalities, whereas external reference points (a frame surrounding the illusion figure) had no effect. The use of such body-centered reference points may be a common mechanism for integrating spatial information from vision and touch.

The evidence for multisensory representations suggests the existence of direct, bottom-up somatosensory projections into the visual cortical areas that are implicated in tactile perception, as opposed to the top-down projections from the prefrontal cortex into visual cortical areas that would be required to support a process such as visual imagery.
These competing possibilities can be tested empirically by studying connectivity in experimental animals, by electrophysiological studies in animals or humans to pinpoint the timing of activity in different regions, or by analyzing fMRI data to reveal effective connectivity. A study from our laboratory using structural equation modeling of effective connectivity, based on the correlation matrix between the time courses of fMRI activity in various regions, revealed both bottom-up and top-down paths in a network of haptically shape-selective areas, including foci in the postcentral sulcus (corresponding to Brodmann's area 2 of S1), the intraparietal sulcus (IPS), and the LOC. This suggests that potential neural substrates exist for both visual imagery and multisensory representations, in relation to visual cortical involvement during haptic shape perception.
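
As a deliberately simplified illustration of the starting point of such an analysis, the sketch below generates synthetic time courses for three regions named after the foci in the study (S1 area 2, IPS, LOC) and computes the inter-regional correlation matrix on which structural equation modeling would then fit directed path coefficients. The coupling weights and data are invented; this is not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints = 200

# Synthetic time courses for three haptically shape-selective regions,
# with a toy bottom-up chain: S1 drives IPS, which drives LOC.
s1 = rng.standard_normal(n_timepoints)
ips = 0.7 * s1 + 0.3 * rng.standard_normal(n_timepoints)
loc = 0.7 * ips + 0.3 * rng.standard_normal(n_timepoints)

timecourses = np.vstack([s1, ips, loc])  # regions x time
corr = np.corrcoef(timecourses)          # 3 x 3 inter-regional correlations

for name, row in zip(["S1 (area 2)", "IPS", "LOC"], corr):
    print(name, np.round(row, 2))
```

A correlation matrix alone cannot distinguish bottom-up from top-down influences; that is precisely why the study fitted a directed path model (structural equation modeling) to these correlations rather than interpreting them directly.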

Neurophysiological and neuroanatomic studies in monkeys have also illuminated our understanding of the mechanisms involved in cross-modal visual cortical recruitment. Some neurons in area V4 (a nonprimary area in the ventral visual pathway) were found to be selective for the orientation of a tactile grating when it served as a cue for matching to a subsequently presented visual stimulus but not when it was task-irrelevant. The requirement for task-relevance implicates top-down, rather than bottom-up, inputs. Multisensory inputs have been reported in early sensory cortical areas that are traditionally considered unisensory, including the primary visual cortex (V1) and auditory association cortex. Analysis of the laminar profile of these projections suggests that both top-down and bottom-up inputs are present.

Thus, the weight of evidence indicates the existence of multisensory representations that are flexibly accessible via both vision and touch, potentially involving interactions between bottom-up sensory inputs and top-down processes such as visual imagery. However, empirical studies addressing the format of such representations are in their infancy, and currently we cannot rule out, as an alternative to unified multisensory representations, the possibility that independently derived spatial representations for vision and touch interact at later stages of processing (e.g., in cross-modal comparison).

If vision and touch produce separate representations, how do these affect one another? The classic work of Rock and Victor suggested that vision dominates touch; when the visual width of an object was distorted by a lens, individuals who both saw and felt the object were unaware of the cross-modal conflict and reported the distorted width perceived visually rather than the true width perceived tactually. Thus, this sensory conflict was resolved in favor of the visual percept.
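
In the broader cue-combination literature (not in the work discussed here), visual dominance of this kind is often modeled as reliability-weighted averaging: each modality's estimate is weighted by its inverse variance, so the less noisy cue dominates the combined percept. A minimal sketch, with invented numbers for a Rock-and-Victor-style width conflict:

```python
# Standard reliability-weighted (maximum-likelihood) cue combination.
# All numbers below are hypothetical, chosen only to illustrate why a
# more reliable visual estimate would dominate a conflicting touch estimate.

def combine(est_vision, var_vision, est_touch, var_touch):
    """Reliability-weighted average of two independent cues."""
    w_vision = (1 / var_vision) / (1 / var_vision + 1 / var_touch)
    w_touch = 1 - w_vision
    combined = w_vision * est_vision + w_touch * est_touch
    combined_var = 1 / (1 / var_vision + 1 / var_touch)
    return combined, combined_var

# A lens makes the object look 30 mm wide while it feels 50 mm wide.
# With vision far more reliable (variance 1 vs. 9), the fused estimate
# sits close to the visual report.
width, var = combine(est_vision=30.0, var_vision=1.0,
                     est_touch=50.0, var_touch=9.0)
assert abs(width - 32.0) < 1e-9  # 0.9 * 30 + 0.1 * 50
assert var < 1.0                 # fused estimate beats either cue alone
```

On this account, dominance is not fixed: when touch is the more reliable cue, as the Klatzky and Lederman work suggests for properties such as hardness and texture, the weighting, and hence the dominance, reverses.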
More recent research by Klatzky, Lederman, and their colleagues showed the situation to be more complex and to depend on the specific object property. For example, under neutral instructions the haptic salience of object properties is hardness > texture > shape; under haptically biased instructions this salience changes to texture > shape > hardness. But in combined visual and haptic perception, as well as in haptic perception with concurrent visual imagery, this salience is reversed, changing to shape > texture > hardness/size. Visual imagery can therefore influence tactile perception in the absence of visual perception, and moreover, different modalities may be optimal for different object properties. Thus, there is not a simple dominance of visual over tactile perception.

Other approaches have also identified interactions between vision and somatosensory representations of the body. One line of work has demonstrated that noninformative vision of the body can improve tactile spatial acuity as measured using grating orientation discrimination, and that this cross-modal enhancement is abolished by TMS over the primary somatosensory cortex. Another line of research pertains to the rubber hand illusion: when a rubber hand is positioned in alignment with the subject's own concealed hand, if the subject sees the rubber hand being touched, he or she experiences illusory touch, indicating a feeling of ownership of the synthetic hand. Such a feeling of ownership is associated with activity in the ventral premotor cortex, a region where, in monkeys, neurons exhibit visual receptive fields that are anchored to the arm and move with it.

Tactile Perception in the Blind and Its Involvement of Visual Cortex


Despite the common belief that blindness is associated with superior nonvisual perception, research on this question has not yielded uniform results. Some studies of haptic perception of three-dimensional shape have observed better performance in the blind, others have shown equal performance in sighted and blind individuals, and still others better performance in the sighted. It has been argued that superior performance of the blind on haptic tasks may stem not from heightened sensitivity but, rather, from using more efficient sensorimotor strategies or from practice in attending to cues that the sighted typically ignore.

However, blind Braille readers outperform the sighted on many tests involving tactile perception of two-dimensional patterns with the fingerpad. These include tactile recognition of Braille-like dot patterns, judging the orientation of bars or detecting gaps in them, discriminating grating orientation, and determining whether the central dot in a linear three-dot array is offset laterally by a small (<1 mm) distance. In contrast, blind individuals do not differ significantly from the sighted in their ability to discriminate bar length, laboratory textures such as gratings, or real-world textures such as sandpapers or abrasive stones.

There is some evidence that the superior tactile spatial performance of the blind (when it does occur) is due to practice. Thus, practiced sighted subjects do as well as the blind on the offset task and at discriminating patterns generated with the Optacon (a vibrotactile reading aid for the blind). They can also match the deaf-blind in decoding speech by the Tadoma method (feeling the speaker's face and neck). The role of specific practice with reading Braille is unclear; one study reported that the grating orientation discrimination threshold was lowest on the Braille-reading finger, whereas another study found no significant difference on this test between blind subjects who read Braille and
those who did not, and that offset detection was no better on the hand used for reading Braille than on the other hand.

Visual deprivation studies have investigated whether superior performance in the blind may be due to cortical reorganization. Following neonatal visual deprivation, rodents navigate a maze for a food reward faster than normal and demonstrate altered somatosensory cortical representations of the whiskers as well as novel somatosensory responsiveness in the anterior occipital cortex. In monkeys whose eyelids were sutured shut for the first year of life and opened later, the dorsal extrastriate visual cortex had lower visual but greater somatosensory responsiveness than in controls, although this procedure did not affect haptic discrimination ability. The human somatosensory cortex also reorganizes after visual deprivation, probably as a consequence of increased hand use. Expansion of the representation of the Braille-reading finger has been reported in a study using evoked potentials and TMS. Others have found disordered cortical somatotopy, as revealed by magnetoencephalography (MEG), and mislocalizations of touch on the reading fingers in subjects who read Braille using multiple fingers.

In an early report of cross-modal plasticity in blind humans, PET scanning was used to show that visual cortical areas in the occipital lobe were more active metabolically in the early blind than in the late blind or sighted. This was interpreted as an index of greater synaptic activity in the early blind, perhaps reflecting incomplete developmental pruning of synapses. Subsequently, many PET and fMRI studies found that the medial occipital cortical regions of blind subjects were recruited during tactile reading of Braille or embossed Roman letters. The medial occipital cortical activations were specific to early blind subjects, in contrast to late blind and sighted subjects, who deactivated these regions.
Complementary rTMS studies demonstrated functional involvement of the medial occipital cortex in tactile identification of Braille or Roman characters by blind subjects, but not in sighted subjects reading Roman letters. As in the activation studies, the rTMS effects over the medial occipital cortex were found in the early blind but not the late blind, implicating cross-modal plasticity during a critical period of visual development. The functional role of the visual cortex in Braille reading is supported by the report of an early blind person who became unable to read Braille after a bilateral infarct of the occipital cortex.

Was the visual cortical activation observed during Braille reading in the blind due to sensory processes or to higher-order cognitive processes such as attention and language? Most of the studies cited in the preceding paragraph could not resolve this because the tactile reading tasks were compared to low-level control conditions.

It turns out that visual cortical activity occurs in the blind during a variety of language tasks, including generating verbs in response to tactile or auditory input, and verbal recall in the absence of any sensory input. Such activity is more extensive in the early blind than in the late blind. Its functional relevance is shown by its greater strength during semantic than phonological processing, and by the correlation of its magnitude with semantic and syntactic complexity and with verbal recall ability. The conclusion from these studies is that the visual cortex is clearly active during language processing in the blind. However, the extent of its involvement in tactile processing is still unclear, as is its potential relationship to performance differences between blind and sighted individuals.

Compared to activity during Braille reading, there was less activation in the medial occipital cortex of the blind during discrimination of angles or of the width of grooves cut in homogeneous Braille fields. Vibrotactile frequency discrimination evoked extensive responses in the visual cortex in congenitally blind subjects, with less extensive responses in the late blind. Category-selectivity during haptic perception of three-dimensional form, similar to that noted in sighted subjects, was found in the inferotemporal cortex of blind subjects, although the category-selective voxels were located more ventrally in the blind than in sighted subjects.

Unfortunately, most of these studies used rest controls, making it difficult to disentangle sensory from linguistic processing. A study of Braille reading that did attempt to control for linguistic processes, using an auditory word control, actually found medial occipital cortical activity in the late blind but not the early blind. Thus, more research is needed to address the extent to which tactile processing engages the visual cortex in the blind, and its links with cross-modal recruitment in the sighted.
See also: Multisensory Convergence and Integration; Neural Circuitry in the Somatosensory System; Sensorimotor Integration: Models; Somatosensory Perception; Somatosensory Cortex: Functional Architecture; Somatosensory Receptive Fields; Somatosensory Cortex; Tactile Coding in Peripheral Neural Populations; Tactile Texture; Vision for Action and Perception; Visual Deprivation; Visual Cortical Models of Orientation Tuning.

Further Reading
Amedi A, von Kriegstein K, van Atteveldt NM, Beauchamp MS, and Naumer MJ (2005) Functional imaging of human crossmodal identification and object recognition. Experimental Brain Research 166: 559–571.
Balakrishnan JD, Klatzky RL, Loomis JM, and Lederman SJ (1989) Length distortion of temporally extended visual displays: Similarity to haptic spatial perception. Perception and Psychophysics 46: 387–394.
Botvinick M and Cohen J (1998) Rubber hands 'feel' touch that eyes see. Nature 391: 756.
Ehrsson HH, Spence C, and Passingham RE (2004) That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science 305: 875–877.
Fiorio M and Haggard P (2005) Viewing the body prepares the brain for touch: Effects of TMS over somatosensory cortex. European Journal of Neuroscience 22: 773–777.
Heller MA, Brackett DD, Wilson K, Yoneyama K, and Boyer A (2002) The haptic Müller-Lyer illusion in sighted and blind people. Perception 31: 1263–1274.
Hollins M (1989) Understanding Blindness. Hillsdale, NJ: Lawrence Erlbaum.
Klatzky RL, Lederman SJ, and Reed C (1987) There's more to touch than meets the eye: The salience of object attributes for haptics with and without vision. Journal of Experimental Psychology: General 116: 356–369.
Merabet L, Thut G, Murray B, et al. (2004) Feeling by sight or seeing by touch? Neuron 42: 173–179.
Peltier S, Stilla R, Mariola E, LaConte S, Hu X, and Sathian K (2007) Activity and effective connectivity of parietal and occipital cortical regions during haptic shape perception. Neuropsychologia 45: 476–483.
Rock I and Victor J (1964) Vision and touch: An experimentally created conflict between the two senses. Science 143: 594–596.
Sathian K (2005) Cross-modal plasticity in sensory systems. In: Selzer ME, Clarke S, Cohen L, Duncan P, and Gage F (eds.) Textbook of Neural Repair and Rehabilitation, pp. 180–193. Cambridge, UK: Cambridge University Press.
Sathian K (2005) Visual cortical activity during tactile perception in the sighted and visually deprived. Developmental Psychobiology 46: 279–286.
Sathian K, Zangaladze A, Hoffman JM, and Grafton ST (1997) Feeling with the mind's eye. NeuroReport 8: 3877–3881.
Zangaladze A, Epstein CM, Grafton ST, and Sathian K (1999) Involvement of visual cortex in tactile discrimination of orientation. Nature 401: 587–590.
