Research has suggested that altering the perceived shape and size of the body image significantly affects the perception of somatic events. The current study investigated how multisensory illusions applied to the body altered tactile perception using the somatic signal detection task. Thirty-one healthy volunteers were asked to report the presence or absence of near-threshold tactile stimuli delivered to the index finger under three multisensory illusion conditions: stretched finger, shrunken finger, and detached finger, as well as a veridical baseline condition. Both stretching and shrinking the stimulated finger enhanced correct touch detections; however, the mechanisms underlying this increase were found to be different. In contrast, the detached appearance reduced false touch reports, possibly due to reduced tactile noise as a result of attention being directed to the tip of the finger only. These findings suggest that distorted representations of the body could have different modulatory effects on attention to touch, and they provide a link between perceived body representation and somatosensory decision-making.
The rubber hand illusion (RHI) is a perceptual phenomenon in which participants experience ownership over a fake model hand through synchronous visuotactile stimulation. Several studies have shown that the illusion occurs only when both hands are in close proximity to each other. In the present study, we systematically examined how the relative position (lateral, distal) and distance (13-75 cm) of the model hand, with respect to participants' real hand, affect the illusion experience. Furthermore, we also compared different facets of the subjective illusion experience: the experience of the model hand being part of one's body (i.e., ownership) and the perceptual fusion of vision and touch (i.e., referral of touch). In two experiments we observed indications of a stronger illusion experience in distal compared with lateral positions at identical distances, indicating that the illusory effects may vary as a function of the relative position of the hand. Our results also showed that manipulations of distance modulated the two facets of the illusion differently. While ownership was restricted to near distances, referral of touch sensations remained stable at farther distances. These results are interpreted in relation to variations in sensory weighting across different planes.
The rubber hand illusion is a perceptual illusion of perceiving an object, such as a model hand, as part of one's own body. The question of whether the illusion can be induced with noncorporeal objects that do not look like a human body part has not yet been fully resolved. In this study, we directly assessed the subjective experience of two different components within the illusion (i.e., ownership and referral of touch) when a model hand and a balloon were stimulated. We observed significantly stronger illusion ratings for the hand as compared with the balloon, and only the hand ratings showed a clear affirmation of the illusion. We further conclude that (a) a significant difference between synchronous and asynchronous conditions may not be sufficient to argue for the successful induction of the illusion and (b) the subcomponents show different patterns across the conditions, which may lead to alternative interpretations. These observations call for a more fine-grained interpretation of questionnaire data in rubber hand illusion studies.
Is there a universal hierarchy of the senses, such that some senses (e.g., vision) are more accessible to consciousness and linguistic description than others (e.g., smell)? The long-standing presumption in Western thought has been that vision and audition are more objective than the other senses, serving as the basis of knowledge and understanding, whereas touch, taste, and smell are crude and of little value. This predicts that humans ought to be better at communicating about sight and hearing than about the other senses, and decades of work based on English and related languages certainly suggest this is true. However, how well does this reflect the diversity of languages and communities worldwide? To test whether there is a universal hierarchy of the senses, stimuli from the five basic senses were used to elicit descriptions in 20 diverse languages, including 3 unrelated sign languages. We found that languages differ fundamentally in which sensory domains they linguistically code systematically, and in how they do so. The tendency for better coding in some domains can be explained in part by cultural preoccupations. Although languages seem free to elaborate specific sensory domains, some general tendencies emerge: for example, with some exceptions, smell is poorly coded. The surprise is that, despite the gradual phylogenetic accumulation of the senses, and the imbalances in the neural tissue dedicated to them, no single hierarchy of the senses imposes itself upon language.
Seeing a face being touched in spatial and temporal synchrony with one's own face produces a bias in self-recognition, whereby the other face becomes more likely to be perceived as the self. The present study employed event-related potentials to explore whether this enfacement effect reflects initial face encoding, enhanced distinctiveness of the enfaced face, modified self-identity representations, or even later processing stages associated with the emotional processing of faces. Participants were stroked in synchrony or asynchrony with an unfamiliar face they observed on a monitor in front of them, in a situation approximating a mirror image. Subsequently, event-related potentials were recorded during the presentation of (a) the previously synchronously stimulated face, (b) the asynchronously stimulated face, (c) the observer's own face, (d) filler faces, and (e) a to-be-detected target face, which required a response. Observers reported a consistent enfacement illusion after synchronous stimulation. Importantly, the synchronously stimulated face elicited more prominent N170 and P200 responses than the asynchronously stimulated face. By contrast, similar N250 and P300 responses were observed in these conditions. These results suggest that enfacement modulates early neural correlates of face encoding and facial prototypicality, rather than identity self-representations and associated emotional processes.