Constructing spatial perception through self-touch

Published: Nov. 22, 2020, 4:03 a.m.

Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.11.21.392563v1?rss=1

Authors: Cataldo, A., Dupin, L., Dempsey-Jones, H., Gomi, H., Haggard, P.

Abstract: Classical accounts of spatial perception are based either on the topological layout of sensory receptors, or on implicit spatial information provided by motor commands. In everyday self-touch, as when stroking the left arm with the right hand, these elements are inextricably linked, meaning that tactile and motor contributions to spatial perception cannot readily be disentangled. Here, we developed a robot-mediated form of self-touch in order to decouple the spatial extent of active or passive movements from their tactile consequences. Participants judged the spatial extent of either the movement of the right hand, or of the resulting tactile stimulation to their left forearm. Across five experiments, we found bidirectional interference between motor and tactile information. Crucially, both directions of interference were stronger during active than passive movements. Thus, voluntary motor commands produced stronger integration of multiple signals relevant to spatial perception.

Copyright belongs to the original authors. Visit the link for more info.