Abstract
Movements towards touch on the body require integrating tactile location and body posture information. Tactile processing and movement planning both rely on posterior parietal cortex (PPC), but their interplay is not understood. Here, human participants received tactile stimuli on their crossed and uncrossed feet, dissociating stimulus location relative to anatomy versus external space. Participants pointed to the touch or the equivalent location on the other foot, which dissociates sensory and motor locations. Multi-voxel pattern analysis of concurrently recorded fMRI signals revealed that tactile location was coded anatomically in anterior PPC but spatially in posterior PPC during sensory processing. After movement instructions were specified, PPC exclusively represented the movement goal in space, in regions associated with visuo-motor planning and with regional overlap for sensory, rule-related, and movement coding. Thus, PPC flexibly updates its spatial codes to accommodate rule-based transformation of sensory input, generating movement to the environment and the own body alike.
Introduction
We frequently make reaching movements to tactile stimuli, for example to scratch an itch, move a hair away from our face, or to brush off an insect that crawls along our arm. We perform these movements with ease even though the brain must perform complex computations to successfully complete them. Tactile stimulation excites receptors in the skin, and thus tactile location is initially coded relative to the skin’s layout as is evident, for instance, in the homuncular organization of primary somatosensory cortex (S1)1,2. Yet, the skin is but a 2D sheet wrapped around our 3D body which, in addition, constantly changes its layout when the body parts move. Therefore, converting touch into a spatially-guided motor act presumably requires neural transformations.
The nature of these transformations between different spatial codes has been debated. A popular conceptualization draws tight parallels between visuo-motor and tactile-motor processing. In this view, planning a movement towards a touch involves transforming tactile location from skin-based, anatomical coding into a 3D spatial code –referred to as remapping of touch from a somatotopic or anatomical into an external(-spatial) reference frame– based on the integration of body knowledge and posture information3,4,5,6. A potential consequence may be that visual and tactile stimuli are represented in the same spatial code, which may afford direct integration of stimuli from the different modalities as well as common further processing regardless of the original sensory modality of the cue7.
Under this premise, research based on visuo-motor paradigms is an obvious starting point for establishing the principles that underlie tactile-motor processing. Studies on visuo-motor transformation have established that the posterior parietal cortex (PPC) encodes visual targets in 3D reference frames where visual objects are coded relative to a specific body part such as the eyes (or gaze direction), head, torso, or a hand, and sometimes in a combination of such codes8. Posterior PPC regions reportedly code hand reaches in a gaze-centered reference frame9,10,11,12,13,14,15,16 while anterior PPC regions encode them in a hand-centered reference frame17,18,19,20,21,22,23,24. In the macaque ventral intraparietal cortex, visual receptive fields are linked to tactile receptive fields, such that the visual space relevant to a given neuron changes when the monkey moves the body part to which the tactile receptive field is linked25,26,27. This coding implies that visual space is flexibly mapped to body locations based on postural information. However, the reference frames employed by parietal regions are often not fixed but can vary depending on the available information or task requirements10,28,29,30. Thus, spatial coding in this cortical region is dynamic, and such dynamics have been demonstrated not only between different contexts but also during the progression of single trials, marking PPC as a key region of sensorimotor transformation.
In the case of touch, the somatotopic organization evident in S1 extends into the anterior regions of PPC in both macaques31 and humans32,33. In contrast, magneto- and electroencephalographic brain signals over parietal cortex are modulated by both the somatotopic and external-spatial location of tactile stimuli34,35,36, yet the spatial resolution of these methods limits conclusions about which brain regions use the respective spatial codes. Generally speaking, somatotopic coding appears to occur more anteriorly than external coding. Transcranial magnetic stimulation (TMS) has also provided evidence that the external location of touch affects tactile processing in PPC37,38,39. Yet, even if the relevance of somatotopic and external-spatial coding for touch in PPC appears established, it has remained elusive which parietal regions employ which code and how transformations between them occur.
There are further differences between movement towards tactile targets as compared to movement to visual targets. During reaches towards a visual target, body-related processing is primarily concerned with coordinating the reaching effector for the movement. In contrast, reaching towards a tactile stimulus is more complex in that it requires processing body information about the movement target, as well as relating the tactile target and moving effector, with both belonging to the same body. One puzzle arising from this double role of the body, as both the movement target and the moving entity, is that touch, proprioception, and movement processing all recruit PPC. To date, there is little knowledge about how these sensory and motor roles of PPC are integrated during self-directed actions.
Anterior PPC is involved in coding posture and body configuration in the context of reaching tasks without vision40,41,42. Moreover, a frontoparietal network, including a wide area of PPC, encodes proprioceptive reach targets in a body-centered code10,29. It is noteworthy, however, that paradigms investigating movement planning to proprioceptive targets usually name the target body part (e.g., “move to the tip of the left index finger”). Therefore, this type of task does not involve any transformation of tactile information from skin into space or the mapping of touch to a body part. In other words, such tasks test only one aspect of somatosensation, namely proprioception, but do not speak to the processing of touch. Studies that have addressed touch have demonstrated that PPC mediates the sensorimotor transformation of tactile targets on the hands34,43,44. However, the respective studies did not dissociate sensory and motor spatial information and so it remains unclear whether the observed responses were related to maintained sensory processing or to movement planning.
We have so far addressed direct spatial transformations, which transform the location of a sensory stimulus into a code that guides the motor system to that location. Such transformations mediate directly between sensory and motor reference frames. However, parietal cortex is also involved in deriving movement plans from arbitrary, abstract sensory cues45,46,47. In such paradigms, the sensory stimulus and movement goal are not identical in their location. For instance, movements depend on rules when a movement must not be directed to a visual cue itself but, instead, a different movement goal must be inferred based on the cue’s location. A typical paradigm is the so-called anti-movement task, in which participants must plan a response to a location opposite to that of a (typically visual) cue48,49,50,51,52. In this case, the movement goal location is at a different spatial location than the sensory cue, which allows brain responses related to sensory processing and movement planning to be distinguished. In this type of task, the location of the sensory cue is still relevant for solving the task, even if the resulting motor response is not directed towards it. In line with these requirements, experiments that employed this paradigm have established that PPC neurons first encode the location of a visual cue, but later switch to encoding the location of the movement goal that derives from it49,53. Human PPC also exhibits such dynamic coding, with responses to visual cues before the movement goal is specified, but representing movement plans once the reach goal has been specified50,54,55,56,57. In sum, neuronal responses in PPC can reflect the evolution of a spatially specific motor plan based on sensory information and task instruction. Accordingly, parietal computations go beyond mere transformations of spatial location between different reference frames and are flexibly adapted to the current task or behavioral goal.
With the present study, we attempted to approach tactile-motor transformation in a manner that allows for close comparison to well-established findings on visuo-motor transformation. We mapped the emergence of information related to skin-based and external-spatial codes, as well as to the task rule, during a tactile-motor task. We recorded fMRI while human participants planned and executed hand pointing movements to their feet, which were positioned either uncrossed or crossed. A given tactile stimulus is located on the opposite side of space with uncrossed vs. crossed feet; therefore, this manipulation allowed us to dissociate anatomical and external-spatial coding of tactile targets. Moreover, we employed an anti-pointing task rule, which instructed participants to point to the equivalent location to the one that had been touched, but on the other foot, to differentiate tactile sensory processing from motor-related, sensorimotor transformation. Therefore, all movements had to be derived from tactile cues and were directed towards the own body.
Results
Experimental setup and analysis rationale
Figure 1 illustrates the trial design. Each trial was separated into four intervals, beginning with a fixation interval. Participants were then presented with a tactile stimulus on their left or right foot, followed by a delay, forming the touch localization interval (duration: 1–4 TR = 1880–7520 ms). There were two stimulus locations on each foot, located medially and laterally on the back of the foot, approximately two centimeters below the heads of the metatarsal bones. We explicitly instructed participants to plan and execute precise pointing movements. This experimental strategy matches that of previous studies, in which pointing movements were directed to visual stimuli at different spatial locations58,59. It discourages participants from making stereotypical responses to one side or the other, as it requires precise spatial motor planning. Participants learned the present trial’s task rule via a visual cue only after the touch localization interval: either a right-hand pointing movement toward the remembered stimulus location (pro-pointing) or an anti-pointing movement to the homologous location on the other foot. Thus, the target was always a specific location on the own body. This instruction was followed by another delay, forming the movement planning interval (duration: 1–4 TR = 1880–7520 ms). A cue at the end of this interval prompted movement execution. Because the required movement was specified only after the touch localization interval, the final required movement was unknown during this time interval, and therefore we expect that activation in the touch localization interval is related exclusively to stimulus processing. In contrast, activation in the movement planning interval may be related not only to movement planning, but may also retain spatial information about the stimulus. Participants were highly accurate in pointing movements for trials in which hand movement tracking could be assessed (66% of trials), with a mean accuracy of 92% in making movements to the correct location. A 2 (left vs. right tactile stimulation) × 2 (uncrossed vs. crossed posture) × 2 (pro vs. anti task) within-subjects repeated-measures ANOVA showed no main effect of tactile stimulation location (F(1,15) = 2.75, p = 0.12, ηp² = 0.15), posture (F(1,15) = 0.69, p = 0.42, ηp² = 0.04), or task rule (F(1,15) = 0.05, p = 0.83, ηp² = 0.00) on task accuracy, nor any interaction between the three factors. Thus, participants were similarly accurate across conditions.
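For illustration, such a three-way within-subjects ANOVA can be computed with a few lines of Python. The sketch below is not the authors’ analysis code; the long-format table, file name, and column names are hypothetical.

```python
# Sketch of the 2 x 2 x 2 within-subjects ANOVA on pointing accuracy.
# Assumes a long-format table with one accuracy value per participant and
# condition; file and column names are hypothetical.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

acc = pd.read_csv("accuracy_by_condition.csv")  # columns: subject, side, posture, rule, accuracy

aov = AnovaRM(
    data=acc,
    depvar="accuracy",
    subject="subject",
    within=["side", "posture", "rule"],  # stimulation side, foot posture, task rule
).fit()
print(aov)  # F, df, and p for all main effects and interactions
```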
Multi-voxel pattern analysis (MVPA) was performed to assess whether a classifier trained to dissociate fMRI voxel activation patterns observed during one trial class (e.g., trials with a tactile stimulus at the left foot) vs. another trial class (e.g., trials with a tactile stimulus at the right foot) could predict the correct class label of new activation patterns that were not used during training. We used a searchlight procedure60 that decodes patterns of voxels contained in a sphere with a 4-voxel radius around a center voxel. Each of the brain’s voxels serves once as the center voxel, so that the procedure generates a brain-wide map of decoding accuracy. When decoding performance is significantly above chance, the sphere around the respective center voxel contains differences in the patterns of neural responses that can distinguish between the two tested classes. Accordingly, we interpret significant decoding as indicating that the respective region encodes information related to the tested task characteristic.
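As a concrete illustration of this procedure, a minimal searchlight sketch using nilearn is shown below. It is not the authors’ pipeline: the file names, the voxel size (3 mm isotropic assumed, making a 4-voxel radius roughly 12 mm), and the leave-one-run-out cross-validation are assumptions.

```python
# Minimal searchlight decoding sketch with nilearn; not the authors' exact
# pipeline. File names, voxel size, and cross-validation scheme are assumptions.
import numpy as np
from nilearn.decoding import SearchLight
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import LinearSVC

n_trials = 96
y = np.repeat([0, 1], n_trials // 2)           # e.g., left vs. right foot labels
runs = np.tile(np.arange(12), n_trials // 12)  # run index per trial

sl = SearchLight(
    mask_img="brain_mask.nii.gz",              # assumed whole-brain mask
    radius=12.0,                               # mm; ~4 voxels at 3 mm isotropic
    estimator=LinearSVC(),
    cv=LeaveOneGroupOut(),                     # leave one run out (assumption)
    n_jobs=-1,
)
sl.fit("trialwise_beta_maps.nii.gz", y, groups=runs)  # one 3D map per trial
accuracy_map = sl.scores_                      # brain-wide decoding accuracy map
```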
Classification analyses devised for the present study are summarized in Fig. 2 and specified in detail in Fig. S1. We additionally performed univariate analyses to test whether any regions showed overall differences in activity between our main conditions of interest (see Supplemental Information).
Anatomical and external spatial coding are present during tactile-sensory processing
We first tested whether information about anatomical and external touch location was present in activity patterns during the touch localization interval, when sensory information was available but the movement goal was not yet specified. Anatomical touch location identifies which of the two feet, right or left, received stimulation, independent of where the respective foot was currently placed. The conditions contrasted in our MVPA decoding are illustrated in Fig. 2A and Fig. S1A (left side, within-interval classifier 1).
Anatomical information was present in multiple parietal regions (Fig. 3A and Table 1): the medial bank of primary somatosensory cortex (S1), bilaterally; the lateral right inferior parietal lobule (IPL), bordering secondary somatosensory cortex (S2) and spreading to lateral S1 and primary motor cortex (M1); and the left anterior SPL. Decoding was also successful outside parietal cortex, in the right PMd and the left insula, spreading into the superior temporal gyrus. Group-mean above-chance decoding accuracies ranged from 53.8% to 55.0%, with participant-level confidence intervals ranging from 0.3% to 2.0% (Fig. 3B). These classification values are in a range that is typical for MVPA decoding55,61,62,63,64,65.
External touch location identifies on which side of external space, the right or left side, a stimulus occurred, independent of which foot the stimulus was applied to. Foot crossing dissociates external from anatomical location as, for example, both the uncrossed left foot and the crossed right foot are located on the left side of space. The conditions contrasted in MVPA decoding are illustrated in Fig. 2B and Fig. S1B (left side, within-interval classifier 2). External touch location information was present in a single, right-lateralized cluster confined to the medial IPS (Fig. 4A and Table 1). The group-mean above-chance decoding accuracy was 54.4%, with participant-level confidence intervals ranging from 0.4% to 1.4% (Fig. 4B).
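To make the two decoding schemes concrete, the toy example below spells out how the trial labels dissociate. It is purely illustrative; the variable names are ours, not taken from the study’s code.

```python
# Worked example of the two label schemes: the anatomical label is the
# stimulated foot, whereas the external label also depends on posture,
# because crossing the feet swaps their sides in external space.
import pandas as pd

FLIP = {"left": "right", "right": "left"}
trials = pd.DataFrame({
    "foot":    ["left", "left", "right", "right"],
    "posture": ["uncrossed", "crossed", "uncrossed", "crossed"],
})
trials["anatomical_label"] = trials["foot"]
trials["external_label"] = [
    foot if posture == "uncrossed" else FLIP[foot]
    for foot, posture in zip(trials["foot"], trials["posture"])
]
print(trials)
# The crossed right foot gets external label "left": it lies on the left
# side of space, matching the uncrossed left foot.
```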
Tactile-sensory spatial codes are maintained only as long as necessary
Previous research has suggested that neurons change their spatial tuning during sensorimotor processing53,54. Therefore, we next tested whether anatomical and external spatial information about the tactile stimulus remained stable across touch localization and movement planning intervals. We used the classifiers trained in the touch localization interval to classify voxel patterns in the movement planning interval66. Successful cross-interval classification would suggest that tactile stimulus coding was maintained across both touch localization and movement planning intervals. In contrast, a failure of cross-interval classification would imply a change in the representational format, as evident in voxel-wise brain activity, across the trial phases.
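In classifier terms, this logic is a simple train/test transfer, sketched here with synthetic stand-in data in place of the actual searchlight voxel patterns.

```python
# Cross-interval classification sketch with synthetic stand-in data; in the
# real analysis, rows would be trial-wise voxel patterns from a searchlight sphere.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 96, 257                   # 257 voxels fill a 4-voxel-radius sphere
y_touch = rng.integers(0, 2, n_trials)         # left vs. right stimulus labels
X_localization = rng.normal(size=(n_trials, n_voxels))  # touch localization interval
X_planning = rng.normal(size=(n_trials, n_voxels))      # movement planning interval

clf = LinearSVC().fit(X_localization, y_touch)          # train on the first interval
transfer_acc = clf.score(X_planning, y_touch)           # test on the second interval
# Chance-level transfer accuracy implies that the voxel-level code changed
# between intervals, even if each interval is decodable on its own.
```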
The conditions we pooled in these analyses are illustrated in Fig. S1 (anatomical location information: Fig. S1A, middle, cross-interval classifier 3; external location information: Fig. S1B, middle, cross-interval classifier 4). These classifiers did not identify any regions in which decoding of touch location across trial phases was above chance. We additionally tested cross-classification of sensory coding separately for pro- and anti-pointing conditions, as it is conceivable that a default motor plan towards the target was formed during the tactile localization interval and then remapped for anti-pointing trials upon receipt of the task cue. We did not identify any regions that allowed decoding touch location across trial phases for either pro-pointing or anti-pointing trials, suggesting that participants did not prepare a default motor plan to the tactile stimulus during the tactile localization interval. Altogether, the results of these analyses suggest that spatial coding differs between the two trial phases in all regions.
However, even if coding patterns changed from one trial phase to the next, it would still be possible that stimulus-related, anatomical information or external, spatial information is encoded via a different neuronal firing pattern or in different neurons and, thus, expressed in different voxel patterns. To test whether tactile-sensory spatial information is retained during the movement planning interval with an altered coding strategy, we trained classifiers to differentiate anatomical and external target locations with data from the movement planning interval and tested them on their ability to predict the target locations of test data from the same interval (anatomical: Fig. S1A, right side, within-interval classifier 5; external: Fig. S1B, right side, within-interval classifier 6). Neither classifier identified significant decoding in any region of the cortex. Altogether these classification results indicate that once participants could prepare the movement, sensory information was no longer retained, or was retained at a level too low to be detected by our classification analysis – and, thus, at a level lower than that during the touch localization interval.
In sum, PPC appears to maintain tactile-sensory spatial information only as long as necessary and discard it, or massively reduce its representation, once it can transform sensory information into a motor response. Notably, the two spatial codes were prevalent in distinct PPC regions, without any regional overlap.
Tactile-motor planning recruits a similar network of fronto-parietal regions as visuo-motor planning
Having established that sensory spatial information is decodable only in the touch localization interval, we next tested which brain regions contain information about the tactually defined spatial location of the movement target during the movement planning interval. The movement goal location identified the location in external space (right or left side) to which a hand pointing movement would be directed; we pooled across all possible stimulus locations on a given foot and decoded only the left and right side of space, rather than the exact spatial location. Figure 2C illustrates how the combination of foot posture, stimulus location, and pro- vs. anti-movement dissociates the location of the movement goal from both anatomical and external stimulus locations in our classification analysis. For example, a pro-pointing movement to the left uncrossed foot, an anti-pointing movement to the left crossed foot, and a pro-pointing movement to the right crossed foot all require an identical, left-directed pointing movement but do not consistently share anatomical or external stimulus locations, nor their task rule.
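This dissociation logic can be stated compactly as code. The sketch below is illustrative only (not analysis code) and reproduces the three example trials just described.

```python
# Derivation of the movement goal side from stimulated foot, posture, and
# task rule (pure logic for illustration).
FLIP = {"left": "right", "right": "left"}

def goal_side(foot: str, posture: str, rule: str) -> str:
    external = foot if posture == "uncrossed" else FLIP[foot]  # stimulus side in space
    return external if rule == "pro" else FLIP[external]       # anti-pointing flips the goal

# The three example trials from the text all require a left-directed movement:
assert goal_side("left", "uncrossed", "pro") == "left"
assert goal_side("left", "crossed", "anti") == "left"
assert goal_side("right", "crossed", "pro") == "left"
```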
MVPA classification of movement goal location during the movement planning interval (right vs. left movement target, Fig. S1C, within-interval classifier 7) identified widespread above-chance classification in parietal areas, including bilateral S1, M1, and SPL, as well as bilateral frontal areas PMd and SMA (Fig. 5 and Table 1). Above-chance classification was identified in more extensive regions in the left hemisphere, the hemisphere contralateral to the effector that executed the pointing movement. Here, above-chance classification was also evident in parietal cortex along the IPS and in the parieto-occipital sulcus (POS), and in frontal cortex in the pre-SMA. In the right hemisphere, a small additional cluster was identified in the occipitotemporal cortex. Group-mean above-chance decoding accuracies ranged from 54.1% to 58.4% in the left hemisphere and from 53.8% to 58.0% in the right hemisphere, with participant-level confidence intervals ranging from 0.3% to 2.4% and from 0.3% to 2.6%, respectively (Fig. 5B). In a univariate analysis that identified regions more active for right than left pointing targets, we identified clusters in the bilateral SMA and the left lateral and medial anterior SPL, M1 and the PMd that showed stronger activation for planning pointing movements to right targets as compared to left targets (Fig. S3).
The extensive, bilateral responses related to the goal location during the movement planning interval differed markedly from the more regionally confined responses related to sensory spatial information. This is consistent with previous reports of stronger PPC activation during motor planning than sensory processing in sensorimotor delay paradigms such as the current one58,67, underlining the prominent involvement of PPC in motor planning and control49,68.
PPC encodes the current task rule
Our task required participants to interpret the task rule, i.e., pro- vs. anti-pointing, to derive the required movement from the tactile location. Participants showed similar mean task accuracy for both pro- and anti-pointing (both 92% correct). In the macaque PPC, the task rule is encoded independent of specific sensory cues69, and it modulates neuronal firing in the areas that encode the movement goal49. Human fMRI studies did not find differences in univariate activation between preparation of pro- and anti-pointing movements50,54, nor were they able to decode pro- vs. anti-pointing from regions of interest in the SPL, aIPS or PMd55. We tested whether we could decode the task rule in our tactile-motor task. Figure 6 illustrates the performance of a classifier trained to dissociate pro- and anti-pointing movements during the movement planning interval (Fig. S2, within-interval classifier 8). Average ROI coordinates across participants are displayed in Table 1. Task rule could be decoded above chance level bilaterally in the SPL and in the left superior parieto-occipital cortex (SPOC). The mean above-chance decoding accuracy of the task rule per ROI ranged from 53.8% to 54.1%, with participant-level confidence intervals ranging from 0.3% to 1.3% (Fig. 6B).
From touch localization to sensorimotor planning: functional overlap
Our results demonstrate that PPC contains anatomically and externally coded tactile location information during the touch localization interval and information about the motor goal and task rule during the movement planning interval. We next explored whether there was overlap between the regions that encoded sensory and motor-related spatial information in the two intervals. Such an overlap would provide evidence for a dynamic change in spatial coding within areas across the duration of a trial, likely implying that the same regions are involved in the underlying transformation between the different spatial codes over the course of sensorimotor processing.
Figure 7 displays the overlap between clusters in which our classifiers identified sensory and motor-related spatial coding. There was considerable overlap between these spatial codes across trial intervals. Of the voxels that encoded the anatomical stimulus location during the stimulus localization interval, 34% overlapped with voxels identified as coding movement goal location in the movement planning interval. These overlapping voxels were located in S1 bilaterally, as well as the left SPL and the right PMd (Fig. 7A; overlap: 221 voxels = 5967 mm3). Of the voxels that encoded external stimulus location during the stimulus localization interval, 61% overlapped with voxels that coded movement goal location. These voxels were located in the right mIPS (Fig. 7B; overlap: 74 voxels = 1998 mm3). Thus, a considerable proportion of voxels participated in coding spatial stimulus information in one reference frame in the first trial interval, and spatial movement goal information in a different reference frame in the second trial interval.
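The reported overlap percentages and volumes reduce to a simple computation on binary cluster masks, sketched below. The 27 mm³ per voxel follows from the reported counts and volumes (e.g., 5967 mm³ / 221 voxels); the 3 mm isotropic interpretation is our assumption.

```python
# Cluster-overlap computation on boolean masks (numpy arrays of equal shape).
import numpy as np

def overlap(mask_a: np.ndarray, mask_b: np.ndarray, voxel_mm3: float = 27.0):
    """Return the percentage of mask_a voxels shared with mask_b and the
    overlap volume in mm^3 (27 mm^3 per voxel, e.g., 3 mm isotropic)."""
    shared = int(np.logical_and(mask_a, mask_b).sum())
    return 100.0 * shared / int(mask_a.sum()), shared * voxel_mm3
```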
Furthermore, 90.4% of the cluster representing the task rule overlapped with the cluster representing movement goal locations bilaterally in the SPL and in the left POS (overlap of 422 voxels, volume: 11,394 mm3; Fig. 7A). In the left SPL, voxels carrying all three types of information lay between voxels common to anatomical and movement goal location and voxels common to goal location and task rule (overlap of 34 voxels, volume: 1161 mm3; Fig. 7A). This finding implies that the prevalent information coded in these regions, as decodable with MVPA, varies over the course of the trial and, thus, that the neural function of these regions changes over time: PPC regions that coded tactile location before a movement plan could be formed came to represent the movement goal once the task rule had been specified. Moreover, areas that carried information about the task rule were almost completely enclosed within the cluster that coded movement goal location, suggesting that this bilateral PPC region performs a sensorimotor transformation based on stimulus location and task rule, turning this information into a movement goal.
The results of decoding anatomical touch location, external touch location, movement goal location and task rule presented here were robust against variations of the analysis pipeline and model specification. We re-ran separate instances of the presented analyses to address several aspects that could potentially bias our results (see Supplementary Information). First, we employed an unwarping procedure that included removal of motion artifacts and, therefore, did not include motion regressors in our GLMs. Re-running our analyses with motion regressors included did not result in any notable differences to the results we report here (see Fig. S4). Second, we had included trials in our analyses even if we had been unable to extract the finger movement response, for example due to knee or leg posture obstructing the view of the hand. Re-running our analyses without such trials yielded very similar results to the ones we report here (Fig. S5). Third, we had modelled each trial phase for its true duration of 1–4 TRs. While common for delayed-movement paradigms such as the one we use here50,58,67,70,71, this procedure favors sustained activity over transient responses. Re-running our analyses with a 1-TR duration for each trial phase revealed results mostly comparable to our main analysis (see Figs. S6 & S7). A notable difference, however, was that external touch location in mIPS was evident bilaterally rather than unilaterally.
Discussion
We investigated the reference frames involved in tactile-motor transformation during pointing to tactually indicated locations on the feet. We dissociated anatomical and external-spatial coding of tactile stimulus location by manipulating whether participants’ feet were uncrossed or crossed, and further dissociated between sensory and motor-related responses using a delayed anti-task paradigm. We report three key findings. First, PPC exhibited concurrent anatomical and external-spatial coding of tactile stimulus location in different PPC regions during the touch localization interval. Anterior regions including primary somatosensory cortex and SPL encoded stimulus location in an anatomical reference frame, whereas medial IPS, located in the posterior PPC, encoded stimulus location in an external spatial reference frame. Second, spatial coding was dynamic, showing responses related to stimulus location prior to presentation of the task rule, and then responses encoding the motor goal location following presentation of the task rule. There was overlap between regions encoding first sensory and later motor information, and one region in SPL coded the original anatomical stimulus location, the task rule, and the resulting movement goal. Thus, this region may be central in transforming tactile information into an actionable spatial target. Third, coding of the movement goal was present in a large network, consistent with regions shown to be involved in motor responses to visual targets, suggesting similar coding of movements to tactually and visually defined targets.
Anterior PPC encodes touch in an anatomical reference frame
We identified regions that use an anatomical code for tactile location by testing which regions could decode the foot on which the tactile stimulus was received, regardless of crossed or uncrossed foot posture. Decoding was above chance in the SPL, a region in the anterior PPC, as well as in a frontoparietal network of regions known to be involved in touch processing. Previous work has demonstrated that anterior PPC contains regions that show selective responses to tactile stimulation on different body parts26,31,32,72. In particular, a human fMRI study identified a roughly homuncular tactile map in the SPL and anterior IPS that also overlapped with retinotopic visual maps32,72. Tactile leg and toe regions in this map lay medially, in close agreement with the SPL cluster for anatomical foot location found in the present study (Fig. 8: yellow sphere number 1). Here, we report that this region uses an anatomical code for tactile stimuli, that is, location is coded regardless of the current position of the limb in external space. This anatomical stimulus coding in anterior PPC may not be limited to tactile sensation. Human fMRI studies have identified coding of visual stimuli relative to the position of the body in anterior PPC19,20,23 (Fig. 8: cyan spheres), suggesting that anterior PPC may encode multisensory stimuli in relation to the body’s layout or to the skin. This finding extends the suggestion that rostral PPC “projects” the environment onto the body and estimates the current state of the environment by transforming information to match with the own body26,73.
We decoded anatomical tactile stimulus location in the medial bank of S1 bilaterally, adjacent to the SPL. This is consistent with the role of this region as the part of the somatosensory homunculus that responds to contralateral foot stimulation1,2,74. Regions beyond S1 also exhibited sensitivity to the anatomical location of tactile stimuli, including the left insula and the right IPL/S2, M1/S1 and PMd. Previous studies also reported that these regions respond to tactile stimuli75,76,77, but did not specify the reference frame used to encode these stimuli. Viewed together, these previous and our present results identify a network of brain regions that encode tactile sensory location in an anatomical reference frame.
Posterior PPC encodes tactile location in an external-spatial reference frame
We identified a region in the mIPS that encoded tactile stimulus location in an external-spatial reference frame. This coding implies that the location has been abstracted from the skin surface: a given location is coded relative to where it is in space, no matter on which body part or skin location it was originally received. This external-spatial reference frame may be gaze-, head-, trunk- or hand-centered. We did not target these different possibilities in our study, and accordingly did not vary gaze, head, body and hand position but instead kept all four aligned in our experimental setup.
Recent experiments have cast some doubt on the idea that touch is truly recoded into an external-spatial code78,79. These studies point to a dissociation between the assignment of a tactile stimulus to the limb on which it occurred and its 3D external-spatial location. They suggest that touch is automatically associated with the space that the touched limb usually resides in – such as the right side of the body for the right arm and hand – and that the true external-spatial location of the touch is not derived automatically, but only when required. This conceptual difference may appear subtle: whether a tactile stimulus is directly associated with a location in 3D space, or whether it is first assigned to a body part and that body part is then localized in space, it is always posture information that must be integrated with the skin location of the tactile stimulus. Yet, the second view implies an important difference between visual and tactile processing, because it allows for the possibility that the tactile stimulus itself is never coded in 3D space. A movement towards a touch may, instead, involve referring the touch to a limb, identifying where the limb is in space, and then planning a movement to a location on the limb80.
The mIPS region identified in the present study partially overlapped with a region previously reported as a putative human homologue of the lateral intraparietal area (LIP), which has been associated with eye-centered coding15,56,81,82,83 (Fig. 8: green spheres). LIP encodes both visual and auditory targets in an eye-centered reference frame84,85,86, and it was therefore proposed that LIP generally encodes stimuli of all modalities in an eye-centered reference frame13. Our results are consistent with this proposal. Whereas LIP is often considered to be specific to saccade planning, it has also been suggested to provide a sensory priority map87,88. Furthermore, LIP neurons are active during reaching movements87,89. This finding fits with reports that the putative human homologue of LIP responds to both saccades and reaches15,58,59,90,91. Thus, a contribution of LIP to our present pointing task is consistent with previous findings, especially given that we observed partial overlap between mIPS voxels initially coding tactile stimuli in an external-spatial reference frame and later responding to the movement goal. Overall, LIP’s role in our delay task may be to first maintain salient spatial target information and later to transform it into a motor goal once participants receive the pro/anti task rule.
However, the mIPS region we identified in our study also partially overlapped with hVIP#1, a region that we have proposed to be one of three areas that together form the human homologue of macaque VIP26. Previous findings suggest that hVIP#1 encodes the location of sensory stimuli in an external-spatial reference frame92, which again fits well with our present results. In sum, mIPS appears to encode tactile information projected into an external-spatial code that is independent of the skin and is likely anchored to the eyes.
Further support for a role of the LIP/hVIP#1 region identified here in external-spatial coding of touch location has come from several TMS studies. Figure 8 depicts the respective stimulation locations; three sites fall within hVIP#138,39,93, and one site lies more posteriorly, close to the coordinates of the putative human LIP37. TMS to these locations impaired participants in making judgments that required transforming tactile location from the skin into space37, appeared to impede, or modulate, the integration of arm posture in the context of tactile-spatial processing38,39, and impaired the integration of auditory cues with tactile processing in external space93. Despite their locational variability, a common aspect of all these studies was that TMS resulted in tactile stimuli being processed as if they had been assigned to the correct limb but had not been referred to the location in space at which that limb currently lay but, instead, to where that limb normally resides. These findings have typically been interpreted as indicating that posterior PPC plays a role in remapping tactile stimuli from an anatomical to an external-spatial reference frame. However, they are also in line with the idea we introduced earlier, namely that touch is referred to a limb and it is the location of that limb that is tracked in space, rather than the touch itself.
Decoding touch in space was only possible in the right hemisphere in our study. This finding is consistent with the proposal that right PPC may be specialized for spatial somatosensory function, as somatosensory deficits related to spatial processing are more frequent following lesions in the right PPC as compared to the left PPC94,95. However, it appears inconsistent with the widely held idea that each hemisphere of PPC represents mainly the contralateral side of space, and with findings that both putative human LIP and hVIP#1 are bilateral regions15,26,56,81,83. This apparent contradiction may be resolved by the results of our re-analysis that was geared towards transient, short-term responses by modelling only 1 TR after tactile and task rule cues. This analysis decoded external-spatial touch in bilateral posterior PPC (see Fig. S6). This result suggests that LIP/hVIP#1 use a distributed spatial code for sensory processing, but that the right hemisphere carries some specialized functions with regard to rule-based integration of this sensory location, which is captured by the interval-long predictors in our analyses. This reasoning may also explain why fMRI responses related to tactile remapping in PPC were stronger in the right hemisphere in one study96, but in the left hemisphere in another study97. TMS studies on tactile remapping have often targeted only the right hemisphere, making it unclear whether their findings would generalize to the left hemisphere37,38,93; however, one study that stimulated both the left and right hemispheres found similar effects for both39. In sum, the available evidence across fMRI and TMS studies favors the view that spatial representation in posterior PPC is distributed and bilateral, but that lateralization exists for some higher-order, rule-based processes.
Two possible views of how touch is remapped
The overall picture that emerges is that anterior and posterior PPC regions play opposite roles in coding tactile stimulus location. Anterior regions relate the environment to the body and, thus, code space anchored to individual body parts. In contrast, posterior regions relate the body to the environment and code information in an external-spatial reference frame, potentially anchored to the direction of gaze (see ref. 73). Thus, we concurrently decoded touch location in anatomical and external-spatial reference frames across PPC. Modelling98, behavioural99,100,101, electrophysiological35,36,102 and magnetoencephalographic34,43 studies have demonstrated that tactile information is encoded in multiple, concurrently active reference frames. Extending these previous findings, our study disentangled the parietal brain regions involved at high spatial resolution.
Our study was not designed to resolve whether it is touch itself, or the touched limb, which is localized in 3D space for the pointing response. The presence of external-spatial location coding during the sensory phase supports the view that the tactile stimulus itself was coded in space. Yet, the present task instructions explicitly required the use of the external-spatial location of touch. Therefore, representation of tactile location already during the sensory trial phase, in which movement was not yet specified, may have been induced by the general task requirements and may not have occurred in the absence of such a task. In this case, the external information we decoded could be related directly to the external location of the tactile stimulus, or to the location of the limb to which the tactile stimulus was assigned. PPC responds to changes in body posture40,41,42,103, suggesting that PPC probably also encodes information about current body posture. We did not directly address the coding of body posture – crossed vs. uncrossed feet – in the present study. This is because we manipulated posture across runs, so that any differences between postures identified by MVPA could be attributable to unknown differences between runs, rather than limb crossing per se.
Finally, it is conceivable that participants could have developed a default plan for a pro-pointing movement during the stimulus presentation phase of the trial, that is, before they knew whether they would have to pro- or anti-point. Such a default plan could be maintained both for a recoded tactile location and for a derived target limb. It has been proposed that movement choice tasks could involve planning for one of the available target options during a delay period, and that this default plan could then be switched if the final cue requires using the non-prepared movement104. Nevertheless, we deem it unlikely that a default motor plan accounts for our present results. The activation during the movement planning phase of trials was much more extensive than that during the sensory phase, during which a default plan should have been evident if it existed. Furthermore, we found no evidence that coding was maintained across sensory target localization and motor planning intervals. Recordings from neurons in macaque LIP also support this notion: macaques delayed forming their decision until the final movement plan could be made, even when sensory decision information was already available105. Thus, altogether PPC appears to flexibly change its coding from sensory processing to motor preparation.
Goal-directed pointing recruits similar networks independent of target modality
Once both target and task rule had been defined and participants could plan the required action, a large network spanning multiple regions in premotor cortex, primary sensorimotor cortex, and PPC selectively represented spatial information about the movement goal. This network closely corresponded to the frontoparietal network involved in visuo-motor planning59,106,107,108,109,110,111. Although regions of both hemispheres contained movement planning information, a more extensive area of the left hemisphere than of the right could decode the movement goal, and decoding accuracy was also higher in the left hemisphere. This is likely because participants pointed with their right hand. Regions coding the movement goal location included SPOC, the putative human homologue of the parietal reach region, and PMd; these two regions are thought to be involved in calculating the movement path from effector to target location67,91,112,113. The movement planning network also spanned the M1 hand area, SMA and pre-SMA, which are known to code preparatory movement signals related to the effector and target, as well as object-related movement intentions61,63,114,115,116. Lastly, the movement planning network also encompassed an anterior IPS region that has been linked to the preparation of grasping and pointing movements10,58,71,117,118. Altogether, these results identify a broad network of regions that encode the tactually defined movement goal location and suggest that movement planning is very similar across sensory modalities of the target stimuli.
In natural behavior, movements in response to tactile stimuli often involve directly touching the stimulated location or limb. For example, one might brush the right hand across the left arm after feeling an insect crawling along it. Our study differs from such natural situations and behavior in some regards. First, in our study, participants pointed towards, rather than moved to, the locations indicated by touch. fMRI work has compared responses for pointing and reaching and observed similar activation for both in a frontoparietal network consistent with the regions in which we decoded the pointing movement goal in the present study119. Notably, touching the own skin gives rise to additional somatosensory feedback that has been shown to improve tactile localization120, and there can be attenuation of somatosensory signals due to predictive mechanisms121,122. Second, targeting the own body with a movement often involves moving not only the acting limb, but also the target area. In the above example, the right hand (actor) and the left arm (target) would probably both move to meet in front of the torso, an area primates prefer for both manual manipulation and viewing of objects123. These considerations reveal important differences between visuo-motor and tactile-motor planning and highlight exciting areas for future research. However, our aim with the present study was to use an experimental design that was as similar as possible to visuo-motor research paradigms. In such paradigms, participants can only move the acting effector, but not the target, and experiments have used pointing and reaching responses alike to explore the transformation from a sensory to a motor code.
Sensory representations are only maintained as long as necessary
We were unable to decode sensory location information in the part of the trial in which the task rule had been specified and participants were, thus, able to plan the pointing movement. Classifiers trained to decode sensory location during the touch localization interval did not generalize to the movement planning interval, nor was decoding successful when we attempted to decode sensory location by both training and testing classifiers on responses during the movement planning interval. These findings suggest that sensory representations were only maintained for as long as necessary. Consistent with this, previous fMRI work investigating visuo-motor control has demonstrated that, in most regions, sensory codes are maintained only as long as they are relevant54,55, and recordings from PRR neurons in macaques have shown only brief, transient encoding of the visual target location when the movement cue is already known49. Our task required a movement of the effector to a stationary tactile target, which may have facilitated the discarding of sensory coding once the movement could be planned. Different results may be obtained when either the tactile target moves along the body, or when participants execute a movement that involves both the acting and target limbs, as discussed above. However, such considerations are in line with an interpretation that sensory codes are maintained only as long as needed.
In the present study, brain regions that first encoded sensory information overlapped with those that later encoded movement planning (Fig. 7). This finding suggests that a dynamic sensorimotor transformation occurred in these regions once information to plan the pointing movement was available. Such overlap of target location and motor goal location across time has also been identified in the PMd and posterior IPS during visuo-motor tasks54, and equivalent dynamics have been identified in the coding of neurons in the macaque parietal reach region49.
Posterior parietal cortex encodes the movement task rule
We decoded the movement task rule (i.e., pro- vs. anti-pointing movement) from bilateral PPC clusters in the SPL and SPOC. Previous human fMRI studies that employed anti-task visuo-motor paradigms have had mixed success in decoding the task rule from PPC. Some studies did not observe differences in univariate activation evoked during planning of pro- and anti-movements50,54, whereas another study reported regional differences for pro- and anti-pointing movements48. Yet another study was unable to identify regions representing the task rule using a decoding approach55, but used a region of interest approach and may therefore have overlooked task rule encoding in areas outside the inspected regions. In contrast to these human fMRI studies, responses of neurons in monkey PRR were modulated by the pro/anti task rule49, and PPC neurons encoded task rule information independent of specific sensory cues69. In our study, the bilateral SPL and SPOC regions that encoded the task rule almost completely overlapped with regions that encoded the movement goal. The respective left SPL cluster additionally overlapped with the SPL cluster that had previously encoded the anatomical tactile target location. These findings suggest that PPC regions simultaneously encode multiple features at the population level124.
It is of note that the ability to relate sensory cues to arbitrary rules links to the above-discussed framework of limb identification as an intermediate step to tactile-motor interaction. One could view the tactile stimulus as a cue that instructs a movement towards a body part. In this view, tactile-motor transformation is indirect, with the connecting “rule” being a mapping between a body map and space mediated by current body posture.
Dynamic spatial transformations in posterior parietal cortex mediate tactile-motor transformation
Our study reveals the dynamic representation of spatial information during tactile sensorimotor planning. Spatial codes in PPC were characterized by distinct multivariate activation patterns that changed their selectivity from representing sensory information during target localization to representing the movement goal during motor planning. Tactile-spatial sensory information was concurrently encoded in anatomical and external-spatial reference frames. These different codes were evident in distinct PPC locations along an anterior-to-posterior gradient, consistent with recent proposals that PPC contains poles that relate the environment to the body (anterior) and vice versa (posterior)73. Once the movement task rule was specified, tactile information was no longer detectable; instead, a frontoparietal network encoded the location of the movement goal. Information about the tactile stimulus, movement goal and movement task rule converged in the left SPL, suggesting that this region may play a key role within the network that transforms tactile location into a movement plan. Overall, the posterior parietal cortex integrates information from different sensory modalities, provided in different reference frames, and flexibly combines this information to plan movements towards both the environment and the own body.
Methods
Participants
The analyzed sample consisted of 16 students of the University of Hamburg (10 female, 6 male, determined by self-report), mean age 23.8 years (range: 19–30 years). Participants were right-handed according to questionnaire-guided self-report125, had normal or corrected-to-normal vision, and reported being free of any neurological disorders, movement restrictions, or tactile sensitivity problems. Participants provided written informed consent and received course credit or € 8/hour for their participation. Four further participants were excluded from analysis: one had performed >99% of all movements to the incorrect movement goal in the crossed, but not the uncrossed, foot posture; for two participants, MR slice positioning had accidentally omitted part of the SPL; and for one pilot participant, only partial data were collected due to a technical error.
The experiment was approved by the ethics committee of the German Association of Psychology (Deutsche Gesellschaft für Psychologie, DGPs, TB 102011 and TB 102011_add_092014).
Behavioral task
Participants planned and executed right hand movements toward tactile stimuli presented on the feet, while blood oxygenation level dependent (BOLD) signal changes in the brain were recorded using fMRI. A delayed movement task divided tactile localization, movement planning and movement execution into separate trial phases15,58,59,67,91,126. Figure 1 illustrates the four phases of each trial: fixation (duration: 1 TR = 1880 ms), tactile localization on the foot (duration: 1–4 TR = 1880–7520 ms, in steps of 1 TR), movement planning of the right index finger (duration: 1–4 TR = 1880–7520 ms, in steps of 1 TR), and movement execution (duration: 1 TR = 1880 ms). The tactile stimulus occurred pseudo-randomly on the back of the left or right foot, either medially or laterally, approximately two centimeters below the heads of the metatarsal bones. Critically, the tactile stimulus was not informative about the movement that would be required later in the trial: at the beginning of the movement planning delay, a visual cue indicated whether the pointing movement should be planned directly toward the tactile stimulus (pro-movement) or toward its mirrored location (anti-movement). It was only at this point in time that participants could derive the full movement plan. Finally, a visual cue prompted execution of the finger pointing movement. Participants were instructed to point precisely but did not receive feedback on their pointing movement accuracy.
In different experimental blocks, participants assumed either an uncrossed or crossed foot posture. The crossed posture allows anatomical and external locations of tactile stimuli to be dissociated as, for example, a stimulus on the anatomically right foot is then located in the left side of space.
The experiment comprised 3 within-subject factors. Foot Posture (levels: uncrossed vs. crossed) was maintained for 3 runs in a row, with the start posture alternated between sessions and counterbalanced across participants. We chose to minimize posture changes during the experiment to avoid fMRI and movement video recording artefacts due to changes in head and hand position. Stimulated Foot (left vs. right foot; collapsed across the medial and lateral stimulus locations on each foot) and Instructed Movement (pro-movement vs. anti-movement) varied from trial to trial. Trials were distributed into blocks of 33 trials each, with additional rest phases of 11 and 4 TR at the beginning and end of each run, respectively. We minimized dependencies between trials by balancing the trial sequence run-wise such that each condition was followed by every other condition equally often127. These order restrictions required slightly different total trial numbers per experimental condition (48-51 trials) across the entire experiment. We minimized dependencies between the different trial phases within trials by choosing, from 1000 design matrices with randomized delays, the sequence that had the smallest correlations between the predictors in a General Linear Model (GLM) that contained the three factors of our experimental design. We used 4 different randomization protocols (sequence and timing) across the participant sample.
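The design-selection step can be sketched as follows. The boxcar design builder below is a simplified stand-in for the actual GLM construction (which used condition-specific, presumably HRF-convolved predictors), and all sizes are hypothetical.

```python
# Pick, from 1000 random delay sequences, the design whose predictors are
# least correlated. Simplified boxcar stand-in for the actual GLM.
import numpy as np

rng = np.random.default_rng(0)
N_CONDITIONS, N_TRIALS, N_SCANS = 8, 48, 256   # hypothetical sizes

def build_design(delays):
    """Toy design matrix: one boxcar of jittered duration per trial."""
    X = np.zeros((N_SCANS, N_CONDITIONS))
    onsets = rng.choice(N_SCANS - 4, size=N_TRIALS, replace=False)
    for i, (onset, dur) in enumerate(zip(onsets, delays)):
        X[onset:onset + dur, i % N_CONDITIONS] = 1.0
    return X

best_X, best_rmax = None, np.inf
for _ in range(1000):                           # 1000 candidate randomizations
    delays = rng.integers(1, 5, N_TRIALS)       # 1-4 TRs per delay phase
    X = build_design(delays)
    r = np.corrcoef(X.T)                        # predictor-by-predictor correlations
    rmax = np.abs(r[np.triu_indices_from(r, k=1)]).max()
    if rmax < best_rmax:                        # keep the least collinear design
        best_X, best_rmax = X, rmax
```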
Experimental setup
Participants lay supine in the MR scanner with their head stabilized by foam cushions. The right hand was cushioned into a fixed position on the belly at the left-right body midline, with the index finger extended toward the feet. Experimental instructions were displayed on a monitor, projected onto a translucent screen in the scanner bore. Participants saw the projector’s image above their head through a mirror mounted on the head coil. Accordingly, participants directed their gaze upward and could not see their hands and feet. An infrared LED was taped to the index finger to record finger movements with a video camera placed in a window between the MR and control rooms. To improve the visibility of the IR LED, cardboard was placed above the right wrist, just below the chest, to shield light emitted from the projector at the end of the scanner bore.
After three blocks of a recording session, the experimenter passively moved the participant’s legs into the required uncrossed or crossed position. Participants chose before the beginning of the experiment which leg should be on top, based on comfort. The crossed posture was fixed using cushions. If necessary for comfort, the top leg was slightly elevated using additional cushions placed below the knee. Direct skin contact of legs and feet in the crossed posture was prevented using clothing and towels to avoid the generation of conductor loops.
Experimental protocols were synchronized with volume acquisition (TR = 1880 ms) and controlled via the software Presentation, version 17.0 (Neurobehavioral Systems, Albany, USA). The fMRI experiment was conducted in two sessions, each about 2 h long including 1 h of preparation and 1 h of scanning. The two scanning sessions were scheduled between several hours and 3 weeks apart. All experimental conditions were spread equally across the two scanning sessions. Participants completed a total of 12 experimental runs of 8 min duration each. Participants rested between runs and continued with the next run whenever they were ready.
Task practice
Participants practiced the task in a regular laboratory a few days before the first scanning session. They lay supine in a reclinable chair with the visual display mounted above the head to mimic the fMRI environment. During practice, we monitored eye movements with electrooculography (EOG). Participants practiced until they executed the pointing movement at the correct time within the trial, aimed at the correct location, and kept the hand still at all other times. Furthermore, participants fixated a centrally presented cross throughout the task and practiced keeping the body and head still during pointing movements.
Tactile stimulation
For tactile foot stimulation, we applied 2-ms electrical pulses through custom-built electrodes attached to a constant-current electrical stimulator (DS7A; Digitimer, Hertfordshire, United Kingdom). A manual switch (DSM367; Digitimer, Hertfordshire, United Kingdom) ensured that only a single location could be electrically stimulated at any time. One LED was placed near each of the four switch positions; at any time, exactly one LED was illuminated to indicate the currently required electrode, and an experimenter continually set the switch accordingly.
Before the experiment, the experimenter adjusted the current intensity separately for each electrode. The first electrode was adjusted starting at 30 mV and increasing in steps of 30 mV until the participant reliably detected the stimulation. Afterwards, the experimenter adjusted the intensities of the four stimulus locations until the participant indicated that all four stimuli felt equally strong and were all clearly detectable. Between runs, we asked participants whether the tactile stimuli were still clearly detectable, and increased the intensity if stimulation felt weak or decreased it if stimulation had become uncomfortable over time. On average, the selected stimulation intensity was 268.33 mV (range: 60–600 mV).
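The intensity adjustment was performed manually; as a sketch only, the ascending-steps logic could be automated as below, where the `detects` callback, the confirmation count, and the ceiling value are illustrative assumptions.

```python
def find_detection_intensity(detects, start=30, step=30, ceiling=600, n_confirm=3):
    """Raise intensity in fixed steps until detects(intensity) succeeds on
    n_confirm consecutive probes ('reliable' detection); value in mV."""
    intensity = start
    while intensity <= ceiling:
        if all(detects(intensity) for _ in range(n_confirm)):
            return intensity
        intensity += step
    raise RuntimeError("no reliable detection within the tested range")

# usage with a stand-in observer whose detection threshold is 150 mV
print(find_detection_intensity(lambda i: i >= 150))  # -> 150
```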
Eye tracking
Participants had to maintain fixation throughout the experiment to avoid confounding effects of saccade planning and execution. We monitored the right eye with an fMRI-compatible eye tracker (EyeLink; SR Research, Ottawa, Canada) sampling at 250 Hz. The eye tracker was calibrated with a 9-point fixation procedure before each scanning session. Offline, we identified saccades as deviations of more than 2 s.d. from the trial's mean eye position that also exceeded 20 pixels (0.5° of visual angle) on the presentation monitor. We excluded trials containing saccades from the fMRI data analysis.
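The saccade criterion can be made concrete with a minimal sketch; this is one plausible reading of the rule (Euclidean deviation from the trial-mean gaze position), with the synthetic trace and pixel scaling as assumptions.

```python
import numpy as np

def trial_has_saccade(x, y, sd_factor=2.0, min_px=20.0):
    """Flag a trial if any gaze sample deviates from the trial-mean position
    by more than sd_factor standard deviations AND more than min_px pixels
    (20 px = 0.5 deg of visual angle on the presentation monitor)."""
    pos = np.column_stack([x, y])                      # samples x (x, y)
    dev = np.linalg.norm(pos - pos.mean(axis=0), axis=1)
    return bool(np.any((dev > sd_factor * dev.std()) & (dev > min_px)))

# usage: a 250-Hz fixation trace with one brief horizontal excursion
x = np.r_[np.full(500, 512.0), np.full(25, 560.0)]
y = np.full(525, 384.0)
print(trial_has_saccade(x, y))   # True
```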
Hand movement tracking
We recorded finger movements with an IR-sensitive video camera (frame rate: 40 Hz) through a window that directly faced the scanner bore from the control room. In addition to the IR LED on the participant's finger, two color LEDs placed directly in front of the camera signaled the location of the tactile stimulus (left vs. right foot) and the instructed movement (pro- vs. anti-movement) on each trial, allowing us to align video and fMRI data offline. We extracted finger position and movement direction (left vs. right, from the participant's viewpoint) with a custom-made, semi-automated procedure that combined cluster detection, gradual averaging, and image subtraction to automatically detect changes of finger position across frames. Finger movements were assessed for correct left/right direction but not for the specific medial/lateral target on each foot, because the medial and lateral targets were too close together to distinguish the respective movements reliably.
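The image-subtraction component of such a pipeline can be sketched as follows; this simplified frame-differencing example stands in for the actual semi-automated procedure, and the threshold and synthetic frames are assumptions.

```python
import numpy as np

def led_centroid(frame, prev_frame, thresh=30.0):
    """Locate the moving IR LED as the intensity-weighted centroid of pixels
    that brightened between consecutive frames (image subtraction)."""
    diff = frame.astype(float) - prev_frame.astype(float)
    mask = diff > thresh
    if not mask.any():
        return None                       # finger (and LED) did not move
    ys, xs = np.nonzero(mask)
    w = diff[mask]
    return np.average(xs, weights=w), np.average(ys, weights=w)

def pointing_direction(centroids):
    """Classify a movement as left vs. right from net horizontal motion."""
    xs = [c[0] for c in centroids if c is not None]
    return "right" if xs[-1] > xs[0] else "left"

# usage on two synthetic 8-bit frames where a bright spot shifts rightward
a = np.zeros((480, 640), dtype=np.uint8); a[240, 300] = 255
b = np.zeros_like(a); b[240, 320] = 255
print(led_centroid(b, a))   # ~(320.0, 240.0)
```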
Data selection
We acquired 6303 trials after excluding one run of one participant due to a recording error. We excluded trials in which the pointing movement was executed toward the wrong goal location (4.7%), in which saccades occurred (10.2%), or in which both occurred (0.5%). Hand movement data were unavailable or of too low quality to determine hand movements in 34.0% of trials, for instance due to adverse lighting conditions or an elevated leg position; such adverse conditions often affected an entire run (49.3% of missing trials). The number of trials in which hand movements could not be assessed was similar across conditions (range across conditions: 15–19 trials per condition per participant). A 2 (left vs. right tactile stimulation) × 2 (uncrossed vs. crossed posture) × 2 (pro vs. anti task) within-subjects repeated-measures ANOVA showed no main effect or interaction of the three factors on the number of trials in which hand movements could not be reliably assessed.
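A repeated-measures ANOVA of this kind can be reproduced with statsmodels; the data frame below is synthetic (counts drawn in the 15–19 range reported above) and the column names and participant count are illustrative, not the authors' actual analysis script.

```python
import itertools
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = [dict(subject=s, side=si, posture=po, task=ta,
             n_unassessed=int(rng.integers(15, 20)))
        for s, si, po, ta in itertools.product(
            range(1, 13), ["left", "right"],
            ["uncrossed", "crossed"], ["pro", "anti"])]
df = pd.DataFrame(rows)

# 2 x 2 x 2 within-subject ANOVA on the per-condition count of trials
# without assessable hand movements
print(AnovaRM(df, depvar="n_unassessed", subject="subject",
              within=["side", "posture", "task"]).fit())
```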
Eye movement data were unavailable or of low quality in 20.8% of trials, usually due to obstruction by the head coil; 7.6% of trials lacked usable data for both hand and eye movements. Given the comparably low error rate in trials for which all data were available, we did not exclude trials solely because hand or eye data were missing. The final analysis was based on 5396 trials, that is, 85.6% of all recorded trials. The number of trials included in the final analysis was similar across conditions, with an average of 42.2 trials per condition per participant (range across conditions: 41.1–43.1).
FMRI acquisition
We obtained fMRI data with a 3-tesla MR scanner (Siemens, Erlangen, Germany) and a 32-channel head coil, using a T2*-sensitive echo planar imaging (EPI) sequence that acquired 32 axial slices in descending order (in-plane voxel size: 3 × 3 mm; slice thickness: 3 mm; slice gap: 0.51 mm; TR: 1880 ms; TE: 30 ms; flip angle: 70°; FOV: 216 × 216 mm). fMRI scans covered the whole brain except the cerebellum and a small ventral portion of the anterior temporal lobe.
Structural MRI acquisition
We obtained a high-resolution (1 × 1 × 1 mm) structural MRI image of each participant using a T1-weighted MPRAGE sequence with 240 slices, for use in coregistration and normalization of the functional data.
Preprocessing of imaging data
We preprocessed and analyzed fMRI data with Statistical Parametric Mapping (SPM12; http://www.fil.ion.ucl.ac.uk/spm/) running in MATLAB R2015a (The MathWorks, Natick, USA). We discarded the first four volumes of each run to allow for spin saturation. We corrected fMRI data for susceptibility artifacts and rigid-body motion by unwarping and alignment to the first image of the first session. We then corrected the functional images for differences in slice acquisition time and co-registered the individual T1 image to the mean functional image generated during realignment.
Multi-voxel pattern classification analysis
We used a multi-voxel pattern analysis (MVPA) approach to determine whether multivariate patterns of brain activation allow classification of the different types of spatial codes used during tactile processing and movement planning. This approach identifies brain regions that differentiate pairs of characteristics, such as left vs. right foot stimulation or a left- vs. right-side movement goal. Fig. 2, Fig. S1, and Fig. S2 illustrate the pairwise comparisons we conducted to decode tactile location (6 classifiers), movement goal location (1 classifier), and task rule (1 classifier).
We constructed participant-level GLMs optimized for subsequent classification. Spatial interpolation inherent in normalization and smoothing might degrade meaningful activation patterns128; therefore, we calculated GLMs in non-normalized, participant space and without smoothing. We accounted for baseline drifts within runs by applying a high-pass filter (128 s), and for serial dependency within runs by using an autoregressive model. GLMs included 23 predictors that modeled experimentally induced variance of the measured BOLD signal in each voxel with delta functions marking the onsets of the respective delay, convolved with the canonical hemodynamic response function. There were 2 baseline predictors, one per foot posture (uncrossed, crossed), that modeled fixation delays at the beginning of each trial as well as the rest periods at the beginning and end of each run. Four predictors modeled the touch localization delay (uncrossed, crossed foot posture × stimulation of left, right foot). Eight predictors each modeled the planning and the movement execution delays (uncrossed, crossed foot posture × stimulation of left, right foot × pro-, anti-movement). Finally, all trial phases that contained behavioral errors were assigned to a common error predictor. Recall that foot posture was either uncrossed or crossed in a given run; therefore, any given run contained only 12 of the 23 experiment-wide predictors (1 baseline, 2 tactile locations, 4 planning, 4 execution, 1 error). We did not include head motion parameters as nuisance regressors because our unwarping and realignment preprocessing already corrected the fMRI images for movement-related artefacts129, but see Fig. S4 for results with motion parameters included.
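To make the predictor construction concrete, the sketch below builds one design-matrix column by convolving onset delta functions with a common double-gamma approximation of the canonical HRF; the HRF parameters and onset times are illustrative assumptions, not SPM's exact implementation.

```python
import numpy as np
from scipy.stats import gamma

TR = 1.88  # seconds, matching the acquisition described above

def canonical_hrf(tr=TR, duration=32.0):
    """Double-gamma HRF approximation: positive response peaking ~5-6 s,
    undershoot ~16 s, undershoot ratio 1/6."""
    t = np.arange(0.0, duration, tr)
    h = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return h / np.abs(h).sum()

def onset_regressor(onsets_s, n_scans, tr=TR):
    """One GLM column: delta functions at event onsets, convolved with the HRF."""
    stick = np.zeros(n_scans)
    stick[(np.asarray(onsets_s) // tr).astype(int)] = 1.0
    return np.convolve(stick, canonical_hrf(tr))[:n_scans]

# e.g. the touch-localization predictor for 'uncrossed, left foot' in one run
# (onset times in seconds are hypothetical)
col = onset_regressor([9.4, 43.2, 78.9], n_scans=250)
```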
We performed MVPA with The Decoding Toolbox128 version 3.997, using run-wise β-images for the different trial phases and experimental conditions estimated by the GLM that reflected the voxel-wise amplitude of the hemodynamic response function. We used an L2-norm support vector machine (SVM) as classifier in the implementation of LIBSVM130, with a fixed cost of c = 1. For whole-brain unbiased voxel selection, we applied a spherical searchlight with a radius of 4 voxels60. On each classification fold, the classifier was trained with input patterns from run-wise β-images to differentiate between two classes, such as movement goal left vs. movement goal right. Note that the two decoded classes were always present on both uncrossed and crossed runs, thus balancing any overall response differences between different postures. Classifier performance was validated with a leave-one-out cross validation design that predicted each of the 12 runs after training based on the respective 11 other runs. We report the mean decoding accuracy across all classification iterations, depicted at the center voxel of a given searchlight region, as the measure of the overall generalization performance of the classifier of that searchlight region. We repeated this procedure for all recorded voxels, resulting in a whole-brain map of averaged decoding accuracy for the two tested conditions across all test runs. For group-level statistical analysis, we normalized participant-level accuracy maps to MNI space based on the transformation parameters obtained during segmentation of the T1 image, and applied 6 mm Gaussian kernel smoothing. Across participants and for each classifier, we used the SnPM13 toolbox to test which brain voxels contained accuracy values that differed significantly from chance-level (50%) using a one-sample t-test131,132. Group-level tests covered the whole-brain except for the cerebellum and a small ventral portion of the anterior temporal lobe. Significance was determined using whole-brain cluster-based permutation tests using an initial threshold of p < 0.001 and a secondary family-wise error (FWE) correction rate of p < 0.05133. Voxels that survive this correction for multiple comparisons indicate locations where the decoding accuracy of the classifier is significantly better than chance for the differentiation between the two tested conditions. Put differently, multivariate patterns surrounding the identified voxels contain information about the differentiation between the two classes of interest, which is interpreted as their feature representation60,66,134,135,136,137. Visualizations of significantly higher than chance classification accuracy are based on mapping of activations identified in the 3D brain volume onto an inflated atlas of the cortical surface (Conte-69 atlas)138 using the Computerized Anatomical Reconstruction and Editing Toolkit (Caret)139.
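The core cross-validation step can be sketched with scikit-learn, whose SVC also wraps LIBSVM; this is a toy stand-in for a single searchlight, with the voxel count and effect size as illustrative assumptions rather than the real pipeline.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

def searchlight_accuracy(betas, labels, runs):
    """Mean leave-one-run-out accuracy of a linear SVM (cost C = 1) for one
    searchlight; betas is (n_beta_images, n_voxels), runs holds run indices."""
    scores = cross_val_score(SVC(kernel="linear", C=1.0), betas, labels,
                             groups=runs, cv=LeaveOneGroupOut())
    return scores.mean()

# toy data: 12 runs x 2 classes (e.g. movement goal left vs. right)
rng = np.random.default_rng(0)
runs = np.repeat(np.arange(12), 2)
labels = np.tile([0, 1], 12)
betas = rng.normal(size=(24, 100)) + 0.5 * labels[:, None]
print(searchlight_accuracy(betas, labels, runs))  # well above chance (0.5)
```

Repeating this computation with the searchlight centered on every recorded voxel would produce the whole-brain accuracy map described above.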
Regions of interest (ROI)
We visualized decoding accuracy with an adapted approach that accommodates inter-individual differences in anatomical and functional brain organization61,140. First, we defined regions of interest (ROIs) based on the group-level t-map of significant above-chance decoding accuracy. For distinct local clusters, ROIs were created as spheres with a 6 mm radius centered on the voxel with the highest statistical significance for decoding. For large clusters, we created several ROIs based on peaks within the cluster, theoretical considerations, and the delineation of regions suggested by previous studies. Second, for each participant, we extracted the decoding accuracy of all voxels contained in each group ROI and identified the voxel with the highest decoding accuracy. We then created participant-specific ROIs around these individual peak voxels, again as spheres with a 6 mm radius. ROIs that overlapped partially were combined whenever they spanned a similar anatomical location and showed similar means and variances of decoding accuracy across participants. Group means, individual means, and bootstrapped confidence intervals of decoding accuracy were calculated with the Hmisc package141 in R142 and visualized with ggplot2143. We determined the structural brain regions associated with each ROI using the Anatomy Toolbox for SPM144 and labeled them based on previous studies of goal-directed reaching and pointing109.
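The two-step ROI procedure can be sketched as follows; the sketch assumes isotropic 3 mm voxels (a simplification of the actual grid) and uses a synthetic accuracy volume for the usage example.

```python
import numpy as np

def sphere_mask(shape, center_vox, radius_mm=6.0, voxel_mm=3.0):
    """Boolean mask of all voxels within radius_mm of center_vox."""
    grids = np.indices(shape)
    dist = np.sqrt(sum((g - c) ** 2 for g, c in zip(grids, center_vox)))
    return dist * voxel_mm <= radius_mm

def participant_roi(accuracy_map, group_roi):
    """6-mm sphere around the participant's peak-accuracy voxel inside the
    group ROI, mirroring the two-step procedure described above."""
    masked = np.where(group_roi, accuracy_map, -np.inf)
    peak = np.unravel_index(np.argmax(masked), accuracy_map.shape)
    return sphere_mask(accuracy_map.shape, peak)

# usage on a toy accuracy volume with a group ROI around voxel (20, 20, 20)
acc = np.random.default_rng(2).uniform(0.4, 0.7, size=(40, 48, 40))
roi = participant_roi(acc, sphere_mask(acc.shape, (20, 20, 20)))
print(acc[roi].mean())
```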
Reporting summary
Further information on research design is available in the Nature Portfolio Reporting Summary linked to this article.
Data availability
The processed data generated in this study have been deposited in an OSF repository at https://doi.org/10.17605/OSF.IO/5BN2V. The raw data are protected and not available due to subject confidentiality requirements. The Conte-69 atlas is available as part of the Caret software package at https://sites.wustl.edu/vanessenlab/resources/. The ROI classification accuracy data generated in this study are provided in the Source Data file. Source data are provided with this paper.
Code availability
Code is available at: https://doi.org/10.17605/OSF.IO/5BN2V.
References
Penfield, W. & Boldrey, E. Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain 60, 389–443 (1937).
Roux, F., Djidjeli, I. & Durand, J. Functional architecture of the somatosensory homunculus detected by electrostimulation. J. Physiol. 596, 941–956 (2018).
Yamamoto, S. & Kitazawa, S. Reversal of subjective temporal order due to arm crossing. Nat. Neurosci. 4, 759–765 (2001).
Shore, D. I., Spry, E. & Spence, C. Confusing the mind by crossing the hands. Cogn. Brain Res. 14, 153–163 (2002).
Tamè, L., Azañón, E. & Longo, M. R. A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Front. Psychol. 10, 291 (2019).
Medina, J. & Coslett, H. B. From maps to form to space: Touch and the body schema. Neuropsychologia 48, 645–654 (2010).
Heed, T., Buchholz, V. N., Engel, A. K. & Röder, B. Tactile remapping: from coordinate transformation to integration in sensorimotor processing. Trends Cogn. Sci. 19, 251–258 (2015).
Heed, T., Backhaus, J., Röder, B. & Badde, S. Disentangling the External Reference Frames Relevant to Tactile Localization. PLOS ONE 11, e0158829 (2016).
Batista, A. P., Buneo, C. A., Snyder, L. H. & Andersen, R. A. Reach plans in eye-centered coordinates. Science 285, 257–260 (1999).
Bernier, P.-M. & Grafton, S. T. Human posterior parietal cortex flexibly determines reference frames for reaching based on sensory context. Neuron 68, 776–788 (2010).
Bhattacharyya, R., Musallam, S. & Andersen, R. A. Parietal Reach Region Encodes Reach Depth Using Retinal Disparity and Vergence Angle Signals. J. Neurophysiol. 102, 805–816 (2009).
Bosco, A., Breveglieri, R., Hadjidimitrakis, K., Galletti, C. & Fattori, P. Reference frames for reaching when decoupling eye and target position in depth and direction. Sci. Rep. 6, 21646 (2016).
Cohen, Y. E. & Andersen, R. A. A common reference frame for movement plans in the posterior parietal cortex. Nat. Rev. Neurosci. 3, 553–562 (2002).
Marzocchi, N., Breveglieri, R., Galletti, C. & Fattori, P. Reaching activity in parietal area V6A of macaque: eye influence on arm activity or retinocentric coding of reaching movements? Eur. J. Neurosci. 27, 775–789 (2008).
Medendorp, W. P., Goltz, H. C., Vilis, T. & Crawford, J. D. Gaze-Centered Updating of Visual Space in Human Parietal Cortex. J. Neurosci. 23, 6209–6214 (2003).
Pesaran, B., Nelson, M. J. & Andersen, R. A. Dorsal Premotor Neurons Encode the Relative Position of the Hand, Eye, and Goal during Reach Planning. Neuron 51, 125–134 (2006).
Bremner, L. R. & Andersen, R. A. Coding of the Reach Vector in Parietal Area 5d. Neuron 75, 342–351 (2012).
Bremner, L. R. & Andersen, R. A. Temporal Analysis of Reference Frames in Parietal Cortex Area 5d during Reach Planning. J. Neurosci. 34, 5273–5284 (2014).
Brozzoli, C., Gentile, G., Petkova, V. I. & Ehrsson, H. H. fMRI Adaptation Reveals a Cortical Mechanism for the Coding of Space Near the Hand. J. Neurosci. 31, 9023–9031 (2011).
Brozzoli, C., Gentile, G. & Ehrsson, H. H. That’s near my hand! Parietal and premotor coding of hand-centered space contributes to localization and self-attribution of the hand. J. Neurosci. 32, 14573–14582 (2012).
Buneo, C. A., Jarvis, M. R., Batista, A. P. & Andersen, R. A. Direct visuomotor transformations for reaching. Nature 416, 632–636 (2002).
Ferraina, S. et al. Reaching in Depth: Hand Position Dominates over Binocular Eye Position in the Rostral Superior Parietal Lobule. J. Neurosci. 29, 11461–11470 (2009).
Makin, T. R., Holmes, N. P. & Zohary, E. Is That Near My Hand? Multisensory Representation of Peripersonal Space in Human Intraparietal Sulcus. J. Neurosci. 27, 731–740 (2007).
Piserchia, V. et al. Mixed Body/Hand Reference Frame for Reaching in 3D Space in Macaque Parietal Area PEc. Cereb. Cortex 27, 1976–1990 (2016).
Duhamel, J.-R., Colby, C. L. & Goldberg, M. E. Ventral intraparietal area of the macaque: congruent visual and somatic response properties. J. Neurophysiol. 79, 126–136 (1998).
Foster, C., Sheng, W.-A., Heed, T. & Ben Hamed, S. The macaque ventral intraparietal area has expanded into three homologue human parietal areas. Prog. Neurobiol. 209, 102185 (2022).
Graziano, M. S. A. & Cooke, D. F. Parieto-frontal interactions, personal space, and defensive behavior. Neuropsychologia 44, 845–859 (2006).
Chen, X., DeAngelis, G. C. & Angelaki, D. E. Flexible egocentric and allocentric representations of heading signals in parietal cortex. Proc. Natl Acad. Sci. 115, E3305–E3312 (2018).
Leoné, F. T. M., Monaco, S., Henriques, D. Y. P., Toni, I. & Medendorp, W. P. Flexible Reference Frames for Grasp Planning in Human Parietofrontal Cortex. eNeuro 2, ENEURO.0008-15.2015 (2015).
Galletti, C. & Fattori, P. The dorsal visual stream revisited: Stable circuits or dynamic pathways? Cortex 98, 203–217 (2018).
Seelke, A. M. H. et al. Topographic Maps within Brodmann’s Area 5 of Macaque Monkeys. Cereb. Cortex 22, 1834–1850 (2012).
Huang, R.-S., Chen, C.-F., Tran, A. T., Holstein, K. L. & Sereno, M. I. Mapping multisensory parietal face and body areas in humans. Proc. Natl Acad. Sci. 109, 18114–18119 (2012).
Sereno, M. I. & Huang, R.-S. A human parietal face area contains aligned head-centered visual and tactile maps. Nat. Neurosci. 9, 1337–1343 (2006).
Buchholz, V. N., Jensen, O. & Medendorp, W. P. Parietal oscillations code nonvisual reach targets relative to gaze and body. J. Neurosci. 33, 3492–3499 (2013).
Schubert, J. T. W. et al. Oscillatory activity reflects differential use of spatial reference frames by sighted and blind individuals in tactile attention. NeuroImage 117, 417–428 (2015).
Schubert, J. T. W. et al. Alpha-band oscillations reflect external spatial coding for tactile stimuli in sighted, but not in congenitally blind humans. Sci. Rep. 9, 9215 (2019).
Azañón, E., Longo, M. R., Soto-Faraco, S. & Haggard, P. The posterior parietal cortex remaps touch into external space. Curr. Biol. 20, 1304–1309 (2010).
Bolognini, N. & Maravita, A. Proprioceptive alignment of visual and somatosensory maps in the posterior parietal cortex. Curr. Biol. 17, 1890–1895 (2007).
Ruzzoli, M. & Soto-Faraco, S. Alpha stimulation of the human parietal cortex attunes tactile perception to external space. Curr. Biol. 24, 329–332 (2014).
Filimon, F., Nelson, J. D., Huang, R.-S. & Sereno, M. I. Multiple Parietal Reach Regions in Humans: Cortical Representations for Visual and Proprioceptive Feedback during On-Line Reaching. J. Neurosci. 29, 2961–2971 (2009).
Parkinson, A., Condon, L. & Jackson, S. R. Parietal cortex coding of limb posture: In search of the body-schema. Neuropsychologia 48, 3228–3234 (2010).
Pellijeff, A., Bonilha, L., Morgan, P. S., McKenzie, K. & Jackson, S. R. Parietal updating of limb posture: An event-related fMRI study. Neuropsychologia 44, 2685–2690 (2006).
Buchholz, V. N., Jensen, O. & Medendorp, W. P. Multiple reference frames in cortical oscillatory activity during tactile remapping for saccades. J. Neurosci. 31, 16864–16871 (2011).
Macaluso, E., Frith, C. D. & Driver, J. Delay Activity and Sensory-Motor Translation During Planned Eye or Hand Movements to Visual or Tactile Targets. J. Neurophysiol. 98, 3081–3094 (2007).
Chafee, M. V. & Crowe, D. A. Thinking in spatial terms: decoupling spatial representation from sensorimotor control in monkey posterior parietal areas 7a and LIP. Front. Integr. Neurosci. 6, 112 (2013).
Freedman, D. J. & Ibos, G. An Integrative Framework for Sensory, Motor, and Cognitive Functions of the Posterior Parietal Cortex. Neuron 97, 1219–1234 (2018).
Oristaglio, J., Schneider, D. M., Balan, P. F. & Gottlieb, J. Integration of Visuospatial and Effector Information during Symbolically Cued Limb Movements in Monkey Lateral Intraparietal Area. J. Neurosci. 26, 8310–8319 (2006).
Connolly, J. D., Goodale, M. A., Desouza, J. F. X., Menon, R. S. & Vilis, T. A Comparison of Frontoparietal fMRI Activation During Anti-Saccades and Anti-Pointing. J. Neurophysiol. 84, 1645–1655 (2000).
Gail, A. & Andersen, R. A. Neural Dynamics in Monkey Parietal Reach Region Reflect Context-Specific Sensorimotor Transformations. J. Neurosci. 26, 9376–9384 (2006).
Gertz, H. & Fiehler, K. Human posterior parietal cortex encodes the movement goal in a pro-/anti-reach task. J. Neurophysiol. 114, 170–183 (2015).
Hallett, P. E. Primary and secondary saccades to goals defined by instructions. Vis. Res. 18, 1279–1296 (1978).
Munoz, D. P. & Everling, S. Look away: the anti-saccade task and the voluntary control of eye movement. Nat. Rev. Neurosci. 5, 218–228 (2004).
Zhang, M. & Barash, S. Neuronal switching of sensorimotor transformations for antisaccades. Nature 408, 971–975 (2000).
Cappadocia, D. C., Monaco, S., Chen, Y., Blohm, G. & Crawford, J. D. Temporal Evolution of Target Representation, Movement Direction Planning, and Reach Execution in Occipital–Parietal–Frontal Cortex: An fMRI Study. Cereb. Cortex 27, 5242–5260 (2017).
Gertz, H., Lingnau, A. & Fiehler, K. Decoding Movement Goals from the Fronto-Parietal Reach Network. Front. Hum. Neurosci. 11, 84 (2017).
Medendorp, W. P., Goltz, H. C. & Vilis, T. Remapping the Remembered Target Location for Anti-Saccades in Human Posterior Parietal Cortex. J. Neurophysiol. 94, 734–740 (2005).
Van Der Werf, J., Jensen, O., Fries, P. & Medendorp, W. P. Gamma-Band Activity in Human Posterior Parietal Cortex Encodes the Motor Goal during Delayed Prosaccades and Antisaccades. J. Neurosci. 28, 8397–8405 (2008).
Heed, T., Beurze, S. M., Toni, I., Röder, B. & Medendorp, W. P. Functional Rather than Effector-Specific Organization of Human Posterior Parietal Cortex. J. Neurosci. 31, 3066–3076 (2011).
Heed, T., Leone, F. T. M., Toni, I. & Medendorp, W. P. Functional versus effector-specific organization of the human posterior parietal cortex: revisited. J. Neurophysiol. 116, 1885–1899 (2016).
Kriegeskorte, N., Goebel, R. & Bandettini, P. Information-based functional brain mapping. Proc. Natl Acad. Sci. USA 103, 3863–3868 (2006).
Ariani, G., Wurm, M. F. & Lingnau, A. Decoding Internally and Externally Driven Movement Plans. J. Neurosci. 35, 14160–14171 (2015).
Barany, D. A., Della-Maggiore, V., Viswanathan, S., Cieslak, M. & Grafton, S. T. Feature Interactions Enable Decoding of Sensorimotor Transformations for Goal-Directed Movement. J. Neurosci. 34, 6860–6873 (2014).
Gallivan, J. P., McLean, D. A., Valyear, K. F., Pettypiece, C. E. & Culham, J. C. Decoding Action Intentions from Preparatory Brain Activity in Human Parieto-Frontal Networks. J. Neurosci. 31, 9599–9610 (2011).
Kim, J., Bülthoff, I., Kim, S.-P. & Bülthoff, H. H. Shared neural representations of tactile roughness intensities by somatosensation and touch observation using an associative learning method. Sci. Rep. 9, 77 (2019).
van Kemenade, B. M. et al. Tactile and visual motion direction processing in hMT+/V5. NeuroImage 84, 420–427 (2014).
Kaplan, J. T., Man, K. & Greening, S. G. Multivariate cross-classification: applying machine learning techniques to characterize abstraction in neural representations. Front. Hum. Neurosci. 9, 151 (2015).
Beurze, S. M., de Lange, F. P., Toni, I. & Medendorp, W. P. Integration of Target and Effector Information in the Human Brain During Reach Planning. J. Neurophysiol. 97, 188–199 (2007).
Buneo, C. A. & Andersen, R. A. The posterior parietal cortex: Sensorimotor interface for the planning and online control of visually guided movements. Neuropsychologia 44, 2594–2606 (2006).
Stoet, G. & Snyder, L. H. Single Neurons in Posterior Parietal Cortex of Monkeys Encode Cognitive Set. Neuron 42, 1003–1012 (2004).
Beurze, S. M., de Lange, F. P., Toni, I. & Medendorp, W. P. Spatial and Effector Processing in the Human Parietofrontal Network for Reaches and Saccades. J. Neurophysiol. 101, 3053–3062 (2009).
Beurze, S. M., Toni, I., Pisella, L. & Medendorp, W. P. Reference Frames for Reach Planning in Human Parietofrontal Cortex. J. Neurophysiol. 104, 1736–1745 (2010).
Sereno, M. I. & Huang, R.-S. Multisensory maps in parietal cortex. Curr. Opin. Neurobiol. 24, 39–46 (2014).
Medendorp, W. P. & Heed, T. State estimation in posterior parietal cortex: Distinct poles of environmental and bodily states. Prog. Neurobiol. 183, 101691 (2019).
Dietrich, C. et al. Dermatomal Organization of SI Leg Representation in Humans: Revising the Somatosensory Homunculus. Cereb. Cortex 27, 4564–4569 (2017).
Avanzini, P. et al. Four-dimensional maps of the human somatosensory system. Proc. Natl Acad. Sci. 113, E1936–E1943 (2016).
Pleger, B. & Villringer, A. The human somatosensory system: From perception to decision making. Prog. Neurobiol. 103, 76–97 (2013).
Ruben, J. et al. Somatotopic Organization of Human Secondary Somatosensory Cortex. Cereb. Cortex 11, 463–473 (2001).
Maij, F., Seegelke, C., Medendorp, W. P. & Heed, T. External location of touch is constructed post-hoc based on limb choice. eLife 9, e57804 (2020).
Badde, S., Röder, B. & Heed, T. Feeling a Touch to the Hand on the Foot. Curr. Biol. 29, 1491–1497.e4 (2019).
Badde, S. & Heed, T. The hands’ default location guides tactile spatial selectivity. Proc. Natl Acad. Sci. 120, e2209680120 (2023).
Konen, C. S. & Kastner, S. Representation of Eye Movements and Stimulus Motion in Topographically Organized Areas of Human Posterior Parietal Cortex. J. Neurosci. 28, 8361–8375 (2008).
Medendorp, W. P., Goltz, H. C., Crawford, J. D. & Vilis, T. Integration of Target and Effector Information in Human Posterior Parietal Cortex for the Planning of Action. J. Neurophysiol. 93, 954–962 (2005).
Sereno, M. I., Pitzalis, S. & Martinez, A. Mapping of Contralateral Space in Retinotopic Coordinates by a Parietal Cortical Area in Humans. Science 294, 1350–1354 (2001).
Brotchie, P. R., Andersen, R. A., Snyder, L. H. & Goodman, S. J. Head position signals used by parietal neurons to encode locations of visual stimuli. Nature 375, 232–235 (1995).
Snyder, L. H., Grieve, K. L., Brotchie, P. & Andersen, R. A. Separate body- and world-referenced representations of visual space in parietal cortex. Nature 394, 887–891 (1998).
Stricanne, B., Andersen, R. A. & Mazzoni, P. Eye-centered, head-centered, and intermediate coding of remembered sound locations in area LIP. J. Neurophysiol. 76, 2071–2076 (1996).
Bisley, J. W. & Goldberg, M. E. Attention, Intention, and Priority in the Parietal Lobe. Annu. Rev. Neurosci. 33, 1–21 (2010).
Gottlieb, J. P., Kusunoki, M. & Goldberg, M. E. The representation of visual salience in monkey parietal cortex. Nature 391, 481–484 (1998).
Quiroga, R. Q., Snyder, L. H., Batista, A. P., Cui, H. & Andersen, R. A. Movement Intention Is Better Predicted than Attention in the Posterior Parietal Cortex. J. Neurosci. 26, 3615–3620 (2006).
Chen, Y. et al. Allocentric versus Egocentric Representation of Remembered Reach Targets in Human Cortex. J. Neurosci. 34, 12515–12526 (2014).
Gallivan, J. P., McLean, D. A., Smith, F. W. & Culham, J. C. Decoding Effector-Dependent and Effector-Independent Movement Intentions from Human Parieto-Frontal Brain Activity. J. Neurosci. 31, 17149–17168 (2011).
Wall, M. B. & Smith, A. T. The Representation of Egomotion in the Human Brain. Curr. Biol. 18, 191–194 (2008).
Renzi, C. et al. Spatial remapping in the audio-tactile ventriloquism effect: A TMS investigation on the role of the ventral intraparietal area. J. Cogn. Neurosci. 25, 790–801 (2013).
Sterzi, R. et al. Hemianopia, hemianaesthesia, and hemiplegia after right and left hemisphere damage. A hemispheric difference. J. Neurol. Neurosurg. Psychiatry 56, 308–310 (1993).
Vallar, G. Spatial Frames of Reference and Somatosensory Processing: A Neuropsychological Perspective. Philos. Trans. Biol. Sci. 352, 1401–1409 (1997).
Lloyd, D. M., Shore, D. I., Spence, C. & Calvert, G. A. Multisensory representation of limb position in human premotor cortex. Nat. Neurosci. 6, 17–18 (2003).
Crollen, V. et al. Visual Experience Shapes the Neural Networks Remapping Touch into External Space. J. Neurosci. 37, 10097–10103 (2017).
Miller, L. E. et al. A neural surveyor to map touch on the body. Proc. Natl Acad. Sci. 119, e2102233118 (2022).
Badde, S., Heed, T. & Röder, B. Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach. Psychon. Bull. Rev. 23, 387–404 (2015).
Badde, S. & Heed, T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cogn. Neuropsychol. 33, 1–22 (2016).
Mueller, S. & Fiehler, K. Mixed body- and gaze-centered coding of proprioceptive reach targets after effector movement. Neuropsychologia 87, 63–73 (2016).
Heed, T. & Röder, B. Common anatomical and external coding for hands and feet in tactile attention: evidence from event-related potentials. J. Cogn. Neurosci. 22, 184–202 (2010).
Wada, M. et al. Spatio-Temporal Updating in the Left Posterior Parietal Cortex. PLoS ONE 7, e39800 (2012).
Dekleva, B. M., Kording, K. P. & Miller, L. E. Single reach plans in dorsal premotor cortex during a two-target task. Nat. Commun. 9, 3556 (2018).
Shushruth, S., Zylberberg, A. & Shadlen, M. N. Sequential sampling from memory underlies action selection during abstract decision-making. Curr. Biol. 32, 1949–1960.e5 (2022).
Blangero, A., Menz, M. M., McNamara, A. & Binkofski, F. Parietal modules for reaching. Neuropsychologia 47, 1500–1507 (2009).
Culham, J. C. & Valyear, K. F. Human parietal cortex in action. Curr. Opin. Neurobiol. 16, 205–212 (2006).
Filimon, F. Human Cortical Control of Hand Movements: Parietofrontal Networks for Reaching, Grasping, and Pointing. Neuroscientist 16, 388–407 (2010).
Gallivan, J. P. & Culham, J. C. Neural coding within human brain areas involved in actions. Curr. Opin. Neurobiol. 33, 141–149 (2015).
Medendorp, W. P., Buchholz, V. N., Van Der Werf, J. & Leoné, F. T. M. Parietofrontal circuits in goal‐oriented behaviour. Eur. J. Neurosci. 33, 2017–2027 (2011).
Vesia, M. & Crawford, J. D. Specialization of reach function in human posterior parietal cortex. Exp. Brain Res. 221, 1–18 (2012).
Bernier, P.-M., Cieslak, M. & Grafton, S. T. Effector selection precedes reach planning in the dorsal parietofrontal cortex. J. Neurophysiol. 108, 57–68 (2012).
Fernandez-Ruiz, J., Goltz, H. C., DeSouza, J. F. X., Vilis, T. & Crawford, J. D. Human Parietal “Reach Region” Primarily Encodes Intrinsic Visual Direction, Not Extrinsic Movement Direction, in a Visual–Motor Dissociation Task. Cereb. Cortex 17, 2283–2292 (2007).
Fabbri, S., Strnad, L., Caramazza, A. & Lingnau, A. Overlapping representations for grip type and reach direction. NeuroImage 94, 138–146 (2014).
Gallivan, J. P., McLean, D. A., Flanagan, J. R. & Culham, J. C. Where One Hand Meets the Other: Limb-Specific and Action-Dependent Movement Plans Decoded from Preparatory Signals in Single Human Frontoparietal Brain Areas. J. Neurosci. 33, 1991–2008 (2013).
Gallivan, J. P., Johnsrude, I. S. & Flanagan, J. R. Planning Ahead: Object-Directed Sequential Actions Decoded from Human Frontoparietal and Occipitotemporal Networks. Cereb. Cortex 26, 708–730 (2015).
Astafiev, S. V. et al. Functional Organization of Human Intraparietal and Frontal Cortex for Attending, Looking, and Pointing. J. Neurosci. 23, 4689–4699 (2003).
Konen, C. S., Mruczek, R. E. B., Montoya, J. L. & Kastner, S. Functional organization of human posterior parietal cortex: grasping- and reaching-related activations relative to topographically organized cortex. J. Neurophysiol. 109, 2897–2908 (2013).
Cavina-Pratesi, C. et al. Human neuroimaging reveals the subcomponents of grasping, reaching and pointing actions. Cortex 98, 128–148 (2018).
Fuchs, X., Wulff, D. U. & Heed, T. Online Sensory Feedback During Active Search Improves Tactile Localization. J. Exp. Psychol. Hum. Percept. Perform. 46, 697–715 (2020).
Arikan, B. E., Voudouris, D., Voudouri-Gertz, H., Sommer, J. & Fiehler, K. Reach-relevant somatosensory signals modulate activity in the tactile suppression network. NeuroImage 236, 118000 (2021).
Gertz, H., Voudouris, D. & Fiehler, K. Reach-relevant somatosensory signals modulate tactile suppression. J. Neurophysiol. 117, 2262–2268 (2017).
Graziano, M. S. A. Ethological Action Maps: A Paradigm Shift for the Motor Cortex. Trends Cogn. Sci. 20, 121–132 (2016).
Pouget, A., Dayan, P. & Zemel, R. Information processing with population codes. Nat. Rev. Neurosci. 1, 125–132 (2000).
Oldfield, R. C. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 9, 97–113 (1971).
Leoné, F. T. M., Heed, T., Toni, I. & Medendorp, W. P. Understanding Effector Selectivity in Human Posterior Parietal Cortex by Combining Information Patterns and Activation Measures. J. Neurosci. 34, 7102–7112 (2014).
Brooks, J. L. Counterbalancing for serial order carryover effects in experimental condition orders. Psychol. Methods 17, 600–614 (2012).
Hebart, M. N., Görgen, K. & Haynes, J.-D. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data. Front. Neuroinform. 8, 88 (2015).
Andersson, J. L. R., Hutton, C., Ashburner, J., Turner, R. & Friston, K. Modeling Geometric Deformations in EPI Time Series. NeuroImage 13, 903–919 (2001).
Chang, C.-C. & Lin, C.-J. LIBSVM: A Library for Support Vector Machines. ACM Trans. Intell. Syst. Technol. 2, 27:1–27:27 (2011).
Haxby, J. V. et al. Distributed and Overlapping Representations of Faces and Objects in Ventral Temporal Cortex. Science 293, 2425–2430 (2001).
Haynes, J.-D. et al. Reading Hidden Intentions in the Human Brain. Curr. Biol. 17, 323–328 (2007).
Nichols, T. E. & Holmes, A. P. Nonparametric permutation tests for functional neuroimaging: A primer with examples. Hum. Brain Mapp. 15, 1–25 (2002).
Haxby, J. V., Connolly, A. C. & Guntupalli, J. S. Decoding Neural Representational Spaces Using Multivariate Pattern Analysis. Annu. Rev. Neurosci. 37, 435–456 (2014).
Haynes, J.-D. A Primer on Pattern-Based Approaches to fMRI: Principles, Pitfalls, and Perspectives. Neuron 87, 257–270 (2015).
Haynes, J.-D. & Rees, G. Decoding mental states from brain activity in humans. Nat. Rev. Neurosci. 7, 523–534 (2006).
Norman, K. A., Polyn, S. M., Detre, G. J. & Haxby, J. V. Beyond mind-reading: multi-voxel pattern analysis of fMRI data. Trends Cogn. Sci. 10, 424–430 (2006).
Van Essen, D. C. et al. Parcellations and Hemispheric Asymmetries of Human Cerebral Cortex Analyzed on Surface-Based Atlases. Cereb. Cortex 22, 2241–2262 (2012).
Van Essen, D. C. Cortical cartography and Caret software. NeuroImage 62, 757–764 (2012).
Oosterhof, N. N., Tipper, S. P. & Downing, P. E. Viewpoint (In)dependence of Action Representations: An MVPA Study. J. Cogn. Neurosci. 24, 975–989 (2012).
Harrell, F. E. Hmisc: A package of miscellaneous R functions. Programs available from https://CRAN.R-project.org/package=Hmisc (2018).
R Core Team. R: A Language and Environment for Statistical Computing (R Foundation for Statistical Computing, 2014).
Wickham, H. ggplot2: Elegant Graphics for Data Analysis (Springer, 2009).
Eickhoff, S. B. et al. A new SPM toolbox for combining probabilistic cytoarchitectonic maps and functional imaging data. NeuroImage 25, 1325–1335 (2005).
Acknowledgements
We thank Selina Pradel and Lara Pleil for help with data acquisition. This work was funded by an Emmy Noether grant of the German Research Foundation (DFG) to T.H. (He 6368/1-1, 1-2, 1-3) and a DFG grant to T.H. (He 6368/4-1) within the DFG/ANR program for German-French Projects in the Natural, Life, and Engineering Sciences. W.P.M. was supported for this work by the Netherlands Organization for Scientific Research (NWO-VICI: 453-11-001) and is currently supported by NWA-ORC 1292.19.298.
Author information
Contributions
J.K. contributed to study design, data acquisition, planning and execution of the analysis, figure creation, and writing of the manuscript. She was responsible for data acquisition. C.F. contributed to data analysis, figure creation, and writing of the manuscript. W.P.M. contributed to study design, planning of the analysis, and writing of the manuscript. T.H. contributed to study design, planning of the analysis and writing of the manuscript, and supervised the study.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Communications thanks Gregory Króliczak and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Klautke, J., Foster, C., Medendorp, W.P. et al. Dynamic spatial coding in parietal cortex mediates tactile-motor transformation. Nat Commun 14, 4532 (2023). https://doi.org/10.1038/s41467-023-39959-4