Abstract
The hand explores the environment to obtain tactile information that can be fruitfully integrated with other functions, such as vision, audition, and movement. In theory, somatosensory signals gathered by the hand are accurately mapped into the world-centered (allocentric) reference frame such that multi-modal signals, whether visual-tactile or motor-tactile, are perfectly aligned. However, an accumulating body of evidence indicates that the perceived tactile orientation or direction is inaccurate, yielding a surprisingly large perceptual bias. To investigate this perceptual bias, the present study presented tactile motion stimuli to healthy adult participants in a variety of finger and head postures and asked the participants to report the perceived direction of motion mapped onto a video screen placed on the frontoparallel plane in front of the eyes. Experimental results showed that the perceptual bias could be divided into systematic and nonsystematic components. Systematic bias, defined as the mean difference between the perceived and veridical directions, correlated linearly with the relative posture between the finger and the head. By contrast, nonsystematic bias, defined as the minor direction-dependent difference in bias, was highly individualized and phase-locked to the stimulus orientation presented on the skin. Overall, the present findings on systematic bias indicate that the transformation bias among the reference frames is dominated by the finger-to-head posture. Moreover, the highly individualized nature of the nonsystematic bias reflects how information is obtained by the orientation-selective units in the S1 cortex.
Introduction
A hallmark of hand function is to manipulate objects and acquire tactile information, a process denoted as haptics. Furthermore, human manipulation of objects in a series of haptic processes requires an integrated neural representation of the body (body schema) and of the space around the body1,2,3,4,5. When touching an object with the hands, we perceive tactile motion6 that is crucial for determining the direction7,8,9,10,11 and speed8,12,13 of the object, as well as for planning subsequent movements14. While the tactile motion perceived by the skin of the hands is encoded in the somatotopic (skin-centered) reference frame15,16, the physical movement of the object is actually represented in the allocentric (or external world-centered) reference frame17,18.
Misalignment between the reference frames frequently causes a bias in perceiving tactile motion11,19. Tactile remapping is therefore the outcome of multimodal integration, which has been shown to be affected by finger postures20, body postures21, and the transformation of eye- and body-centered reference frames22,23, indicating a large-scale transformation and integration involving multiple reference frames24,25,26,27. For example, perceived visual28 and tactile29 motion directions can be affected by hand and arm postures30,31. Studies on temporal integration have suggested that multimodal information is processed by a recurrent scheme among associated cortical areas32,33. Importantly, sensorimotor contingency, a systematic co-occurrence of sensory and motor events, can produce a new reference frame to improve motor performance34,35. The present study investigated the integration process by which tactile motion is projected onto the visual reference frame while the upper limb and head postures are manipulated.
The somatotopic frame must take the body posture into account because the perception of physical environments by the hand cannot be dissociated from certain body postures36. In practice, body posture is an integrated result derived from the positions of multiple joints37,38, including the shoulder, elbow, hand, and wrist39,40,41. Strong evidence suggests that humans can estimate the end-point of the body extremities (i.e., the hands) with a high degree of precision through the integration of proprioceptive information42,43,44. In other words, the body posture is well organized into the somatotopic frame. Regarding the cutaneous senses of the hand, previous studies have shown that the perception of tactile orientation by the hand cannot be explained by any single reference frame of the body posture. When performed on the horizontal45,46, midsagittal47, or frontoparallel plane48, the perception of tactile orientation must include information from the angles of multiple joints to be integrated with the cutaneous senses. In other words, tactile perceptual bias may be attributed to the existence of intermediate reference frames that are used for multi-sensory integration49,50,51.
However, perceptual bias may also stem directly from cutaneous senses. In general, cutaneous information must be integrated with information regarding the body posture in order to achieve precise haptic processes. For example, when participants were asked to judge whether a bar presented to one palm was parallel to a second bar presented to the other palm, the participants’ judgment was intermediate between the allocentric and somatotopic reference frames45,46. Furthermore, tactile direction and orientation presented to the left index fingerpad yielded a clockwise bias of 20°–25° when the left forearm was positioned in a forward and volar side up posture52,53. In other words, it appears that tactile orientation and direction share a common transformation process from an object’s physical condition to an individual’s perception.
The present study aimed to characterize the rules governing the perceptual bias underlying the transformation of reference frames. Tactile motions were presented to the left index fingerpad using a miniature tactile stimulator. A video screen was placed between the eyes of the participant and the stimulator, and the participants used a mouse to report the perceived direction of motion of the stimulus on the screen. This study used a design in which the stimulator, left index fingerpad, center of the video screen, and eyes were perfectly aligned along the posterior-to-anterior axis. In performing the trials, the participants' head and finger postures were manipulated in a controlled manner in order to investigate the effect of these postures on perceptual bias. Results revealed that the perceptual bias was a linear summation of two different types of bias, namely systematic bias and nonsystematic bias. The systematic bias correlated linearly with the difference between the finger and the head postures for all participants. By contrast, the nonsystematic bias was highly individualized, with a phase determined mainly by the finger posture. The two co-existing biases may represent the underlying principles governing the transformation of reference frames in the perception of tactile motion.
Results
Study design
This study evaluated the effect of the relative head and finger posture on motion perception at the fingerpad. A rotating aluminum ball with a groove depth of 500 μm, a wavelength of 4 mm, and a 45% duty cycle was used as the tactile stimulus to deliver tactile motions (Fig. 1a). Motion stimulus was delivered to the fingerpad of the left index finger using the miniature tactile stimulator with three motors that can precisely control the speed, direction and indentation depth of the stimulus54 (Fig. 1b). Each participant sat in front of a table with the angle of elbow joint kept constant. The participant’s left upper limb was supported by the arm holder with the palm facing the participant’s face such that the left index fingerpad contacted the aluminum ball during stimulation. The participant’s head, eyes, video display, stimulus, and left index fingerpad were precisely aligned along the posterior-to-anterior axis (Fig. 1c, see also Experimental set-up for details).
The perceived direction of the tactile motion presented to the participant’s left index fingerpad was analyzed for carefully controlled angles of the head and finger. In each trial, the rotating grating ball indented the fingerpad and was driven in a particular direction resulting in tactile stimulation. The participant visually fixated on a cross presented at the center of the video display and reported the perceived direction of motion (\({R}_{i}\)) using a mouse to click on an appropriate point on a circle shown on the video display (Fig. 2a). The stimulation trials were performed in accordance with a 3-by-4 factorial finger-and-head posture combination design consisting of three head postures and four finger postures. In particular, the finger postures (θF) were set as 90°, 60°, 30° or 0°, while the head postures (θH) were set as 120°, 90° or 60°; yielding a total of 12 different posture combinations (Fig. 2b). The effect of the head and finger postures on the perceived direction of tactile motion was evaluated by comparing the difference between the veridical and perceived directions across all postures.
Systematic bias
This study aimed to understand how the perceived direction of motion of the stimuli presented to the fingerpad was modulated by the relative posture of the finger and head. According to the data obtained from a single sample participant (Fig. 3a), the systematic bias was found to be modulated by both the head posture (θH) and the finger posture (θF). Specifically, the highest systematic bias occurred for θH = 120° and θF = 0°, while the lowest systematic bias occurred for θH = 60° and θF = 90°. Interestingly, the systematic bias was approximately zero for postures of θH = 120° and θF = 90°, θH = 90° and θF = 60°, and θH = 60° and θF = 30°, i.e., postures under which the difference between the finger and head angles was −30° (θH − θF = −30°). Notably, a clockwise shift of the finger induced a counterclockwise change in the systematic bias, indicating that the finger posture modulated the systematic bias in the opposite direction (Fig. 3b, middle). By contrast, a clockwise shift of the head posture resulted in a clockwise change in the systematic bias. In other words, the head posture modulated the systematic bias in the same direction (Fig. 3b, bottom). These patterns of the systematic bias were found to be similar across all six participants (Fig. 3c), with high values of pairwise correlations of systematic bias across finger-head postures (0.93 ± 0.04, Mean ± SD, across all participants, Fig. 3d).
In general, the results presented in Fig. 3 indicate that the systematic bias can be robustly predicted by the head and finger postures. A further analysis was thus performed to clarify the detailed relations of the systematic bias with the finger, head, and finger-head postures, respectively (Fig. 4a-c for a sample participant, and Fig. 4d-f for all six participants). Results showed that systematic bias was modulated by finger posture in the sample participant (Fig. 4a, for participant #1, slope = −0.65, R2 = 0.67, t = −4.53, p = 0.001, df = 10, data from averaged biases for each finger-head posture) and in the mean value across all six participants (Fig. 4d, slope = −0.62, R2 = 0.72, t = −5.1, p < 0.001, df = 10, data from biases averaged across participants). For example, the systematic bias vs. finger posture plot had a slope of −0.65 for the sample participant and close to −0.62 for all participants (Fig. 4a,d), indicating that a change in finger posture induces a change in systematic bias in the opposite direction (see also Fig. 3b, middle). However, systematic bias was significantly modulated by head posture in the sample participant (Fig. 4b, for participant #1, slope = 0.60, R2 = 0.33, t = 2.26, p = 0.047, df = 10, data from averaged biases for each finger-head posture) but not in the mean value across all participants (Fig. 4e, slope = 0.52, R2 = 0.27, t = 1.92, p = 0.083, df = 10, data from biases averaged across participants), indicating that a change in the head posture induces a change of the systematic bias in the same direction in the sample participant (see also Fig. 3b, bottom) but not in all participants.
An investigation was performed to determine the ability of finger posture to predict systematic bias when the head posture was controlled. A high goodness-of-fit (as measured by the R2 coefficient) was found for the finger posture in both the sample participant (Fig. 4a, for participant #1, θH = [120°, 90°, 60°]: slope = [−0.71, −0.55, −0.69], R2 = [0.59, 0.47, 0.63], t = [−18.37, −23.65, −25.32], all p < 0.001, df = 382, data from averaged biases for each finger posture) and all the participants (Fig. 4d, for θH = [120°, 90°, 60°]: slope = [−0.64, −0.59, −0.63], R2 = [0.84, 0.89, 0.92], t = [−10.82, −13.46, −15.65], all p < 0.001, df = 22, data from biases averaged within participants). A further investigation was then conducted to examine the ability of head posture to predict systematic bias when the finger posture was controlled. Results also showed a high goodness-of-fit for the head posture in both the sample participant (Fig. 4b, θF = [90°, 60°, 30°, 0°]: slope = [0.55, 0.59, 0.70, 0.55], R2 = [0.32, 0.43, 0.43, 0.31], t = [11.58, 14.55, 14.72, 11.32], all p < 0.001, df = 286, data from averaged biases for each finger-head posture) and all the participants (Fig. 4e, for θF = [90°, 60°, 30°, 0°]: slope = [0.52, 0.54, 0.51, 0.51], R2 = [0.62, 0.85, 0.71, 0.77], t = [5.14, 9.68, 6.29, 7.33], all p < 0.001, df = 16, data from biases averaged within participants). A final investigation was performed to examine the degree to which systematic bias could be predicted by the relative finger-head posture (i.e., the difference between the finger and head postures, θF-H). Results showed an almost perfect correlation of systematic bias with finger-head posture (Fig. 4c, for sample participant #1: slope = −0.63, R2 = 0.98, t = −22.56, p < 0.001, df = 10, data from averaged biases for each finger-head posture; Fig. 4f, for all six participants: slope = −0.58, R2 = 0.99, t = −57.18, p < 0.001, df = 4, data from biases averaged across participants), indicating that the finger-head posture almost completely determined the systematic bias. The systematic bias vs. finger-head posture plot had a slope of −0.63 for the sample participant and −0.58 across all the participants. In other words, a change in the finger-head posture induced a change in systematic bias in the opposite direction with a similar magnitude across all six participants.
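The reported linear relation between systematic bias and relative finger-head posture can be illustrated with a short regression sketch. The values below are synthetic and only illustrative (the slope of −0.6 is chosen to resemble, not reproduce, the reported fits); the variable names are hypothetical, not from the authors' analysis code.

```python
import numpy as np

# The 12 posture combinations of the 3-by-4 factorial design (deg).
theta_F = np.repeat([90, 60, 30, 0], 3)   # finger postures
theta_H = np.tile([120, 90, 60], 4)       # head postures
theta_FH = theta_F - theta_H              # relative finger-head posture

# Illustrative systematic-bias values following an exact linear relation.
bias = -0.6 * theta_FH - 18.0

# Least-squares line and goodness-of-fit, as in the paper's Fig. 4-style plots.
slope, intercept = np.polyfit(theta_FH, bias, 1)
r = np.corrcoef(theta_FH, bias)[0, 1]
print(round(slope, 2), round(r ** 2, 2))  # → -0.6 1.0
```

With real data the fit would not be exact, but the same slope/R2 summary quantifies how completely the relative posture predicts the systematic bias.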
Nonsystematic bias
In addition to the systematic bias described above, which correlated linearly with the relative posture between the finger and the head, a further bias (denoted as nonsystematic bias) which exhibited different values for different directions of the tactile stimulation was also observed. Nonsystematic bias varied across all stimulus directions (Fig. 5a, blue line); hence, it was fitted with a cosine function. A preliminary investigation found that the best fit was obtained using a cosine function with moment = 2 (Fig. S1a). After the cosine fit was established, two parameters were extracted, namely the amplitude (A) of the nonsystematic bias at its peak position and the corresponding phase (θp) (Fig. 5b). No significant difference was found in the goodness-of-fit across all the finger postures, head postures, and relative finger-head postures (Fig. S1b, one-way repeated-measures ANOVA. For head posture, F(2, 42) = 0.59, p = 0.56. For finger posture, F(3, 45) = 0.63, p = 0.60. For finger - head posture, F(11, 44) = 0.87, p = 0.58).
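The moment-2 cosine fit described above reduces to a linear least-squares problem, since \(A\cos(2(\theta-\theta_p))=a\cos 2\theta+b\sin 2\theta\); the amplitude and peak phase then follow from the fitted coefficients. The sketch below is a minimal illustration of this standard linearization (the function name is hypothetical, and this is not the authors' analysis code):

```python
import numpy as np

def fit_cosine_moment2(theta_deg, ns_bias):
    """Least-squares fit of NS(theta) = A * cos(2 * (theta - theta_p)).

    Linearized as a*cos(2t) + b*sin(2t); then A = sqrt(a^2 + b^2)
    and theta_p = atan2(b, a) / 2 (degrees).
    """
    t = np.deg2rad(np.asarray(theta_deg, dtype=float))
    X = np.column_stack([np.cos(2 * t), np.sin(2 * t)])
    (a, b), *_ = np.linalg.lstsq(X, np.asarray(ns_bias, dtype=float), rcond=None)
    A = np.hypot(a, b)                               # amplitude at the peak
    theta_p = np.rad2deg(np.arctan2(b, a) / 2.0)     # phase of the peak
    return A, theta_p
```

Because the moment is 2, the fitted function has period 180°, so opposite stimulus directions yield the same nonsystematic bias, consistent with the description above.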
Figure 6a shows the nonsystematic bias for all the finger-head posture combinations of the sample participant. When the finger shifts from a vertical position to a horizontal position (θF: from 90° to 0°), the phase of the nonsystematic bias shifts gradually and congruently (θp: from 87° to −13°). However, a change in head posture has no noticeable effect on the phase of the nonsystematic bias. For example, the phase of the nonsystematic bias remains virtually unchanged when the head is moved from the leftward to rightward postures (θH: from 120° to 60°).
An investigation was performed to examine the detailed relations of the finger, head, and finger-head postures with the nonsystematic bias across all six participants, together with the factors affecting the amplitude of the nonsystematic bias (Fig. 6b,c for the phase of the nonsystematic bias; Fig. 6d,e for the amplitude of the nonsystematic bias). Correlation was found between the phase of the nonsystematic bias and both the finger posture (Fig. 6b left, for the sample participant #1: slope = 1.14, R2 = 0.93, t = 11.17, p < 0.001, df = 10, data from averaged biases for each finger-head posture; Fig. 6c left, for all six participants: slope = 1.08, R2 = 0.73, t = 7.76, p < 0.001, df = 22, data from biases averaged across participants) and the finger-head posture (Fig. 6b right, for the sample participant: slope = 0.71, R2 = 0.55, t = 3.52, p = 0.006, df = 10, data from averaged biases for each finger-head posture; Fig. 6c right, for all six participants: slope = 0.79, R2 = 0.72, t = 9.39, p < 0.001, df = 34, data from biases averaged across participants), but not between the phase and the head posture (Fig. 6b middle, for the sample participant: slope = 0.09, R2 = 0.003, t = 0.18, p = 0.86, df = 10, data from averaged biases for each finger-head posture; Fig. 6c middle, for all six participants: slope = −0.17, R2 = 0.06, t = −1.02, p = 0.32, df = 16, data from biases averaged across participants). The slope of the nonsystematic bias phase vs. finger posture plot equaled approximately 1 (Fig. 6c, left). In other words, a change in finger posture had a positive and congruent effect on the phase of the nonsystematic bias. However, no correlation was found between any of the three postures and the amplitude of the nonsystematic bias, indicating that the amplitude of the nonsystematic bias is not posture-related (Fig. 6d, for [left panel, middle panel, right panel]: slope = [−0.04, −0.06, −0.01], R2 = [0.09, 0.09, 0.004], t = [−1.48, −1.50, 0.30], p = [0.15, 0.15, 0.77], df = 22, data from participant #1; Fig. 6e, for [left panel, middle panel, right panel]: slope = [0.002, −0.03, −0.012], R2 = [0.002, 0.04, 0.02], t = [0.07, −0.86, −0.77], p = [0.94, 0.40, 0.45], df = [22, 16, 34], data from biases averaged across participants). Pairwise correlations across all six participants equaled 0.71 ± 0.11 (R2, Mean ± SD) for the phase of the nonsystematic bias (Fig. S2a,b) and 0.15 ± 0.16 for the amplitude of the nonsystematic bias (Fig. S2c,d). Hence, it is inferred that the phase of the nonsystematic bias is consistent across all six participants, whereas the amplitude of the nonsystematic bias is highly individualized.
Given that the slope of the nonsystematic bias is close to 1 and the phase is similar across participants, this study further examined whether the phase of nonsystematic bias is anchored on the somatotopic reference frame or any other reference frame that is aligned with the skin, such as finger-centered or forearm-centered reference frames. To this end, the results shown in Fig. 6b,c when mapped on the somatotopic reference frame were analyzed (Fig. S3). It was found that the phase is consistent across finger postures when analyzed on the somatotopic reference frames, suggesting that nonsystematic bias is phase-locked to the skin or any other reference frame that is aligned with the skin.
Systematic vs nonsystematic bias
The systematic and nonsystematic biases differ in two main regards. First, the systematic bias is dominated by the combined information of the finger and head postures (Fig. 7a, for vs. θF-H: Student's t test, t = 5.61, p < 0.001, df = 10, data from regression coefficients of bias vs. postures of all participants). By contrast, the phase of the nonsystematic bias is primarily determined by the finger posture (Fig. 7a, for vs. θF: Student's t test, t = −2.31, p = 0.04, df = 10; Fig. 7a, for vs. θH: Student's t test, t = 7.13, p < 0.001, df = 10). The head posture correlates only weakly with the systematic bias and has no effect on the phase of the nonsystematic bias. Second, the changes in direction of the bias caused by posture shifts differ between the systematic bias and the phase of the nonsystematic bias. In particular, the slope is negative for the systematic bias and positive for the phase of the nonsystematic bias given changes in the head posture and finger-head posture. Similarly, the slope is positive for the systematic bias and negative for the phase of the nonsystematic bias given changes in the finger posture (Fig. 7b). A further investigation was performed to examine the relationship between the systematic bias and the phase of the nonsystematic bias (Fig. S4). Results showed a consistent negative correlation between the systematic bias and the phase of the nonsystematic bias across all six participants, indicating that both biases are modulated by the finger postures.
The sample size of six participants may raise the issue of a lack of statistical power, and it remains unclear whether the aforementioned biases have adequate test-retest reliability. To this end, we performed test-retest experiments that included an additional six participants (Supplementary Text). The results showed that the systematic (Fig. S5) and nonsystematic (Fig. S6) biases were reliable between the test (Fig. S5a,b) and retest (Fig. S6a,b) experiments. Also, the results obtained in these additional six participants were analogous to those observed in the formal experiment (Figs. S5c,d and S6c,d).
Finally, we examined whether the posture-related changes of the systematic and nonsystematic biases were simply mediated by the participants reporting the finger or head postures themselves. We showed that this was not the case, since the participants reported the perceived directions of motion rather than the finger or head postures (Fig. S7).
Discussion
The present study has revealed plausible mechanisms underlying the reference frame transformation between the somatotopic and allocentric reference frames in response to tactile stimulation. In particular, a consistent deviation in transformation, referred to in this study as a systematic bias, has been found between the veridical and observed directions of the allocentric reference frame. The systematic bias is similar across different participants and is perfectly and linearly predicted by the relative finger-head posture. An additional nonsystematic bias has also been observed: the phase of the nonsystematic bias differs with different stimulus directions and is modulated primarily by the finger posture (i.e., not the head posture). The nonsystematic bias can be well-fitted by a phase-locked cosine function with moment = 2. In other words, an identical nonsystematic bias is induced by tactile stimulations applied in opposite directions. This finding suggests that the nonsystematic bias is associated with inhomogeneous cutaneous senses and is probably involved with the orientation-selective units in the S1 cortex55. Furthermore, given that a change in the finger or head orientation induced dramatically different changes in the systematic and nonsystematic biases, it is inferred that the two biases are mediated by distinctively different mechanisms.
The present results have shown that the systematic bias is linearly predicted by the relative finger-to-head posture; indicating the importance of both the finger posture and the head posture in determining the bias. These findings are reminiscent of the observations of Carter et al.56, who found that the eye position affected the perceived tactile direction, and those of Volcic et al.57, who found that the head posture affected the perceived direction in parallelity tests. Tactile orientation was originally hypothesized to be perfectly encoded in an allocentric reference frame. However, this notion was challenged by the data collected by Hammerschmidt18. Moreover, in recent years, parallelity and mental rotation experiments have shown that the perceived orientation cannot be explained by any single reference frame when performed on the horizontal45,47, midsagittal47, or frontoparallel planes48. However, a reference frame that is an intermediate of multiple frames can account for such tactile orientation bias49,50,51. Many different sources may contribute to the reference frame, including the position or posture of the skin16, hand19,58, arm59,60,61, and body62. These findings are analogous to those obtained in studies on motor-sensory coordination59,60,61, which show that a hybrid frame of reference is constructed to combine parallel multisensory information. In general, these previous findings imply a multisensory nature of tactile perception. That is, tactile information needs to be mapped onto other reference frames such that the somatosensory, visual, auditory and motor functions can be integrated as a single holistic system16,56,57,58,59,60,61,62,63,64.
The phase of the nonsystematic bias between the reported tactile direction and the veridical direction is anchored on the somatotopic reference frame (i.e., not the head posture). To the best of our knowledge, the nonsystematic bias of human touch has never been reported. However, the finding that nonsystematic bias can be fitted by a cosine function with a moment of two is reminiscent of the theory of tactile anisotropy (also known as the oblique effect), which states that tactile acuity tends to be better at certain orientations62,65,66. Neuronal data provide plausible support for the origin of the nonsystematic bias55. Specifically, some neurons in the primary somatosensory cortex are highly selective for the orientation or direction of scanning gratings55,67. In other words, the transformation of the reference frames may be mediated by these orientation-selective units, in which a high percentage of neurons prefer the proximal-distal orientation68. Another possible explanation for the origin of the nonsystematic bias may be inhomogeneous finger compliance or inhomogeneous receptor properties69 in response to the motion stimulus presented to the finger. All of these explanations remain plausible because the nonsystematic bias is constant when the bias is calculated in the somatotopic reference frame62,65,66,68,69.
The evidence presented in this study suggests that systematic and nonsystematic biases reflect two completely different properties underlying the transformation from the somatotopic frame to the other reference frames. In particular, systematic bias is determined primarily by the relative finger-head posture; reflecting its multisensory nature. By contrast, nonsystematic bias is determined only by the finger posture, indicating most probably that it has a somatosensory nature. Furthermore, the slope of the phase of nonsystematic bias is close to 1 for most of the participants, suggesting that nonsystematic bias is phase-locked to the skin or any other reference frame that is aligned with the skin, such as finger-centered and forearm-centered reference frames. The present findings are inconsistent with the hypothesis that all sensory modalities are remapped to a common frame of reference70,71,72. In fact, neurons in the posterior parietal cortex, such as the ventral intraparietal (VIP) area, apply a variety of reference frames, including intermediate somatosensory and visual reference frames, to encode the stimulus location73, indicating that the integration between touch and vision is mediated by a coexistence of multiple reference frames.
The formal experiment included only six participants, a sample size that might be susceptible to type II errors. This small sample size could also limit our ability to analyze the variance across participants. These limitations notwithstanding, the experimental design adopted in this study has several important advantages. First, it enables the linearity of the change in bias phase to be examined as a function of posture, a property that parallels the gradual shift of the receptive field in multisensory neurons observed in the ventral intraparietal area73,74, lateral intraparietal area75, ventral premotor area76, and superior colliculus70,71. Second, the present study on the perceived direction of stimulus motion has ecological value, as haptics usually involves motion between the finger and the object6,77, and tactile flow, i.e., the motion information obtained by the finger, is required for subsequent motor planning. Finally, the tactile stimulation applied in this study was implemented with a directional precision of 1° and an indentation depth precision of 1 μm54. As a result, it provides the means to extract bias patterns with relatively small magnitudes, such as the nonsystematic bias reported herein.
Methods
Participants
A total of six participants (four males, two females, 26 to 35 years of age) participated in the formal experiment. An additional eight participants (six males, two females, 20 to 36 years of age) participated in the test-retest experiments (Supplementary Text). The protocol was approved by the Institutional Review Board of Human Research of Chang Gung Medical Foundation and written informed consent was obtained from all participants. All methods were performed in accordance with the regulations of Human Subjects Research Act in Taiwan and with the guidelines of the Declaration of Helsinki, 1975.
Tactile stimulator
The stimulus ball measured 20 mm in diameter and was engraved with square-wave gratings with a depth of 500 μm, a wavelength of 4.0 mm, and a 45% duty cycle (Fig. 1a). A three-motor controller was employed to control the tactile stimulus (Fig. 1b) so that the direction of motion and indentation depth could be precisely controlled (see Pei et al. 2014 for details). During the experiment, white noise was played through an earphone to prevent the participant from hearing the motor noise.
Experimental set-up
Each participant sat in front of a table with the left upper arm and forearm held by arm holders to maintain the arm position (Fig. 1c). The angle of the elbow joint was not measured but was kept constant across all the experiments. Specifically, for the left upper limb, the participant's index finger and wrist were kept in a neutral position (straight), the forearm supinated at 90°, and the elbow flexed at 90°. The finger orientation was adjusted by changing the participant's shoulder abduction and internal rotation postures. To ensure that these postures remained stationary during each session of the experiment, a forearm holder was utilized to support the wrist and an arm holder was employed to support the elbow. For the head posture, a pad was placed lateral to the head, and the participant tilted the head to touch the pad so that the head posture could be maintained.
The stimulus ball was placed immediately in front of the left index finger. In addition, a video display screen was placed between the participant and the tactile stimulator to provide experimental instructions and to enable the participant to report the perceived direction of motion of the ball. In setting up the experimental process, from posterior to anterior with respect to the participant’s head, the eyes, video display, tactile ball, and left index fingerpad were perfectly aligned along the posterior-to-anterior axis. The allocentric reference frame (coordinate) was defined on the frontoparallel plane (Fig. 1c).
Experimental design of motion stimulation
For each participant, the finger, forearm, and head holders were adjusted to ensure between-participant consistency of the finger and head postures. Regarding the intensity of the tactile stimulation, tactile motion was presented by the miniature tactile motion stimulator54 (Fig. 1a,b) with an indentation depth of 1000 μm and an indentation rate of 2 mm/s, which was far above the sensation threshold of indentation rate (>0.3 mm/s)78. In this setup, the discriminability of stimulus direction54 was 11.4° ± 2.5°. Tactile motion was presented at a speed of 40 mm/s using a square-wave grating ball with a wavelength of 2 mm79 and a temporal frequency of 10 Hz80, both of which were beyond the threshold of tactile perception13.
At the beginning of each trial, the participant visually fixated on a cross presented at the center of the video display. After 1 second of fixation, the rotating ball was indented into the index fingerpad to a depth of 1 mm. Tactile stimulation was then applied for 1 second in one of 24 directions (0° to 345° in 15° increments) at a speed of 40 mm/s, after which the ball was withdrawn from the fingerpad (Fig. 2a). The participant then reported the perceived direction of motion (\({R}_{i}\)) by using a mouse to click on the appropriate point of a circle shown on the video display. After the participant clicked on the circle, a gray blank screen was presented for 1 second before the next trial began (Fig. 2a left).
The stimulation trials were performed in accordance with a 3-by-4 factorial finger-and-head posture combination design consisting of three head postures and four finger postures. In particular, the finger postures (θF) were set to 90°, 60°, 30°, or 0°, while the head postures (θH) were set to 120°, 90°, or 60°, yielding a total of 12 different posture combinations (Fig. 2b). Note that parameters θH and θF were both defined on the allocentric reference frame. A total of 96 trials were performed for each posture combination (24 directions × 2 repetitions × 2 blocks).
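The factorial design above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' experimental code, and the variable names are ours:

```python
import itertools

# Posture angles as defined on the allocentric reference frame (degrees).
finger_postures = [90, 60, 30, 0]     # θF
head_postures = [120, 90, 60]         # θH
directions = list(range(0, 360, 15))  # 24 stimulus directions

# 3-by-4 factorial combination of head and finger postures.
posture_combinations = list(itertools.product(head_postures, finger_postures))

# 96 trials per combination: 24 directions x 2 repetitions x 2 blocks.
trials_per_combination = [d for _block in range(2) for _rep in range(2)
                          for d in directions]

print(len(posture_combinations))    # 12 posture combinations
print(len(trials_per_combination))  # 96 trials
```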
Definition of bias
The perceptual bias (\({B}_{i}\)) was quantified as the difference between the reported stimulus direction (\({R}_{i}\)) and the veridical direction (\({V}_{i}\)) on the allocentric reference frame, i.e., \({B}_{i}={R}_{i}-{V}_{i}\). For each posture combination, the systematic bias, \(S\), was computed as the mean of the perceptual biases across all 24 directions, i.e., \(S=\frac{1}{24}{\sum }_{i=1}^{24}{B}_{i}\). In addition, the nonsystematic bias, \(N{S}_{i}\), for each veridical direction was computed as the difference between the perceptual bias and the systematic bias, i.e., \(N{S}_{i}={B}_{i}-S\).
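As a minimal illustration, the decomposition above can be computed as follows. This is a Python sketch of the stated definitions, not the authors' analysis code; in particular, the wrap-around handling of angular differences is our assumption:

```python
import numpy as np

def circ_diff(a_deg, b_deg):
    """Signed angular difference a - b, wrapped to [-180, 180) degrees."""
    a = np.asarray(a_deg, dtype=float)
    b = np.asarray(b_deg, dtype=float)
    return (a - b + 180.0) % 360.0 - 180.0

def decompose_bias(reported_deg, veridical_deg):
    """Split the perceptual bias B_i = R_i - V_i into systematic (S)
    and nonsystematic (NS_i) components, per the definitions above."""
    B = circ_diff(reported_deg, veridical_deg)  # perceptual bias per direction
    S = B.mean()                                # systematic bias: mean over 24 directions
    NS = B - S                                  # nonsystematic bias per direction
    return B, S, NS

V = np.arange(0, 360, 15)   # the 24 veridical directions
R = (V + 10) % 360          # toy reports: a uniform 10-degree bias
B, S, NS = decompose_bias(R, V)
```

With this toy input, the systematic bias recovers the uniform 10° offset and the nonsystematic bias is zero everywhere.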
Fitting for nonsystematic bias
The nonsystematic bias was fitted as a function of the veridical direction using a cosine function with angular frequency 2 (Fig. S1), which yields two full oscillations per complete direction cycle: \(N{S}_{i}=A\,\cos (2{V}_{i}-2{\theta }_{p})\), where \(A\) and \({\theta }_{p}\) represent the amplitude and phase corresponding to the maximal nonsystematic bias; both are free parameters determined by least-squares fitting.
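Because A·cos(2V − 2θp) = a·cos(2V) + b·sin(2V) with a = A·cos(2θp) and b = A·sin(2θp), the least-squares fit reduces to ordinary linear regression. The paper does not specify the fitting software, so the numpy-only sketch below is our assumption about one way to implement it:

```python
import numpy as np

def fit_nonsystematic(V_deg, NS):
    """Fit NS_i = A*cos(2*V_i - 2*theta_p) by linear least squares,
    using the identity A*cos(2V - 2t) = a*cos(2V) + b*sin(2V)."""
    V = np.deg2rad(np.asarray(V_deg, dtype=float))
    X = np.column_stack([np.cos(2 * V), np.sin(2 * V)])
    (a, b), *_ = np.linalg.lstsq(X, np.asarray(NS, dtype=float), rcond=None)
    A = np.hypot(a, b)                            # amplitude (>= 0)
    theta_p = np.rad2deg(0.5 * np.arctan2(b, a))  # phase in degrees (mod 180)
    return A, theta_p

# Recover known parameters from synthetic, noise-free data.
V = np.arange(0, 360, 15)
NS = 3.0 * np.cos(np.deg2rad(2 * V - 2 * 40.0))
A_hat, theta_hat = fit_nonsystematic(V, NS)
```

Note that the phase is only identified modulo 180°, consistent with the two-cycle periodicity of the model.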
Statistical analysis
The perceptual bias, the difference between the perceived and veridical directions on the allocentric reference frame, was first computed for each trial. For each participant, the perceptual bias for each direction of motion was the circular mean of the perceptual biases across the four repetitions. The systematic bias for each posture was the mean across stimulus directions (Fig. 3). The amplitude (A) and phase (θp) of the nonsystematic bias were retrieved from the cosine fit at its peak position. To evaluate the relationship between postures and biases, we applied Pearson’s correlation (simple regression model) between the posture angles and the parameters of the systematic (Fig. 4) or nonsystematic bias (Fig. 6). For each of the three posture conditions, θF, θH, and θF-H, we applied Student’s t test to compare the coefficients of determination (R2) between the systematic and nonsystematic biases, to evaluate the degree to which the two biases were modulated by the postures (Fig. 7). To examine whether the goodness-of-fit (R2) of the cosine fit was inconsistent across postures, we applied repeated-measures ANOVA for each of the finger (θF), head (θH), and finger-head (θF-H) postures (Fig. S1b). We used Pearson’s correlation to examine whether the nonsystematic bias is a function of finger posture on the allocentric reference frame (Fig. S3) and, finally, to evaluate the relationship between the systematic bias and the phase of the nonsystematic bias for each participant (Fig. S4).
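For illustration, the core correlation step (relating a posture angle to the systematic bias across the 12 combinations) amounts to the following. The data here are synthetic and the assumed linear slope is invented purely for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the 12 posture combinations: relative
# finger-to-head posture θF-H = θF - θH (degrees).
theta_F = np.repeat([90, 60, 30, 0], 3).astype(float)
theta_H = np.tile([120, 90, 60], 4).astype(float)
theta_FH = theta_F - theta_H

# Assumed linear relation to the systematic bias, plus noise.
S = 0.5 * theta_FH + rng.normal(0.0, 2.0, size=theta_FH.size)

r = np.corrcoef(theta_FH, S)[0, 1]  # Pearson's correlation
R2 = r ** 2                         # coefficient of determination
```

With a strong underlying linear relation, R2 lands close to 1; in the study, R2 values like this were compared between the systematic and nonsystematic biases.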
Change history
04 June 2020
An amendment to this paper has been published and can be accessed via a link at the top of the paper.
References
Rainer, A. & Hall, T. An analysis of some ‘core studies’ of software process improvement. Softw. Process: Improvement Pract. 6, 169–187, https://doi.org/10.1002/spip.147 (2001).
Farne, A., Pavani, F., Meneghello, F. & Ladavas, E. Left tactile extinction following visual stimulation of a rubber hand. Brain 123(Pt 11), 2350–2360, https://doi.org/10.1093/brain/123.11.2350 (2000).
Graziano, M. S. & Gross, C. G. The representation of extrapersonal space: a possible role for bimodal, visual-tactile neurons. The cognitive neurosciences, 1021–1034 (1995).
Haggard, P., Taylor-Clarke, M. & Kennett, S. Tactile perception, cortical representation and the bodily self. Curr. Biol. 13, R170–173, https://doi.org/10.1016/s0960-9822(03)00115-5 (2003).
Holmes, N. P. & Spence, C. The body schema and the multisensory representation(s) of peripersonal space. Cogn. Process. 5, 94–105, https://doi.org/10.1007/s10339-004-0013-3 (2004).
Lederman, S. J. & Klatzky, R. L. Hand movements: a window into haptic object recognition. Cogn. Psychol. 19, 342–368, https://doi.org/10.1016/0010-0285(87)90008-9 (1987).
Dreyer, D. A., Hollins, M. & Whitsel, B. L. Factors influencing cutaneous directional sensitivity. Sens. Process. 2, 71–79 (1978).
Essick, G. K., Franzen, O. & Whitsel, B. L. Discrimination and scaling of velocity of stimulus motion across the skin. Somatosens. Mot. Res. 6, 21–40, https://doi.org/10.3109/08990228809144639 (1988).
Norrsell, U. & Olausson, H. Human, tactile, directional sensibility and its peripheral origins. Acta Physiol. Scand. 144, 155–161, https://doi.org/10.1111/j.1748-1716.1992.tb09280.x (1992).
Gardner, E. P. & Sklar, B. F. Discrimination of the direction of motion on the human hand: a psychophysical study of stimulation parameters. J. Neurophysiol. 71, 2414–2429, https://doi.org/10.1152/jn.1994.71.6.2414 (1994).
Keyson, D. V. & Houtsma, A. J. Directional sensitivity to a tactile point stimulus moving across the fingerpad. Percept. Psychophys. 57, 738–744, https://doi.org/10.3758/BF03213278 (1995).
Bensmaia, S. J., Killebrew, J. H. & Craig, J. C. Influence of visual motion on tactile motion perception. J. Neurophysiol. 96, 1625–1637, https://doi.org/10.1152/jn.00192.2006 (2006).
Depeault, A., Meftah el, M. & Chapman, C. E. Tactile speed scaling: contributions of time and space. J. Neurophysiol. 99, 1422–1434, https://doi.org/10.1152/jn.01209.2007 (2008).
Gentilucci, M., Toni, I., Daprati, E. & Gangitano, M. Tactile input of the hand and the control of reaching to grasp movements. Exp. Brain Res. 114, 130–137, https://doi.org/10.1007/pl00005612 (1997).
Powell, T. P. & Mountcastle, V. B. Some aspects of the functional organization of the cortex of the postcentral gyrus of the monkey: a correlation of findings obtained in a single unit analysis with cytoarchitecture. Bull. Johns Hopkins Hosp. 105, 133–162 (1959).
Kaas, J. H., Nelson, R. J., Sur, M., Lin, C. S. & Merzenich, M. M. Multiple representations of the body within the primary somatosensory cortex of primates. Science 204, 521–523, https://doi.org/10.1126/science.107591 (1979).
Tolman, E. C. Cognitive maps in rats and men. Psychol. Rev. 55, 189–208, https://doi.org/10.1037/h0061626 (1948).
Hammerschmidt, O. Über die Genauigkeit der haptischen Verwirklichung geometrischer Grundbegriffe (1934).
Rinker, M. A. & Craig, J. C. The effect of spatial orientation on the perception of moving tactile stimuli. Percept. Psychophys. 56, 356–362, https://doi.org/10.3758/BF03209769 (1994).
Romano, D., Marini, F. & Maravita, A. Standard body-space relationships: Fingers hold spatial information. Cognition 165, 105–112, https://doi.org/10.1016/j.cognition.2017.05.014 (2017).
Azanon, E., Stenner, M. P., Cardini, F. & Haggard, P. Dynamic tuning of tactile localization to body posture. Curr. Biol. 25, 512–517, https://doi.org/10.1016/j.cub.2014.12.038 (2015).
Schlicht, E. J. & Schrater, P. R. Impact of coordinate transformation uncertainty on human sensorimotor control. J. Neurophysiol. 97, 4203–4214, https://doi.org/10.1152/jn.00160.2007 (2007).
Pritchett, L. M., Carnevale, M. J. & Harris, L. R. Reference frames for coding touch location depend on the task. Exp. Brain Res. 222, 437–445, https://doi.org/10.1007/s00221-012-3231-4 (2012).
Heed, T., Buchholz, V. N., Engel, A. K. & Roder, B. Tactile remapping: from coordinate transformation to integration in sensorimotor processing. Trends Cogn. Sci. 19, 251–258, https://doi.org/10.1016/j.tics.2015.03.001 (2015).
Heed, T., Backhaus, J., Roder, B. & Badde, S. Disentangling the External Reference Frames Relevant to Tactile Localization. PLoS One 11, e0158829, https://doi.org/10.1371/journal.pone.0158829 (2016).
Tame, L., Azanon, E. & Longo, M. R. A Conceptual Model of Tactile Processing across Body Features of Size, Shape, Side, and Spatial Location. Front. Psychol. 10, 291, https://doi.org/10.3389/fpsyg.2019.00291 (2019).
Longo, M. R., Azanon, E. & Haggard, P. More than skin deep: body representation beyond primary somatosensory cortex. Neuropsychologia 48, 655–668, https://doi.org/10.1016/j.neuropsychologia.2009.08.022 (2010).
Blake, R., Sobel, K. V. & James, T. W. Neural synergy between kinetic vision and touch. Psychol. Sci. 15, 397–402, https://doi.org/10.1111/j.0956-7976.2004.00691.x (2004).
Soto-Faraco, S., Spence, C., Lloyd, D. & Kingstone, A. Moving multisensory research along: Motion perception across sensory modalities. Curr. Dir. Psychol. Sci. 13, 29–32 (2004).
Azanon, E. & Soto-Faraco, S. Changing reference frames during the encoding of tactile events. Curr. Biol. 18, 1044–1049, https://doi.org/10.1016/j.cub.2008.06.045 (2008).
Konkle, T., Wang, Q., Hayward, V. & Moore, C. I. Motion aftereffects transfer between touch and vision. Curr. Biol. 19, 745–750, https://doi.org/10.1016/j.cub.2009.03.035 (2009).
Tame, L., Wuhle, A., Petri, C. D., Pavani, F. & Braun, C. Concurrent use of somatotopic and external reference frames in a tactile mislocalization task. Brain Cogn. 111, 25–33, https://doi.org/10.1016/j.bandc.2016.10.005 (2017).
Kuroki, S., Watanabe, J., Kawakami, N., Tachi, S. & Nishida, S. Somatotopic dominance in tactile temporal processing. Exp. Brain Res. 203, 51–62, https://doi.org/10.1007/s00221-010-2212-8 (2010).
Cohen, Y. E. & Andersen, R. A. A common reference frame for movement plans in the posterior parietal cortex. Nat. Rev. Neurosci. 3, 553–562, https://doi.org/10.1038/nrn873 (2002).
Womelsdorf, T. & Fries, P. Neuronal coherence during selective attentional processing and sensory-motor integration. J. Physiol. Paris. 100, 182–193, https://doi.org/10.1016/j.jphysparis.2007.01.005 (2006).
Corcoran, D. W. The phenomena of the disembodied eye or is it a matter of personal geography? Perception 6, 247–253, https://doi.org/10.1068/p060247 (1977).
Matthews, P. B. Proprioceptors and their contribution to somatosensory mapping: complex messages require complex processing. Can. J. Physiol. Pharmacol. 66, 430–438, https://doi.org/10.1139/y88-073 (1988).
Ribot-Ciscar, E., Bergenheim, M., Albert, F. & Roll, J. P. Proprioceptive population coding of limb position in humans. Exp. Brain Res. 149, 512–519, https://doi.org/10.1007/s00221-003-1384-x (2003).
Kalaska, J. F., Cohen, D. A., Prud’homme, M. & Hyde, M. L. Parietal area 5 neuronal activity encodes movement kinematics, not movement dynamics. Exp. Brain Res. 80, 351–364, https://doi.org/10.1007/bf00228162 (1990).
Prud’homme, M. J. & Kalaska, J. F. Proprioceptive activity in primate primary somatosensory cortex during active arm reaching movements. J. Neurophysiol. 72, 2280–2301, https://doi.org/10.1152/jn.1994.72.5.2280 (1994).
Tillery, S. I., Soechting, J. F. & Ebner, T. J. Somatosensory cortical activity in relation to arm posture: nonuniform spatial tuning. J. Neurophysiol. 76, 2423–2438, https://doi.org/10.1152/jn.1996.76.4.2423 (1996).
Clark, F. J., Larwood, K. J., Davis, M. E. & Deffenbacher, K. A. A metric for assessing acuity in positioning joints and limbs. Exp. Brain Res. 107, 73–79, https://doi.org/10.1007/bf00228018 (1995).
Darling, W. G. Perception of forearm angles in 3-dimensional space. Exp. Brain Res. 87, 445–456, https://doi.org/10.1007/bf00231862 (1991).
Gritsenko, V., Krouchev, N. I. & Kalaska, J. F. Afferent input, efference copy, signal noise, and biases in perception of joint angle during active versus passive elbow movements. J. Neurophysiol. 98, 1140–1154, https://doi.org/10.1152/jn.00162.2007 (2007).
Kappers, A. M. & Koenderink, J. J. Haptic perception of spatial relations. Perception 28, 781–795, https://doi.org/10.1068/p2930 (1999).
Kappers, A. M. Large systematic deviations in the haptic perception of parallelity. Perception 28, 1001–1012, https://doi.org/10.1068/p281001 (1999).
Kappers, A. M. Haptic perception of parallelity in the midsagittal plane. Acta Psychol. 109, 25–40, https://doi.org/10.1016/S0001-6918(01)00047-6 (2002).
Hermens, F., Kappers, A. M. & Gielen, S. C. The structure of frontoparallel haptic space is task dependent. Percept. Psychophys. 68, 62–75, https://doi.org/10.3758/BF03193656 (2006).
Kappers, A. M. The contributions of egocentric and allocentric reference frames in haptic spatial tasks. Acta Psychol. 117, 333–340, https://doi.org/10.1016/j.actpsy.2004.08.002 (2004).
Volcic, R., Kappers, A. M. & Koenderink, J. J. Haptic parallelity perception on the frontoparallel plane: the involvement of reference frames. Percept. Psychophys. 69, 276–286, https://doi.org/10.3758/BF03193749 (2007).
Volcic, R., Wijntjes, M. W. & Kappers, A. M. Haptic mental rotation revisited: multiple reference frame dependence. Acta Psychol. 130, 251–259, https://doi.org/10.1016/j.actpsy.2009.01.004 (2009).
Bensmaia, S. J. Tactile intensity and population codes. Behav. Brain Res. 190, 165–173, https://doi.org/10.1016/j.bbr.2008.02.044 (2008).
Pei, Y. C., Hsiao, S. S. & Bensmaia, S. J. The tactile integration of local motion cues is analogous to its visual counterpart. Proc. Natl Acad. Sci. USA 105, 8130–8135, https://doi.org/10.1073/pnas.0800028105 (2008).
Pei, Y. C. et al. A multi-digit tactile motion stimulator. J. Neurosci. methods 226, 80–87, https://doi.org/10.1016/j.jneumeth.2014.01.021 (2014).
Bensmaia, S. J., Denchev, P. V., Dammann, J. F. III., Craig, J. C. & Hsiao, S. S. The representation of stimulus orientation in the early stages of somatosensory processing. J. Neurosci. 28, 776–786, https://doi.org/10.1523/JNEUROSCI.4162-07.2008 (2008).
Carter, O., Konkle, T., Wang, Q., Hayward, V. & Moore, C. Tactile rivalry demonstrated with an ambiguous apparent-motion quartet. Curr. Biol. 18, 1050–1054, https://doi.org/10.1016/j.cub.2008.06.027 (2008).
Volcic, R., van Rheede, J. J., Postma, A. & Kappers, A. M. Differential effects of non-informative vision and visual interference on haptic spatial processing. Exp. Brain Res. 190, 31–41, https://doi.org/10.1007/s00221-008-1447-0 (2008).
Carrozzo, M. & Lacquaniti, F. A hybrid frame of reference for visuo-manual coordination. Neuroreport 5, 453–456, https://doi.org/10.1097/00001756-199401120-00021 (1994).
Flanders, M. & Soechting, J. F. Frames of reference for hand orientation. J. Cogn. Neurosci. 7, 182–195, https://doi.org/10.1162/jocn.1995.7.2.182 (1995).
Soechting, J. F. & Flanders, M. Moving in three-dimensional space: frames of reference, vectors, and coordinate systems. Annu. Rev. Neurosci. 15, 167–191, https://doi.org/10.1146/annurev.ne.15.030192.001123 (1992).
Soechting, J. F. & Flanders, M. Parallel, interdependent channels for location and orientation in sensorimotor transformations for reaching and grasping. J. Neurophysiol. 70, 1137–1150, https://doi.org/10.1152/jn.1993.70.3.1137 (1993).
Luyat, M., Gentaz, E., Corte, T. R. & Guerraz, M. Reference frames and haptic perception of orientation: body and head tilt effects on the oblique effect. Percept. Psychophys. 63, 541–554, https://doi.org/10.3758/BF03194419 (2001).
Stein, B. E. & Stanford, T. R. Multisensory integration: current issues from the perspective of the single neuron. Nat. Rev. Neurosci. 9, 255–266, https://doi.org/10.1038/nrn2331 (2008).
Millar, S. & Al-Attar, Z. External and body-centered frames of reference in spatial memory: evidence from touch. Percept. Psychophys. 66, 51–59, https://doi.org/10.3758/BF03194860 (2004).
Essock, E. A., Krebs, W. K. & Prather, J. R. An anisotropy of human tactile sensitivity and its relation to the visual oblique effect. Exp. Brain Res. 91, 520–524, https://doi.org/10.1007/bf00227848 (1992).
Gentaz, E. & Hatwell, Y. The haptic ‘oblique effect’ in children’s and adults’ perception of orientation. Perception 24, 631–646, https://doi.org/10.1068/p240631 (1995).
Pei, Y. C., Hsiao, S. S., Craig, J. C. & Bensmaia, S. J. Shape invariant coding of motion direction in somatosensory cortex. PLoS Biol. 8, e1000305, https://doi.org/10.1371/journal.pbio.1000305 (2010).
Warren, S., Hamalainen, H. A. & Gardner, E. P. Objective classification of motion- and direction-sensitive neurons in primary somatosensory cortex of awake monkeys. J. Neurophysiol. 56, 598–622, https://doi.org/10.1152/jn.1986.56.3.598 (1986).
Phillips, J. R. & Johnson, K. O. Tactile spatial resolution. II. Neural representation of Bars, edges, and gratings in monkey primary afferents. J. Neurophysiol. 46, 1192–1203, https://doi.org/10.1152/jn.1981.46.6.1192 (1981).
Jay, M. F. & Sparks, D. L. Auditory receptive fields in primate superior colliculus shift with changes in eye position. Nature 309, 345–347, https://doi.org/10.1038/309345a0 (1984).
Russo, G. S. & Bruce, C. J. Frontal eye field activity preceding aurally guided saccades. J. Neurophysiol. 71, 1250–1253, https://doi.org/10.1152/jn.1994.71.3.1250 (1994).
Stein, B. E. & Meredith, M. A. The Merging of the Senses (Bradford, 1993).
Avillac, M., Deneve, S., Olivier, E., Pouget, A. & Duhamel, J. R. Reference frames for representing visual and tactile locations in parietal cortex. Nat. Neurosci. 8, 941–949, https://doi.org/10.1038/nn1480 (2005).
Duhamel, J. R., Bremmer, F., Ben Hamed, S. & Graf, W. Spatial invariance of visual receptive fields in parietal cortex neurons. Nature 389, 845–848, https://doi.org/10.1038/39865 (1997).
Stricanne, B., Andersen, R. A. & Mazzoni, P. Eye-centered, head-centered, and intermediate coding of remembered sound locations in area LIP. J. Neurophysiol. 76, 2071–2076, https://doi.org/10.1152/jn.1996.76.3.2071 (1996).
Graziano, M. S., Yap, G. S. & Gross, C. G. Coding of visual space by premotor neurons. Science 266, 1054–1057, https://doi.org/10.1126/science.7973661 (1994).
Lederman, S. J. & Klatzky, R. L. Extracting object properties through haptic exploration. Acta Psychol. 84, 29–40, https://doi.org/10.1016/0001-6918(93)90070-8 (1993).
Greenspan, J. D., Kenshalo, D. R. Sr. & Henderson, R. The influence of rate of skin indentation on threshold and suprathreshold tactile sensations. Somatosens. Res. 1, 379–393, https://doi.org/10.3109/07367228409144556 (1984).
Johansson, R. S. & Vallbo, Å. B. Tactile sensory coding in the glabrous skin of the human hand. Trends Neurosci. 6, 27–32, https://doi.org/10.1016/0166-2236(83)90011-5 (1983).
Ferrington, D. G., Nail, B. S. & Rowe, M. Human tactile detection thresholds: modification by inputs from specific tactile receptor classes. J. Physiol. 272, 415–433, https://doi.org/10.1113/jphysiol.1977.sp012052 (1977).
Acknowledgements
The authors thank Ting-Yu Chen for her technical assistance throughout this study, Yu-Shan Yeh for her help in data collection and Shen-Shiou Tseng for his help in test-retest data collection. The study was jointly supported by the Taiwan National Science Council (Grant NSC-99-2321-B-182A-004), the National Health Research Institutes (Grant NHRI-EX101-10113EC), Chang Gung Medical Foundation (Grant CMRPG590021G, for psychophysical experiments; Grant CMRPG3C0463, for data analysis; and Grant CMRPG5D01061, for instrument development); and the Healthy Aging Research Center (Grant EMRPD1C0291). The funders had no role in the study design; data collection and analysis; decision to publish; or preparation of the manuscript. The authors have no other source of funding, or financial relationships, to disclose.
Author information
Contributions
Y.-P.C., C.-I.Y., T.-C.L., J.-J.H., and Y.-C.P. designed the research; T.-C.L. and J.-J.H. collected data; Y.-P.C., C.-I.Y., and Y.-C.P. analyzed data; Y.-P.C., C.-I.Y., T.-C.L., J.-J.H., and Y.-C.P. wrote the paper.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Chen, YP., Yeh, CI., Lee, TC. et al. Relative posture between head and finger determines perceived tactile direction of motion. Sci Rep 10, 5494 (2020). https://doi.org/10.1038/s41598-020-62327-x