A kinematic and EMG dataset of online adjustment of reach-to-grasp movements to visual perturbations

Control of reach-to-grasp movements for deft and robust interactions with objects requires rapid sensorimotor updating that enables online adjustments to changing external goals (e.g., perturbations or instability of objects we interact with). We rarely appreciate the remarkable coordination involved in reach-to-grasp until control becomes impaired by neurological injuries such as stroke, by neurodegenerative diseases, or even by aging. Modeling online control of human reach-to-grasp movements is a challenging problem but fundamental to several domains, including behavioral and computational neuroscience, neurorehabilitation, neural prostheses, and robotics. Currently, there are no publicly available datasets that include online adjustment of reach-to-grasp movements to object perturbations. This work aims to advance modeling efforts by making publicly available a large kinematic and EMG dataset of online adjustments of reach-to-grasp movements to instantaneous perturbations of object size and distance, performed in an immersive haptic-free virtual environment (hf-VE). The dataset comprises a large number of perturbation types (10 each for object size and distance) applied at three different latencies after the start of the movement.

data collected under a limited set of task manipulations-have remained untested. A reach-to-grasp dataset that offers synchronized kinematic and electromyography (EMG) data for a broader set of conditions, including coordinated reach and grasp responses to perturbations of the task goal, would greatly aid future efforts directed toward modeling of reach-to-grasp movements.
The purpose of this report is to make publicly available a rich dataset of kinematics and EMG collected during hand and arm movements as participants reached to grasp objects in an immersive haptic-free virtual environment (henceforth, hf-VE). The dataset includes a large variety of object size and distance perturbations that required rapid online adjustments of movement to compensate for instantaneous perturbations of the goal. Moreover, in the interest of studying the extent to which there may be temporal dependencies related to perturbation timing, the perturbations occurred at three different latencies after movement onset. The dataset was collected using a state-of-the-art experimental setup developed over several years in the Movement Neuroscience Laboratory at Northeastern University, comprising the seamless integration of an immersive haptic-free virtual reality system, an active-marker motion capture system, a wireless multichannel electromyography (EMG) recording system, and an immersive hf-VE developed in Unity 3D and programmed in C# and Python. The hardware and software renderings were synchronized to obtain a ~13.3 ms feedback loop between participants' hand movements and their virtual rendering, corresponding to 75 Hz sampling of kinematic data (all kinematic data are provided after resampling to 100 Hz). The kinematic data primarily pertain to the transport and aperture aspects of reach-to-grasp movements: 'transport' refers to the motion of the hand towards the target object, whereas 'aperture' refers to the distance between the tips of the thumb and index finger that form the enclosure around the target object. The dataset is organized as a Matlab (Mathworks Inc., Natick, MA) data structure (.mat) with kinematic and EMG data.
The dataset's novelty lies in a large number of conditions, including Perturbation Type (object size perturbation, object distance perturbation), Perturbation Timing (100 ms, 200 ms, and 300 ms after movement onset), and the combination of synchronized kinematic and EMG acquisition.
To our knowledge, no dataset exists for online adjustment of coordinated reach-to-grasp movements to perturbations of the task goal. Although some reach and grasp kinematics datasets (e.g., performing different reaches and grasps) are available, as well as forearm EMG datasets (e.g., performing hand gestures or freehand movements), they have several limitations for use in modeling online control of reach-to-grasp movements:
• A focus on isolated grasp: The available datasets focus solely on the grasp component with little to no mention of the reach component or coordinated reach-to-grasp movements33-39.
• Limited sample size: Some of the existing datasets provide data from only a small number of participants (e.g., just one to four participants35,40, as opposed to a total of 20 participants in the present dataset, ten each for object size and distance perturbations), limiting their generalizability and the ability of modeling efforts to make generalizable inferential predictions.
• No synchronization between kinematic and EMG data: The available datasets offer either kinematic38,39 or EMG41,42 data, but not synchronized kinematic and EMG data.
This dataset overcomes the above-mentioned limitations. It is our hope that the data will be useful for modeling coordinated reach-to-grasp movements in both basic and applied research. The present dataset consists of a Matlab/GNU Octave data structure (.mat) with kinematic and EMG data (including the maximal voluntary contraction, or MVC, for each muscle from which EMG was recorded). A separate .csv file contains sex, age, anthropometry, and laterality data for all participants.

Methods
Participants and ethical requirements. Ten adults (eight men and two women; mean ± 1 s.d. age: 22.5 ± 6.0 years; right-handed) participated in the size-perturbation study, and ten adults (eight men and two women; mean ± 1 s.d. age: 25.3 ± 6.4 years; right-handed) participated in the distance-perturbation study. The participants were free of any muscular, orthopedic, or neurological health concerns. The participant pool comprised undergraduate and graduate students at Northeastern University. Participants were offered $10 per hour for participation. Each participant provided verbal and written consent approved by the Institutional Review Board (IRB) at Northeastern University. Some participants had previously participated in reach-to-grasp studies in our hf-VE; however, none reported extensive experience in virtual reality (e.g., gaming, simulations). To ensure adequate familiarization with the reach-to-grasp task in a virtual environment, all participants completed a training block of 120 reach-to-grasp trials [24 trials per object size: small (w × h × d = 3.5 × 8 × 2.5 cm), small-medium (4.5 × 8 × 2.5 cm), medium (5.5 × 8 × 2.5 cm), medium-large (6.5 × 8 × 2.5 cm), and large (7.5 × 8 × 2.5 cm) objects placed at 30 cm; or 24 trials per object distance: the medium object placed at near (20 cm), near-middle (25 cm), middle (30 cm), middle-far (35 cm), and far (40 cm) distances]. If participants felt comfortable after 60 trials, the training block was terminated and the experimental trials began; otherwise, participants completed all 120 training trials.
Reach-to-grasp task, virtual environment, and kinematic/kinetic measurement. Participants reached to grasp virtual objects of different sizes and placed at different distances from the starting position in an immersive hf-VE developed in UNITY (ver. 5.6.1f1, 64-bit, Unity Technologies, San Francisco, CA) and delivered via an Oculus head-mounted display (HMD; Rift DK2, Oculus Inc., Menlo Park, CA; Fig. 1) using HANDoVR (Movement Neuroscience Laboratory, Northeastern University, Boston, MA). An eight-camera motion tracking system (sampling rate: 75 Hz; PPT Studio N, WorldViz Inc., Santa Barbara, CA) recorded the 3D motion of IRED markers attached to the participants' wrist (at the center of the segment running between the ulnar and radial styloid processes) and to the tips of the thumb and index finger. A pair of IRED markers was attached to the HMD to co-register the participant's head motion to the virtual environment. Participants viewed the thumb and index fingertips as two 3D spheres (green, 0.8 cm diameter) in the hf-VE, reflecting the 3D position of the respective IRED markers. The schedule of trials, virtual renderings of objects, and timing/triggering of perturbations were controlled using custom software developed in C# and Python.
EMG recordings. EMG activity (in µV) was recorded from the following ten muscles of the shoulder, arm, and hand on each participant's dominant right side: first dorsal interosseous (FDI), flexor digitorum superficialis (FDS), extensor digitorum communis (EDC), extensor indicis (EI), abductor pollicis brevis (APB), extensor pollicis brevis (EPB), biceps brachii (BB), triceps brachii (TB), anterior deltoid (AD), and posterior deltoid (PD).
EMG was recorded using a Delsys Trigno wireless EMG system (sampling rate: 1 kHz; Delsys Inc., Natick, MA). Surface EMG sensor bars were attached perpendicular to the muscle fibers over the muscle belly. Excess hair was shaved, and the skin was prepped/cleaned with isopropyl alcohol pads before attaching the sensors to reduce skin impedance. Proper positioning of the EMG sensors was ensured by palpating the muscle during sustained isometric contraction and by visual confirmation of the EMG signal. EMG activity during MVC was saved and is included in the dataset. Kinematic and EMG data were synchronized using a 5 V digital output (10 ms) sent from Unity and recorded as an analog signal synchronously with EMG.

Fig. 1 Using the pincer grip, participants reached to grasp virtual objects of different sizes, placed at different distances from the initial position of their thumb and index finger, in an immersive haptic-free virtual environment delivered via an Oculus head-mounted display. Instantaneous perturbations of object size and distance were randomly applied at 100 ms, 200 ms, or 300 ms after movement onset (i.e., the moment the start switch, depicted by the solid yellow circle, was released).

Synchronization between EMG and kinematic data. Details of the synchronization between the EMG and kinematic data are as follows: EMG data were collected using custom Matlab software communicating with a multifunctional I/O device (NI6255; National Instruments Inc., Austin, TX). Analog data from the Delsys wireless EMG system were streamed to the NI6255 with a known, constant 47 ms delay. Kinematic data were collected using the C#- and Unity-based HANDoVR software described above, which communicated with a second National Instruments multifunctional I/O device (NI6211). Upon detecting switch release, HANDoVR triggered a 5 V digital output (10 ms) from the NI6211.
This digital output was connected to an analog input channel on the NI6255 and recorded into Matlab. EMG and Kinematic data sets were aligned via the start switch trigger recorded digitally in HANDoVR, and the analog reading of the digital output sent from HANDoVR to Matlab. Misalignment of the kinematic data and EMG was constrained to the sampling period of the kinematic data (~11.1 ms), as we were unable to estimate when in the inter-sample period the digital output was sent with respect to when the motion capture sensors were read. Hardware delays were tested to be less than the sampling period of the EMG recording (1 ms).
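As a concrete illustration of this alignment, the sketch below (Python/NumPy; the function name and logic threshold are our own, not part of the released code) crops an EMG recording so that sample 0 coincides with switch release, compensating for the constant 47 ms streaming delay of the Delsys system:

```python
import numpy as np

# Constants from the setup described above (values as reported in the text).
EMG_FS = 1000            # Hz, Delsys Trigno sampling rate
DELSYS_DELAY_S = 0.047   # constant streaming delay of the analog EMG data

def align_emg_to_onset(trigger_channel, emg, threshold=2.5):
    """Crop EMG so that sample 0 coincides with switch release.

    trigger_channel: analog recording of the 5 V sync pulse, sampled
    synchronously with `emg` at EMG_FS; `emg` is (n_samples, n_muscles).
    """
    # First sample at which the recorded pulse crosses the logic threshold.
    trigger_idx = int(np.argmax(trigger_channel > threshold))
    # The EMG stream arrives 47 ms late relative to the trigger, so muscle
    # activity at switch release appears 47 ms *later* in the recording.
    onset_idx = trigger_idx + int(round(DELSYS_DELAY_S * EMG_FS))
    return emg[onset_idx:]
```

Any residual misalignment then stems only from the kinematic inter-sample uncertainty discussed above.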

Condition name | No perturbation | Object size S [cm] | Object distance D [cm] | # Trials
Schedule of trials and visual perturbations of object size and distance. Participants were tested in a single experimental session that lasted up to 180 min. Before data collection, participants were allowed to practice the unperturbed reach-to-grasp task until they felt comfortable with it. The experiment then began, consisting of a total of 960 reach-to-grasp trials (480 no-perturbation and 480 perturbation trials), each trial lasting 3.5 s. The trials were conducted over four sessions of 240 trials each (120 no-perturbation and 120 perturbation trials). The order of no-perturbation and perturbation trials was randomized differently in each session, and we ensured that each type of no-perturbation and perturbation trial was evenly distributed across the four sessions. A 5-min break was given after each block, and whenever the participant expressed a need for one.

Table 3. Order of kinematic features (from top to bottom).

Fig. 2 Mean temporal profiles of transport and aperture kinematics for the control (no perturbation) and a size/distance-perturbation condition (perturbation applied at 300 ms after movement onset) for a representative participant. Blue arrows indicate the kinematic features listed in Table 3.
Size perturbation: Table 1 tabulates the breakdown of 960 trials.
The 480 perturbation trials were evenly distributed among ten possible combinations of object size changes such that the object's width increased from the object's initial size to a larger size (48 trials per perturbation type). The perturbation types included: small (S) to small-medium (SM), small to medium (M), small to medium-large (ML), small to large (L), small-medium to medium, small-medium to medium-large, small-medium to large, medium to medium-large, medium to large, and medium-large to large (all perturbations from smaller to larger objects). Each perturbation type was applied at three different latencies: 100 ms after movement onset, 200 ms after movement onset, and 300 ms after movement onset, resulting in 16 trials for each perturbation type applied at each of the three timings. The 16 trials for each perturbation type and timing were evenly distributed across the four blocks (four trials per block).
The 480 perturbation trials were evenly distributed among ten possible combinations of object distance changes such that the object's distance increased from the object's initial location to a farther location (48 trials per perturbation type). The perturbation types included: near (N) to near-middle (NM), near to middle (M), near to middle-far (MF), near to far (F), near-middle to middle, near-middle to middle-far, near-middle to far, middle to middle-far, middle to far, and middle-far to far (all permutations from closer to farther distances). Each perturbation type was applied at three different latencies: 100 ms after movement onset, 200 ms after movement onset, and 300 ms after movement onset, resulting in 16 trials for each perturbation type applied at each of the three timings. The 16 trials for each perturbation type and timing were evenly distributed across the four blocks (four trials per block). The reach-to-grasp animation of representative conditions (control, size and distance perturbations with 100 and 300 ms latencies) is available on the Figshare 43 .
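For either perturbation axis, the ten perturbation types are exactly the ordered pairs of the five levels taken from a smaller/nearer level to a strictly larger/farther one. A short Python sketch (variable names are illustrative) enumerates them and checks the trial bookkeeping described above:

```python
from itertools import combinations

# Size levels in increasing order; the distance levels (N, NM, M, MF, F)
# follow the same scheme.
levels = ["S", "SM", "M", "ML", "L"]

# Every perturbation goes from a smaller level to a strictly larger one,
# so the ten types are exactly the C(5, 2) ordered pairs.
perturbation_types = list(combinations(levels, 2))

# Trial bookkeeping described above: 10 types x 3 timings x 16 repetitions.
timings_ms = (100, 200, 300)
total_perturbation_trials = len(perturbation_types) * len(timings_ms) * 16
```

This reproduces the 480 perturbation trials per study (10 × 3 × 16).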
Procedures and instructions to participants. Each participant was seated in a chair with their right hand placed on a table in front of them (Fig. 1). At the start position, the thumb and index finger straddled a 1.5 cm wide wooden peg located 12 cm in front of and 24 cm to the right of the sternum, with the thumb depressing a start switch. Lifting the thumb off the switch marked movement onset. A digital transistor-transistor logic (TTL) signal connected to the start switch was used to synchronize the kinematic and EMG recordings. In each trial, the following events occurred: (1) Participants depressed the start switch to begin the trial. (2) The object appeared in the hf-VE, oriented at a 75° angle along the vertical axis to minimize excessive wrist extension during reach-to-grasp. (3) After 1 s, an auditory cue (a beep) signaled the participants to reach for, grasp, and lift the object with a 1.2 cm combined error margin44. The object was considered grasped when both 3D spheres (reflecting the tips of the thumb and index finger) had come into contact with the lateral surfaces of the virtual object. (4) Once grasp of the virtual object was detected, the object changed color from blue to red and a 'click' sound was presented. (5) Participants briefly lifted the object before returning their hand to the starting position, after which the next trial began.
Instructions to the participant were: "Each trial will start once the thumb depresses the start switch (the correct initial position of the hand was demonstrated). Following the beep, reach to and grasp the narrow sides of the object between the thumb and index finger using a pincer grip (demonstrated). When the object is grasped, it will turn from blue to red and a 'click' sound will be presented. Lift the object briefly until the object disappears, and return your hand to the start position. On some trials, the object may change size or position, requiring you to adjust your movements. A break will be provided after each block of 240 trials, but you may rest at any point between trials within a block by not depressing the start switch to begin the next trial. Do you have any questions?"
Data processing and kinematic feature extraction. The raw data included the x, y, and z marker positions of the wrist, thumb, and index finger with associated timestamps (75 Hz; this raw data is not provided in the data records). All position data were analyzed offline using custom Matlab code. The time series for each trial were cropped from movement onset (the moment the switch was released) to movement offset (the moment the collision-detection criteria were met) and resampled at 100 Hz using the interp1() function in Matlab. Transport distance (the straight-line distance of the wrist marker from the starting position in the transverse plane) and aperture (the straight-line distance between the thumb and index finger markers in the transverse plane) were computed for each trial. The first and second derivatives of transport displacement and aperture were computed to obtain the velocity and acceleration profiles for kinematic feature extraction. A 6 Hz, fourth-order low-pass Butterworth filter was applied to all time series.
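The transport and aperture computations described above can be sketched as follows (in Python/NumPy rather than the authors' Matlab code; which two axes span the transverse plane is an assumption here, as the convention depends on the capture setup):

```python
import numpy as np

def transport_and_aperture(wrist, thumb, index, t, fs_out=100):
    """Transport distance and aperture, resampled to a uniform fs_out grid.

    wrist, thumb, index: (n, 3) marker positions; the first two columns are
    assumed here to span the transverse plane (the actual axis convention
    depends on the capture setup). t: (n,) timestamps in seconds.
    """
    # Transport: straight-line distance of the wrist from its start position.
    transport = np.linalg.norm(wrist[:, :2] - wrist[0, :2], axis=1)
    # Aperture: distance between the thumb- and index-fingertip markers.
    aperture = np.linalg.norm(thumb[:, :2] - index[:, :2], axis=1)
    # Linear interpolation onto a uniform grid (analogous to Matlab's interp1).
    t_new = np.arange(t[0], t[-1], 1.0 / fs_out)
    return np.interp(t_new, t, transport), np.interp(t_new, t, aperture)
```

Velocity and acceleration profiles would then follow by numerical differentiation of these resampled series.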
Trials in which participants did not move, were delayed in moving, or made inappropriate movements were excluded from the database (i.e., bad trials). A link to the GitHub repository with the custom code used to generate the data is provided in the Code Availability section. For each trial, the kinematic features listed in Table 3 (units in parentheses) were extracted from the filtered time-series data (Fig. 2).
Before data collection, the maximal voluntary contraction (MVC) of each muscle was obtained. Muscle activation was recorded during each reach-to-grasp movement for 3.5 s. The data files accompanying this dataset contain MVCs and raw EMG beginning 500 ms before movement onset. EMG data from one participant (P6, size perturbation only) were not saved correctly due to technical issues.
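Because the dataset ships raw EMG together with per-muscle MVC recordings, users can normalize EMG to MVC themselves. The sketch below shows one common convention (rectify, smooth, divide by the peak MVC envelope); it is illustrative, not the authors' prescribed pipeline:

```python
import numpy as np

def envelope(x, fs=1000, win_ms=50):
    """Rectify and smooth with a rectangular window (illustrative envelope)."""
    win = int(fs * win_ms / 1000)
    return np.convolve(np.abs(x), np.ones(win) / win, mode="same")

def normalize_to_mvc(emg_trial, emg_mvc, fs=1000):
    """Express a trial's EMG envelope as a fraction of the peak MVC envelope.

    This is one common convention, not the authors' prescribed pipeline.
    """
    return envelope(emg_trial, fs) / envelope(emg_mvc, fs).max()
```

The window length and envelope method are free choices; low-pass filtering of the rectified signal is an equally common alternative.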

Data Records
All data are made available on Figshare45. The ten participants for each perturbation type are identified in alphanumeric format (P#) within the folders "SizePert" and "DistPert" for object size and distance perturbations, respectively. The deidentified participant information (sex, age, body mass, and height) is stored in the Excel file named "Participants". Kinematic and EMG data are grouped into subject-specific folders, each bearing the participant's alphanumeric code (e.g., P5 for the fifth participant). Within each participant folder, there are five .mat files: 1) raw data; 2) resampled data; 3) kinematic profiles of position, velocity, and acceleration for both transport and aperture; 4) kinematic features; and 5) MVC and raw EMG. Figure 3 illustrates the Matlab structure in which files 2-5 are saved, with the row and column vectors shown in red (also see Tables 3 and 4).
• Raw_Data.mat: this file contains nine arrays:
• Trial_Number: 1 × 960 cell array with each cell containing the trial number of the respective trial, in the order in which the trials were conducted.
• Trial_Status: 1 × 960 cell array with each cell indicating whether the respective trial was a bad trial, i.e., one in which the participant did not move, was delayed in moving, or made an inappropriate movement.
• Condition_Names: 1 × 960 string array of condition names indicating the size and distance of the object, the perturbation, and the timing of the perturbation (see Table 4).
• Variable_Names: 1 × 10 string array of variable names and measurement units corresponding to each column in the data matrix of each trial in the Raw_Trajectories array.
• Raw_Trajectories: 1 × 960 cell array with the nth column corresponding to the nth trial. Each cell contains a t × 10 matrix, where t is the number of samples captured by the motion capture system at 75 Hz for the respective trial. The ten columns of this matrix correspond to the time stamp and the x-, y-, and z-coordinates of the wrist, thumb, and index finger markers (see Variable_Names).
• Onset: 1 × 960 numeric array with each cell containing the timing of movement onset for the respective trial, based on switch release.
• Offset: 1 × 960 numeric array with each cell containing the timing of movement offset for the respective trial, based on collision detection.
• Corrected_Onset: 1 × 960 numeric array with each cell containing the manually selected timing of movement onset for the respective trial, used to crop data during post-processing. Each trial was visually inspected in the post-processing stage. Movement onset was corrected if it was delayed (i.e., exceeded 3% of peak aperture) or untimely marked (i.e., there was no change in aperture for >2 samples).
Onset was corrected only for control trials.
• Corrected_Offset: 1 × 960 numeric array with each cell containing the manually selected timing of movement offset for the respective trial, used to crop data during post-processing. Each trial was visually inspected in the post-processing stage. Movement offset was corrected if it did not fall and remain below 3% of peak transport velocity.
• Trajectories.mat: this file contains four arrays:
• Resampled: 35 × 1 cell array with each row corresponding to a different condition (see Condition_Names). Each cell of this array contains a 1 × nTrials cell array (nTrials = number of trials), with each cell containing data for an individual trial in a 200 × 10 matrix. The columns of this matrix correspond to the time stamp and the x-, y-, and z-coordinates of the wrist, thumb, and index finger markers (see Variable_Names).
• Condition_Names: 35 × 1 string array of condition names indicating the size and distance of the object, the perturbation, and the timing of the perturbation (see Table 4).
• Variable_Names: 1 × 10 string array of variable names and measurement units corresponding to each column in the data matrix for each trial.
• Features.mat: this file contains three arrays:
• Features: 18 × 1 cell array with each row corresponding to a different kinematic feature (see Feature_Names). Each cell of this array contains a 35 × 1 cell array with each row corresponding to a different condition (see Condition_Names). Each cell of that array is a 1 × nTrials matrix with the columns containing data for individual trials.
• Feature_Names: 18 × 1 string array with the names and measurement units of the kinematic features (see Table 3).
• Condition_Names: 35 × 1 string array of condition names indicating the size and distance of the object, the perturbation, and the timing of the perturbation (see Table 4).
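To clarify the nested indexing of Features.mat (which, in Python, could be loaded with scipy.io.loadmat), the following sketch builds a mock with the same layout; all trial counts and values here are synthetic:

```python
import numpy as np

N_FEATURES, N_CONDITIONS = 18, 35   # rows of Features and Condition_Names

# Mock of the nested Features.mat layout (a stand-in for what loading the
# actual file would yield): the outer level indexes kinematic features,
# the middle level conditions, and the innermost 1 x nTrials vector holds
# one value per trial of that condition.
rng = np.random.default_rng(0)
n_trials = [int(rng.integers(10, 17)) for _ in range(N_CONDITIONS)]
features = [[rng.normal(size=n) for n in n_trials] for _ in range(N_FEATURES)]

# Indexing convention: features[f][c][k] -> feature f, condition c, trial k.
```

Trajectories.mat and EMG.mat follow the same condition-first nesting, with per-trial matrices instead of scalars at the innermost level.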

Fig. 10 Normalized EMG for each of the ten muscles in the control (no perturbation) and a distance-perturbation condition (perturbation applied at 300 ms after movement onset) for a representative trial. TS: trial start; SR: switch release.

• EMG.mat: this file contains five arrays:
• MVC: 4000 × 11 matrix with the columns corresponding to the time stamp and the EMG activity in each of the ten recorded muscles (see Variable_Names).
• Variable_Names: 1 × 11 string array of variable names and measurement units corresponding to each column in the data matrix for each trial.
• Raw_EMG: 35 × 1 cell array with each row corresponding to a different condition (see Condition_Names). Each cell of this array contains a 1 × nTrials cell array (nTrials = number of trials), with each cell containing data for an individual trial in a 4000 × 11 matrix. The columns of this matrix correspond to the time stamp and the EMG activity in the ten recorded muscles (see Variable_Names).
• Condition_Names: 35 × 1 string array of condition names indicating the size and distance of the object, the perturbation, and the timing of the perturbation (see Table 4).

Technical Validation
Kinematic data: Effect of perturbations on reach-grasp coordination. Figures 4 and 5 show plots of the mean transport distance and aperture for the control (no perturbation) and a selected set of size- and distance-perturbation conditions, respectively, for a representative participant. Figures 6-9 show phase plots of mean transport and aperture kinematics for the control and a selected set of size- and distance-perturbation conditions (perturbations applied at 100 ms, 200 ms, and 300 ms after movement onset) for a representative participant.
As we presented in our previous work46, phase plots allow one to distinguish the three phases of reach-to-grasp coordination: (i) the Initiation Phase, which includes the initial acceleration of transport velocity and the first half of hand opening, beginning with the rapid opening of the thumb and index finger; (ii) the Shaping Phase, which begins at maximum transport velocity and includes the first half of transport deceleration and the second half of hand opening, ending when maximum aperture is achieved, marking the initiation of closure (closure onset, CO); and (iii) the Closure Phase, which includes the second half of transport deceleration and lasts until the object is grasped. Finally, Fig. 10 shows normalized EMG for each of the ten muscles in the control and a distance-perturbation condition (perturbation applied at 300 ms after movement onset) for a representative trial. These figures provide a glimpse into the qualitative effects of visual perturbations of object size and distance on reach-grasp coordination.
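The phase boundaries defined above can be recovered from two landmarks, peak transport velocity and peak aperture. The sketch below (Python/NumPy, with synthetic bell-shaped profiles standing in for real trials) illustrates the segmentation:

```python
import numpy as np

def segment_phases(transport_velocity, aperture):
    """Indices bounding the three phases described above.

    Initiation: movement onset -> peak transport velocity.
    Shaping:    peak transport velocity -> peak aperture (closure onset, CO).
    Closure:    peak aperture -> object grasped.
    """
    return int(np.argmax(transport_velocity)), int(np.argmax(aperture))

# Synthetic bell-shaped profiles standing in for real trial data.
t = np.arange(100)
velocity = np.exp(-((t - 40) / 15.0) ** 2)
aperture = np.exp(-((t - 60) / 20.0) ** 2)
peak_vel_idx, closure_onset_idx = segment_phases(velocity, aperture)
```

On real, noisy profiles the argmax landmarks would be taken from the filtered series provided in the dataset.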
It has been firmly established that the aperture component of the reach-to-grasp movement is influenced by the object's physical dimensions, while the transport component remains relatively unaffected by changes in object size8,47. In contrast, the transport component is influenced by the object's spatial location (i.e., its distance from the observer) and by precision requirements due to object size, while the aperture component remains relatively unaffected by changes in object distance7,48. Hence, visual perturbations of object size evoke online adjustments in grasp aperture, and visual perturbations of object distance evoke online adjustments in transport velocity. Accordingly, to ensure that all applied visual perturbations of object size and distance influenced the aperture and transport components in known ways, we examined whether the size and distance perturbations influenced peak aperture and peak transport velocity. To this end, movement time, peak aperture, and peak transport velocity were compared between each of the thirty size- and distance-perturbation conditions and the respective unperturbed condition (e.g., peak aperture for S→SM, S→M, S→ML, and S→L was each compared to peak aperture for the S condition). The observed effects were consistent with findings from past studies conducted in the real world49. Importantly, final aperture was always scaled to object size. These trends provide a strong validation of the expected responses to perturbations of object size and distance during reach-to-grasp.
EMG data: Spectral properties of EMG signals. Each recorded EMG signal was validated via analysis of its spectral properties, which were compared with known results from the literature. For each muscle and each trial, the power spectral density was calculated using Welch's method with a Hann window of 1024 samples (i.e., 1024 ms) and 50% overlap. An example is presented in Fig. 11 for the EMG signal obtained during the MVC test, for both an inactive and an active muscle. Power was normalized to the maximum power on the respective trial and averaged across all trials and subjects for each muscle. Figures 12 and 13 show the mean normalized power spectral densities for EMG collected in the size- and distance-perturbation conditions, respectively. Signal energy was primarily contained within 0-400 Hz, which is typical for EMG51,52. Power-line noise (60 Hz) or its harmonics was observed in a minority of muscles. This artifact was narrow-band and is amenable to standard filtering procedures. In a minority of muscles, a second artifact was observed at ~74 Hz. We cannot explain this artifact, but it is also narrow-band, consistent, and amenable to filtering using a band-stop filter.
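For users reproducing this validation outside Matlab, the Welch estimate described above (Hann window of 1024 samples, 50% overlap) can be sketched with NumPy alone; scipy.signal.welch provides the same functionality, and this minimal re-implementation is only illustrative:

```python
import numpy as np

def welch_psd(x, fs=1000, nperseg=1024, overlap=0.5):
    """Welch PSD: average of windowed, overlapping periodograms."""
    step = int(nperseg * (1 - overlap))
    window = np.hanning(nperseg)
    scale = fs * np.sum(window ** 2)
    segments = []
    for start in range(0, len(x) - nperseg + 1, step):
        seg = x[start:start + nperseg] * window
        segments.append(np.abs(np.fft.rfft(seg)) ** 2 / scale)
    psd = np.mean(segments, axis=0)
    psd[1:-1] *= 2           # one-sided spectrum correction
    freqs = np.fft.rfftfreq(nperseg, 1.0 / fs)
    return freqs, psd
```

Applied to a trial's EMG, the 60 Hz line-noise and ~74 Hz artifacts mentioned above would appear as narrow peaks in the returned PSD, which can then be suppressed with a band-stop filter.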

Usage Notes
A major strength of the present dataset is that it provides reach-to-grasp kinematics and EMG data for a large number of combinations of object size and distance, and a large number of perturbations of object size and distance applied at three different times during the movement. Numerous examples of face validity (the degree to which the data appear to measure what was intended to be measured) are readily apparent. For example, peak aperture increased with object size, peak velocity increased with object distance, and perturbed movements generally showed extended movement times compared to the analogous controls. However, the present dataset is also limited in several ways, mostly pertaining to our choice of object type, grasp type, and kinematic recording. First, we used only one object type (a cuboid), whereas everyday reach-to-grasp movements involve diverse objects, often asymmetrical in shape. Second, participants reached to and grasped objects using the pincer grip, which involves only the thumb and index finger and does not capture the full diversity of grasping movements associated with grasping the same object or objects of different size or shape53. Finally, we attached markers only to the wrist, thumb, and index finger, which does not capture the hand's joint-angle movements. These factors might limit the range of potential uses of the present dataset, but they should not preclude the modeling of reach-to-grasp movements.

Code availability
The code used for post-processing of the kinematic data is available at https://github.com/tuniklab/scientific-data.