Abstract
This work presents a database of human hand kinematics containing data collected during the performance of a wide variety of activities of daily living involving feeding and cooking. The data were recorded using CyberGlove instrumented gloves on both hands, measuring 18 degrees of freedom on each hand. A total of 20 subjects participated in each part of the experiment, and the objects and their arrangement were the same across subjects, although the subjects performed the tasks in a natural, non-directed way. The dataset contains a total of 1160 continuous calibrated recordings, sampled at 100 Hz during the performance of the tasks, with the signals low-pass filtered. Statistical descriptive analyses of these data are presented. This database can be useful for machine learning purposes and prosthesis control, as well as for the characterization of healthy human hand kinematics.
Design Type(s) | modeling and simulation objective • observation design |
Measurement Type(s) | hand kinematics |
Technology Type(s) | sensor |
Factor Type(s) | biological sex • handedness |
Sample Characteristic(s) | Homo sapiens • manus |
Machine-accessible metadata file describing the reported data (ISA-Tab format)
Background & Summary
The hand is a complex system, with many degrees of freedom (DoF), that enables humans to perform a large variety of grasping and manipulation actions required in activities of daily living (ADL), using a wide range of objects. Hand kinematics is being studied for purposes such as characterizing healthy hand movement patterns1, assessing patients’ abilities2 or the effect of object design on grasping3. Furthermore, with the rise in robotics and prosthetics, it has become crucial for the development of anthropomorphic systems4. For these purposes, and because of the versatility of the hand, a large amount of kinematic data (for all hand DoF) is needed to cover the interaction with the different objects used in different environments. Continuous recording of kinematics is essential to characterize the range of motion and velocities required for the different phases of reaching, grasping, manipulating and releasing. Moreover, data presented as anatomical angles are more meaningful and facilitate the comparison of data from different experiments independently of the motion capture system used. In this sense, several researchers5 have pointed out the importance of high-quality open-access datasets of grasping data, while also highlighting the need to compile, classify and standardize these data.
The Hand Corpus open repository (http://www.handcorpus.org) was created to pursue these goals, as it allows scientists to share grasping and manipulation data collected using different motion capture technologies. Nevertheless, the datasets in this repository, as well as the other datasets in the literature, present some weaknesses regarding their usability for machine learning, hand kinematics characterization or clinical evaluation. Some datasets are limited in the amount of data presented, are restricted to grasp type classification6,7 or consider hand kinematics from just three markers on the hand8. Furthermore, datasets with several DoF present other limitations:
- Tasks: Only reaching and grasping movements9,10,11,12,13,14, static grasp postures9,10,12,14,15,16,17 or exploratory/haptic tasks18 were recorded during product manipulation. These tasks lack representativeness of ADL, not only because of the limited range of activities considered but also because subjects performed the tasks following precise instructions.
- Objects used: Some of the datasets recorded tasks simulating the use of objects, but without using any actual object11,15,16,17,19,20.
- Type of data presented: Some datasets only provide raw data from the motion capture system (cameras or gloves)12,19,20 instead of offering anatomical angles.
- Number of subjects: Some of those datasets provide data from only one subject9,11,15,16,17,19,20.
- Number of hands studied: All the datasets cited studied only the subjects' dominant hand.
Table 1 shows an overview of different datasets focused on hand kinematics and their characteristics.
In this paper we present the KINE-ADL BE-UJI Dataset21, which contains a total of 1160 recordings with anatomical angles of both hands while performing feeding and cooking activities using a large variety of products. Experiments were performed by 20 healthy subjects wearing CyberGlove instrumented gloves on both hands, with 18 DoF recorded for each hand at a frequency of 100 Hz. The main contribution of this dataset compared to others is the variety of objects used (66 objects), the in-depth study of representative feeding and cooking tasks (58 tasks, divided into 178 actions) and the freedom given to the subjects to perform the tasks. Moreover, the data were collected from both hands, which allows the study of hand coordination. It is also worth noting that the sample of subjects was selected so as to be representative of the healthy adult population (with a controlled proportion of laterality and gender). Furthermore, the data presented are standardized, as they are expressed as anatomical angles following the ISB sign criteria22. The dataset consists of a Matlab/GNU Octave data structure (.mat) (also provided in .csv format) with kinematic data and data about the subjects recruited (age, gender, laterality, weight, height, hand length, hand width and active range of motion (AROM) measured for each DoF). This .mat file is accompanied by a guide detailing the environment, tasks, objects, data acquisition system and file structure, thereby allowing the classification of information regarding these parameters.
Methods
Study participants
The study consisted of two experiments (A and B), with 20 subjects (10 males, 10 females) participating in each experiment. Fifteen subjects participated in both experiments, so the total number of subjects recruited was 25. In both experiments, two of the 20 subjects were left-handed. The mean age of the subjects recruited was 35.5 ± 7.67 years in experiment A and 38.05 ± 9.52 years in experiment B. The criteria used to select subjects were gender parity in the overall data, age between 20 and 65, no reported upper limb pathologies and laterality representative of the overall population (20% of data from left-handed individuals). Before the experiments, all participants gave their written informed consent. All the experiments were performed in accordance with the requirements of the Ethics Committee of the Universitat Jaume I.
Acquisition setup
Instrumentation
Data acquisition was performed using two CyberGlove (CyberGlove Systems LLC) instrumented gloves (CyberGlove II on the right hand and CyberGlove III on the left hand) connected to a laptop. Each of these gloves has 18 strain gauges that allow the anatomical angles of the underlying joints to be determined. The angle rotated by each joint with respect to the reference posture (hands resting flat on a table, with the fingers and thumb close together, and the middle fingers aligned with the forearms) is calculated from these signals according to a previously validated calibration protocol23. Furthermore, all the experiments were recorded on video, so that the performance of the task could be checked later when required.
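The conversion from gauge signals to anatomical angles follows the validated calibration protocol of a previous work23. As a rough sketch of the general idea only (not the actual protocol): each sensor reading is scaled by a calibration gain relative to the reading in the reference posture, and cross-coupled sensors are corrected by subtracting the coupled joint's contribution. All function names and numbers below are hypothetical illustrations:

```python
def raw_to_angle(raw, raw_ref, gain):
    """Angle (deg) rotated with respect to the reference posture:
    a per-sensor gain applied to the reading offset."""
    return gain * (raw - raw_ref)

def correct_cross_coupling(angle, coupled_angle, coupling_coeff):
    """Subtract the contribution that a neighbouring joint induces
    on this sensor (cross-coupling correction)."""
    return angle - coupling_coeff * coupled_angle

# Hypothetical values, for illustration only:
raw, raw_ref, gain = 142.0, 120.0, 0.5   # reading, reference reading, deg/unit
angle = raw_to_angle(raw, raw_ref, gain)              # 11.0 deg
corrected = correct_cross_coupling(angle, 20.0, 0.1)  # 9.0 deg
```

The actual gains and cross-coupling corrections are those determined by the protocol in ref. 23 and implemented in the Matlab code released on Zenodo28.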
Environment
The tasks were performed in a laboratory, within an environment that simulated a kitchen (Fig. 1), composed of: a refrigerator (Scenario 1), a high cabinet (Scenario 2), shelves (Scenario 3), a small worktop (Scenario 4), a sink and a rubbish bin (Scenario 5), a large worktop (Scenario 6), a low cabinet with a drawer in its upper part and shelves in the lower part, which has a door (Scenario 7), a table and a chair (Scenario 8) and an oven (Scenario 9).
Objects
A total of 66 objects were used to perform the tasks in the experiments (further information regarding their characteristics can be found in the guide attached to the dataset). The objects were chosen so as to be representative of those most commonly used in cooking and feeding tasks, and were checked to ensure they covered the cooking and feeding objects from the Yale-CMU-Berkeley Object and Model Set24, proposed by Calli et al. Some of the objects used were not real, in order to prevent the gloves from getting stained or wet. For example, the eggs to be broken had been previously emptied through a small hole made in the shell. All liquids were replaced by water, and materials such as flour or sugar that could have stained the gloves were replaced by durum wheat semolina. Pieces of polystyrene or cardboard were used to simulate biscuits, bread or crisps. The initial location of the objects in each scenario can be found in the detailed guide attached to the database. Figure 2 shows an overview of the objects used.
Acquisition protocol
The main dimensions of the hands were measured before helping the subject to put on the instrumented gloves following the manufacturer’s instructions. Participants were given clear instructions about how to perform the task, and they were told to start and end each task in the same posture: hands lying relaxed at both sides of the body for tasks performed in a standing posture, and hands lying relaxed on the table when sitting. While carrying out each task, the operator marked (or labelled) the time stamp of some specific events (using the glove software) that were later used to separate different phases or actions.
Recorded tasks
Two experiments (A and B) were performed. In experiment A, the activities performed were: preparing and having breakfast, baking a cake and cooking omelets. In experiment B, the activities were: setting the table, clearing the table and washing the dishes, making coffee and preparing a simple meal, considering the whole process of performing each task (taking the products from the different scenarios, transporting them, opening/using them and, in some cases, putting them back in their place). Furthermore, all these tasks were separated into different recordings (e.g. using the toaster or pouring and drinking milk), and these recordings were also separated into different elementary tasks (e.g. object grasping, manipulation such as opening tins/jars/bottles, transportation of objects, pouring liquid/solid substances, eating/drinking and other relevant actions). Therefore, experiment A was divided into 33 recordings and experiment B consisted of 25 (a description of all the recordings can be seen in Tables 2 and 3). Further information regarding the elementary tasks considered in each recording can be found in the guide attached to the dataset.
Some of the recordings were performed with the subject standing and others while sitting on a chair (as specified in Tables 2 and 3). Only the eating or drinking activities were simulated, by just bringing the food close to the mouth, and this has been indicated in the task description. The rest of the tasks were performed with realistic objects, and subjects were free to perform the tasks in the way they preferred.
Elementary tasks
As mentioned previously, each of the recordings (R) (33 recordings in experiment A and 25 in B) is composed of different elementary tasks. For example, in the activity of having breakfast (consisting of 11 records, as seen in Table 2), record R = 106 (pouring and drinking milk) is composed of four elementary tasks: opening the carton, pouring, closing the carton and drinking (see Table 4). For an unambiguous identification of each of the tasks, a unique ID was assigned to each elementary task, with a total of 99 elementary tasks in experiment A and 79 in B (178 elementary tasks altogether). All the elementary tasks involved grasping or manipulating a product or element with the hands, except for some cases where the subject moved without handling anything, which were labelled as “Displacement without manipulation”. For each elementary task, the record covers all time instants from the moment the object was grasped until it was released. In those cases in which the object was released in a specific place or transported to a specific location in the scenario, this location is specified in the description of the elementary task. In all other cases, the release was performed on the surface closest to the subject (table, worktop, etc.).
Active range of motion (AROM)
After performing all the experiments, subjects were asked to perform a set of postures25 in order to measure their AROM of the joints of both hands, which are presented in the .mat file, where subject information is also provided.
Signal processing
Angles calculation
Joint angles were calculated from raw data collected according to the calibration protocol proposed in previous works23. This protocol includes the determination of gains and also some corrections because of cross-coupling effects for specific anatomical angles. The anatomical angles obtained according to the protocol are those shown in Fig. 3.
Data cutting and splitting
The initial and final instants of each record, in which the hands were static, were trimmed. The records were then separated into the different elementary tasks as detailed in the dataset guide by using the labelling performed by the operator while recording the data. In some specific cases in which labelling data was missing, labelling was performed using the video recordings.
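The separation of a continuous record into elementary tasks at the operator-marked labels can be pictured as follows. The function, array shapes and label positions are illustrative assumptions, not the actual processing code:

```python
import numpy as np

def split_by_labels(recording, label_samples):
    """Split a continuous recording (n_samples x n_dof) into elementary
    tasks at the operator-marked event samples."""
    bounds = [0] + sorted(label_samples) + [len(recording)]
    return [recording[start:end] for start, end in zip(bounds[:-1], bounds[1:])]

recording = np.zeros((1000, 18))                # e.g. 10 s at 100 Hz, 18 DoF
tasks = split_by_labels(recording, [250, 600])  # segments of 250, 350, 400 samples
```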
Filtering
All data were filtered with a 2nd order two-way low-pass Butterworth filter with a cut-off frequency of 5 Hz.
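A "two-way" (forward-backward, zero-phase) Butterworth filter of this kind can be reproduced, for instance, with SciPy's `filtfilt`; the signal below is synthetic and purely illustrative of the effect:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0  # sampling frequency of the gloves (Hz)
fc = 5.0    # cut-off frequency (Hz)

# 2nd order low-pass Butterworth; Wn is normalized to the Nyquist frequency.
b, a = butter(N=2, Wn=fc / (fs / 2), btype='low')

# Synthetic test signal: a slow 1 Hz motion plus 20 Hz noise.
t = np.arange(0, 2, 1 / fs)
raw = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.sin(2 * np.pi * 20.0 * t)

# filtfilt applies the filter forward and backward ("two-way"),
# which cancels the phase lag a single pass would introduce.
filtered = filtfilt(b, a, raw)
```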
Data Records
Volume of data collected
A total of 3560 elementary tasks were recorded across all the subjects and experiments, with a total recording duration of 7 h, 30 min and 43 s.
Data files
Data are presented as a single Matlab data structure (BE_UJI_DATASET.mat)21, composed of two secondary structures (KINEMATIC_DATA and SUBJECT_DATA). KINEMATIC_DATA contains all the kinematic data recorded, classified by experiment, record, part and subject, while SUBJECT_DATA contains data on the subjects recruited (age, gender, laterality, weight, height, hand length, hand width and measured AROM). This structure is accompanied by a guide (.pdf), which provides detailed information regarding the data series as well as the environment, tasks, objects and data acquisition system.
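Outside Matlab/GNU Octave, the .mat structure can be read with SciPy, for example. The snippet below builds a toy file that mimics the two documented top-level structures; the inner fields (`angles`, `age`) are invented for illustration only — the real hierarchy is described in the guide accompanying the dataset:

```python
import numpy as np
from scipy.io import loadmat, savemat

# Toy stand-in for BE_UJI_DATASET.mat with the two documented top-level
# structures; inner field names are illustrative assumptions.
savemat('toy_dataset.mat', {
    'KINEMATIC_DATA': {'angles': np.zeros((500, 18))},  # 5 s at 100 Hz, 18 DoF
    'SUBJECT_DATA': {'age': 35},
})

# struct_as_record=False exposes Matlab structs as attribute-style objects.
data = loadmat('toy_dataset.mat', squeeze_me=True, struct_as_record=False)
angles = data['KINEMATIC_DATA'].angles   # (500, 18) array
age = int(data['SUBJECT_DATA'].age)      # 35
```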
Sign criteria
The sign criteria used on each joint movement were defined as follows:
PIP(2–5)_F, IP1_F, MCP(1–5)_F, WR_F: Flexion+/Extension−
MCP(2-3, 3-4, 4-5)_A: Fingers separated+/Fingers together−
PalmArch: Flexion+/Extension−
WR_A: Ulnar deviation+/Radial deviation−
CMC1_F: Flexion+/Extension− (See Fig. 4)
CMC1_A: Abduction+/Adduction− (See Fig. 4)
Notice that the movement of the thumb CMC joint is complex, and the nomenclature used in the literature to define these movements varies26,27. We adopted the one used by Brand and Hollister27.
Technical Validation
Data acquisition
Before and after carrying out each experiment, the subjects were asked to perform movements such as closing their hands or just moving them randomly, in order to verify that all the gauges were working, as shown on the virtual model of the CyberGlove software.
Furthermore, all tasks recorded were checked in order to ensure that the number of labels used to divide them into elementary tasks was correct and that no labels were missing.
To attenuate noise and avoid possible unexpected signal values, all data collected were filtered using a 2nd order two-way low-pass Butterworth filter with a cut-off frequency of 5 Hz, as explained in previous sections.
Comparison of active and functional range of motion for each subject and experiment
The percentiles P95 and P5 were calculated for each hand joint, experiment and subject. Then, for each subject and experiment, a subject-specific functional range of motion (FROM) was computed for each hand joint angle as the P5 and P95 percentiles of all of that subject's recordings, therefore representing the angles of 90% of the postures performed by the subject during the experiment. These FROMs were compared with the AROMs measured for each subject. Almost all the FROMs fell inside the AROMs, except in some cases where the extension of the thumb interphalangeal and metacarpophalangeal joints and of the index metacarpophalangeal joint was higher than the AROM measured (the maximum difference reported between FROM and AROM was approximately 25°). This may be attributable to activities that implied a passive extension of these joints while manipulating objects (e.g. cutting with a knife implies a precision grasp with a forced extension of the thumb joints and index interphalangeal joint that is higher than the achievable active extension).
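The subject-specific FROM computation reduces to a percentile calculation per joint angle; a minimal sketch with synthetic flexion angles (the function name and numbers are illustrative, not the original analysis code):

```python
import numpy as np

def functional_rom(angle_series, low=5, high=95):
    """Functional range of motion of one joint: the P5-P95 band of all
    recorded angles, covering 90% of the postures performed."""
    p_low, p_high = np.percentile(angle_series, [low, high])
    return p_low, p_high

# Synthetic flexion angles (deg) standing in for one subject's recordings.
rng = np.random.default_rng(0)
angles = rng.normal(30.0, 10.0, 10_000)
p5, p95 = functional_rom(angles)   # roughly 13.5 and 46.5 deg
```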
Statistical descriptive analysis of all data collected
With all the data collected, box and whisker plots were drawn and general FROMs were calculated. Then, the extreme values of all the subjects' AROMs calculated previously were taken to compute general AROMs. When general FROMs and AROMs were compared, most FROM values fell within the AROM limits, which supports the veracity of the data. Nevertheless, some outliers exceeded those limits (Fig. 5), especially in extension of CMC1, MCP2, PIP2 and PIP3 and in flexion of right PIP2 to PIP5. This can also be attributed to activities that implied a passive flexion/extension of joints during manipulation, as mentioned before. It has to be emphasized that the FROMs of PIP2 to PIP5 exceeded the AROMs only for the right hand, which is the dominant hand of most subjects.
Limitations
The use of instrumented gloves may imply some loss of dexterity during the performance of fine manipulation tasks. Nevertheless, this loss of dexterity may not have a significant effect on the ranges of motion, mean postures or movement synergies.
Usage Notes
These data can be used for several applications, from machine learning purposes to product design. The main strengths of this dataset for these potential uses are the motion capture characteristics (validity of the motion capture system, anatomical joints measured and frequency of acquisition), the structure of the data presented (.mat, which allows easy data handling), the variety of objects used (different shapes and weights) and the wide range of cooking/feeding tasks considered.
It has to be taken into account that real food or drinks were not used to perform the tasks in order to prevent the gloves from getting stained or wet (all products are appropriately tagged with the corresponding substitutive material in the dataset guide file). Therefore, tasks involving these elements were simulated and might be performed in a slightly different way than when performed with real food/drink.
Even though tasks and products of this dataset were selected to be representative of the different cooking and feeding tasks, some specific tasks or objects involving fine motor skills were discarded because of the loss of manipulation dexterity that the use of instrumented gloves implied (e.g. opening the thermally sealed plastic layer of precooked food packaging). Some wrist angles are also missing because of improper fitting of the wrist sensors to some subjects’ hands.
Finally, velocity of performance of the tasks might be slightly affected by the loss of dexterity and touch sensitivity resulting from the use of the instrumented gloves.
Code Availability
The custom Matlab code used to calculate joint angles is freely available on Zenodo28.
References
Santello, M., Flanders, M. & Soechting, J. F. Postural hand synergies for tool use. J. Neurosci. 18, 10105–10115 (1998).
Carpinella, I., Mazzoleni, P., Rabuffetti, M., Thorsen, R. & Ferrarin, M. Experimental protocol for the kinematic analysis of the hand: Definition and repeatability. Gait Posture 23, 445–454 (2006).
Ma, H.-I., Hwang, W.-J., Chen-Sea, M.-J. & Sheu, C.-F. Handle size as a task constraint in spoon-use movement in patients with Parkinson’s disease. Clin. Rehabil. 22, 520–528 (2008).
Grebenstein, M. et al. The hand of the DLR hand arm system: Designed for interaction. Int. J. Rob. Res. 31, 1531–1555 (2012).
Bianchi, M., Bohg, J. & Sun, Y. Latest Datasets and Technologies Presented in the Workshop on Grasping and Manipulation Datasets. Preprint at https://arxiv.org/abs/1609.02531 (2016).
Bullock, I. M., Feix, T. & Dollar, A. M. The Yale human grasping dataset: Grasp, object, and task data in household and machine shop environments. Int. J. Rob. Res. 34, 251–255 (2015).
Saudabayev, A., Rysbek, Z., Khassenova, R. & Varol, H. A. Human grasping database for activities of daily living with depth, color and kinematic data streams. Sci. Data 5, 180101 (2018).
Mandery, C., Terlemez, Ö., Do, M., Vahrenkamp, N. & Asfour, T. The KIT whole-body human motion database. Proc. 17th Int. Conf. Adv. Robot. ICAR 2015 611909, 329–336 (2015).
Katsiaris, P., Artemiadis, P. & Kyriakopoulos, K. Hand Corpus, http://www.handcorpus.org/?p=100 (2010).
Santello, M., Flanders, M. & Soechting, J. F. Hand Corpus, http://www.handcorpus.org/?p=97 (2012).
Gabiccini, M., Stillfried, G., Marino, H. & Bianchi, M. Hand Corpus, http://www.handcorpus.org/?p=1156 (2013).
Deimel, R. & Brock, O. Hand Corpus, http://www.handcorpus.org/?p=1507 (2015).
Della Santina, C. et al. Hand Corpus, http://www.handcorpus.org/?p=1855 (2018).
Atzori, M. et al. Ninaweb, http://ninapro.hevs.ch/ (2014).
Bianchi, M., Salaris, P. & Bicchi, A. Hand Corpus, http://www.handcorpus.org/?p=103 (2011).
Santello, M., Flanders, M. & Soechting, J. F. Hand Corpus, http://www.handcorpus.org/?p=91 (2011).
Stillfried, G. Hand Corpus, http://www.handcorpus.org/?p=1109 (2013).
Gabiccini, M., Stillfried, G., Marino, H. & Bianchi, M. Hand Corpus, http://www.handcorpus.org/?p=1578 (2015).
Gabiccini, M., Stillfried, G., Marino, H. & Bianchi, M. Hand Corpus, http://www.handcorpus.org/?p=1298 (2014).
Gabiccini, M., Stillfried, G., Marino, H. & Bianchi, M. Hand Corpus, http://www.handcorpus.org/?p=1354 (2014).
Roda-Sales, A., Vergara, M., Sancho-Bru, J. L., Gracia-Ibáñez, V. & Jarque-Bou, N. J. KINE-ADL BE-UJI Dataset. Mendeley Data, https://doi.org/10.17632/8mf4y2srgh (2019).
Wu, G. et al. ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion - Part II: Shoulder, elbow, wrist and hand. J. Biomech. 38, 981–992 (2005).
Gracia-Ibáñez, V., Vergara, M., Buffi, J. H., Murray, W. M. & Sancho-Bru, J. L. Across-subject calibration of an instrumented glove to measure hand movement for clinical purposes. Comput. Methods Biomech. Biomed. Engin. 20, 587–597 (2017).
Calli, B., Walsman, A., Singh, A. & Srinivasa, S. S. Benchmarking in Manipulation Research. IEEE Robot. Autom. Mag. 36–52, https://doi.org/10.1109/MRA.2015.2448951 (2015).
Gracia-Ibáñez, V., Vergara, M., Sancho-Bru, J. L., Mora, M. C. & Piqueras, C. Functional range of motion of the hand joints in activities of the International Classification of Functioning, Disability and Health. J. Hand Ther. 30, 337–347 (2017).
Kapandji, A. I. The Physiology of the Joints. Volume I: Upper Limb (1998).
Brand, P. W. & Hollister, A. M. Clinical Mechanics of the Hand. (Mosby Publishing, 1999).
Gracia-Ibáñez, V., Jarque-Bou, N. J., Roda-Sales, A. & Sancho-Bru, J. L. BE-UJI Hand joint angles calculation code. Zenodo. https://doi.org/10.5281/zenodo.3357966 (2019).
Bianchi, M., Salaris, P. & Bicchi, A. Synergy-based hand pose sensing: Reconstruction enhancement. Int. J. Rob. Res. 32, 396–406 (2013).
Stillfried, G. Kinematic modelling of the human hand for robotics. https://elib.dlr.de/100591/1/Stillfried_phd_main_v1_4.pdf (Technischen Universität München, 2015).
Gabiccini, M., Stillfried, G., Marino, H. & Bianchi, M. A data-driven kinematic model of the human hand with soft-tissue artifact compensation mechanism for grasp synergy analysis. IEEE Int. Conf. Intell. Robot. Syst. 3738–3745, https://doi.org/10.1109/IROS.2013.6696890 (2013).
Deimel, R. & Brock, O. A novel type of compliant and underactuated robotic hand for dexterous grasping. Int. J. Rob. Res. 35, 161–185 (2016).
Liu, M. J., Xiong, C. H., Xiong, L. & Huang, X. L. Biomechanical characteristics of hand coordination in grasping activities of daily living. PLoS One 11, 1–16 (2016).
Liu, M. J., Xiong, C. H., Xiong, L. & Huang, X. L. Hand Corpus, http://www.handcorpus.org/?p=1596 (2016).
Puhlmann, S., Heinemann, F., Brock, O. & Maertens, M. A Compact Representation of Human Single-Object Grasping. In 2016 IEEE Int. Conf. Intell. Robot. Syst. 1954–1959, https://doi.org/10.1109/IROS.2016.7759308 (2016).
Puhlmann, S., Heinemann, F., Brock, O. & Maertens, M. Hand Corpus, http://www.handcorpus.org/?p=1830 (2016).
Della Santina, C. et al. Postural hand synergies during environmental constraint exploitation. Front. Neurorobot. 11, 1–14 (2017).
Atzori, M. et al. Electromyography data for non-invasive naturally-controlled robotic hand prostheses. Sci. Data 1, 1–13 (2014).
Acknowledgements
The authors would like to thank all subjects for their participation and Sonia Buralla for her support during the data collecting process and preparation of the material. This work was supported by projects MINECO DPI2014-52095-P and UJI grant PREDOC/2016/08.
Author information
Contributions
Alba Roda-Sales created the experimental protocol, acquired data, performed data analysis and wrote the paper. Margarita Vergara conceived and planned the project, conducted and supervised the experiment and reviewed the paper. Joaquín L. Sancho-Bru conceived and planned the project, conducted and supervised the experiment and reviewed the paper. Verónica Gracia-Ibáñez developed the protocol to calibrate the instrumented gloves, developed the Matlab code to calculate joint angles and participated in the creation of the experimental protocol. Néstor Jarque-Bou recruited subjects, helped with the development of the Matlab code to calculate joint angles and revised the data structure.
Ethics declarations
Competing Interests
The authors declare no competing interests.
Additional information
Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
ISA-Tab metadata file
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
The Creative Commons Public Domain Dedication waiver http://creativecommons.org/publicdomain/zero/1.0/ applies to the metadata files associated with this article.
About this article
Cite this article
Roda-Sales, A., Vergara, M., Sancho-Bru, J.L. et al. Human hand kinematic data during feeding and cooking tasks. Sci Data 6, 167 (2019). https://doi.org/10.1038/s41597-019-0175-6