Background & Summary

The hand is a complex system with many degrees of freedom (DoF) that enables humans to perform the large variety of grasping and manipulation actions required in activities of daily living (ADL) with a wide range of objects. Hand kinematics is studied for purposes such as characterizing healthy hand movement patterns1, assessing patients’ abilities2 or evaluating the effect of object design on grasping3. Furthermore, with the rise of robotics and prosthetics, it has become crucial for the development of anthropomorphic systems4. For these purposes, and because of the versatility of the hand, a large amount of kinematic data (covering all hand DoF) is needed to capture the interaction with the different objects used in different environments. Continuous recording of kinematics is essential to characterize the ranges of motion and velocities required for the different phases of reaching, grasping, manipulating and releasing. Moreover, data presented as anatomical angles are more meaningful and facilitate the comparison of data from different experiments, independently of the motion capture system used. In this regard, several researchers5 have pointed out the importance of high-quality open-access datasets of grasping data, while also highlighting the need to compile, classify and standardize these data.

The Hand Corpus open repository (http://www.handcorpus.org) was created to address these goals, as it allows scientists to share grasping and manipulation data collected with different motion capture technologies. Nevertheless, the datasets in this repository, as well as other datasets in the literature, present some weaknesses regarding their usability for machine learning, hand kinematics characterization or clinical evaluation. Some datasets are limited in the amount of data presented, are restricted to grasp type classification6,7 or consider hand kinematics from just three markers on the hand8. Furthermore, datasets covering several DoF present other limitations:

  • Tasks: Only reaching and grasping movements9,10,11,12,13,14, static grasp postures9,10,12,14,15,16,17 or exploratory/haptic tasks during product manipulation18 were recorded. These tasks lack representativeness of ADL, not only because of the limited range of activities considered but also because subjects performed them following precise instructions.

  • Objects used: Some of the datasets recorded tasks that simulated the use of objects without any actual object being handled11,15,16,17,19,20.

  • Type of data presented: Some datasets only provide raw data from the motion capture system (cameras or gloves)12,19,20 instead of offering anatomical angles.

  • Number of subjects: Some of these datasets provide data from only one subject9,11,15,16,17,19,20.

  • Number of hands studied: All the datasets cited studied only the subjects’ dominant hand.

Table 1 shows an overview of different datasets focused on hand kinematics and their characteristics.

Table 1 Main characteristics of datasets focused on hand kinematics.

In this paper we present the KINE-ADL BE-UJI Dataset21, which contains a total of 1160 recordings with anatomical angles of both hands during feeding and cooking activities performed with a large variety of products. Experiments were performed by 20 healthy subjects wearing CyberGlove instrumented gloves on both hands, with 18 DoF recorded for each hand at a frequency of 100 Hz. The main contributions of this dataset compared to others are the variety of objects used (66 objects), the in-depth study of representative feeding and cooking tasks (58 tasks, divided into 178 actions) and the freedom given to the subjects to perform the tasks. Moreover, the data were collected from both hands, which allows the study of hand coordination. Importantly, the sample of subjects was selected so as to be representative of the healthy adult population (with a controlled proportion of laterality and gender). Furthermore, the data are standardized, being expressed as anatomical angles following the ISB sign criteria22. The dataset consists of a Matlab/GNU Octave data structure (.mat) (also provided in .csv format) with kinematic data and data about the subjects recruited (age, gender, laterality, weight, height, hand length, hand width and active range of motion (AROM) measured for each DoF). This .mat file is accompanied by a guide detailing the environment, tasks, objects, data acquisition system and file structure, thereby allowing the data to be classified according to these parameters.

Methods

Study participants

The study consisted of two experiments (A and B), with 20 subjects (10 males, 10 females) participating in each experiment. Fifteen subjects participated in both experiments, so the total number of subjects recruited was 25. In both experiments, two of the 20 subjects were left-handed. The mean age of the subjects was 35.5 ± 7.67 years in experiment A and 38.05 ± 9.52 years in experiment B. The criteria used to select subjects were gender parity in the overall data, age between 20 and 65, no reported upper limb pathologies and laterality representative of the overall population (20% of data from left-handed individuals). Before the experiments, all participants gave their written informed consent. All the experiments were performed in accordance with the guidelines of the Ethics Committee of the Universitat Jaume I.

Acquisition setup

Instrumentation

Data acquisition was performed using two CyberGlove (CyberGlove Systems LLC) instrumented gloves (CyberGlove II on the right hand and CyberGlove III on the left hand) connected to a laptop. Each glove has 18 strain gauges that allow the anatomical angles of the underlying joints to be determined. The angle rotated by each joint with respect to the reference posture (hands resting flat on a table, with the fingers and thumb close together and the middle fingers aligned with the forearms) is calculated from these signals according to a previously validated calibration protocol23. Furthermore, all the experiments were recorded on video so that the performance of each task could be checked later when required.

Environment

The tasks were performed in a laboratory, within an environment that simulated a kitchen (Fig. 1), composed of: a refrigerator (Scenario 1), a high cabinet (Scenario 2), shelves (Scenario 3), a small worktop (Scenario 4), a sink and a rubbish bin (Scenario 5), a large worktop (Scenario 6), a low cabinet with a drawer in its upper part and, behind a door, shelves in its lower part (Scenario 7), a table and a chair (Scenario 8) and an oven (Scenario 9).

Fig. 1
figure 1

Different scenarios of the experiment. Scenarios: refrigerator (1), high cabinet (2), shelves (3), small worktop (4), sink and rubbish bin (5), large worktop (6), low cabinet with a drawer in its upper part and shelves behind a door in its lower part (7), table and chair (8) and oven (9).

Objects

A total of 66 objects were used to perform the tasks in the experiments (further information regarding their characteristics can be found in the guide attached to the dataset). The objects were chosen so as to be representative of those most commonly used in cooking and feeding tasks, and were checked to ensure they covered the cooking and feeding objects from the Yale-CMU-Berkeley Object and Model Set24, proposed by Calli et al. Some of the objects used were not real, in order to prevent the gloves from getting stained or wet. For example, the eggs to be broken had been previously emptied through a small hole made in the shell. All liquids were replaced by water, and materials such as flour or sugar that could have stained the gloves were replaced by durum wheat semolina. Pieces of polystyrene or cardboard were used to simulate biscuits, bread or crisps. The initial location of the objects in each scenario can be found in the detailed guide attached to the database. Figure 2 shows an overview of the objects used.

Fig. 2
figure 2

Overview of the objects used during the experiments. Objects labelled as in the guide attached to the dataset.

Acquisition protocol

The main dimensions of the hands were measured before helping the subjects to put on the instrumented gloves, following the manufacturer’s instructions. Participants were given clear instructions on how to perform each task, and they were told to start and end each task in the same posture: hands hanging relaxed at both sides of the body for tasks performed standing, and hands resting relaxed on the table for tasks performed sitting. While each task was being carried out, the operator labelled the time stamps of specific events (using the glove software), which were later used to separate the different phases or actions.

Recorded tasks

Two experiments (A and B) were performed. In experiment A, the activities recorded were preparing and having breakfast, baking a cake and cooking omelets. In experiment B, the activities were setting the table, clearing the table and washing the dishes, making coffee and preparing a simple meal. In both experiments the whole process of performing each activity was considered (taking the products from the different scenarios, transporting them, opening/using them and, in some cases, putting them back in their place). Each activity was separated into different recordings (e.g. using the toaster or pouring and drinking milk), and these recordings were in turn separated into different elementary tasks (e.g. grasping objects, manipulation such as opening tins/jars/bottles, transporting objects, pouring liquid/solid substances, eating/drinking and other relevant actions). Experiment A was thus divided into 33 recordings and experiment B into 25 (a description of all the recordings is given in Tables 2 and 3). Further information regarding the elementary tasks considered in each recording can be found in the guide attached to the dataset.

Table 2 Recordings in experiment A, where R is the ID number of the recording (IDs from 100 onwards belong to experiment A) and S indicates whether the activity was performed sitting (x) or not.
Table 3 Recordings in experiment B, where R is the ID number of the recording (IDs from 200 onwards belong to experiment B) and S indicates whether the activity was performed sitting (x) or not.

Some of the recordings were performed with the subject standing and others sitting on a chair (as specified in Tables 2 and 3). Only the eating and drinking activities were simulated, by just bringing the food close to the mouth; this is indicated in the task description. The rest of the tasks were performed with realistic objects, and subjects were free to perform the tasks in the way they preferred.

Elementary tasks

As mentioned previously, each of the recordings (R) (33 in experiment A and 25 in B) is composed of different elementary tasks. For example, in the activity of having breakfast (consisting of 11 recordings, as seen in Table 2), recording R = 106 (pouring and drinking milk) is composed of four elementary tasks: opening the carton, pouring, closing the carton and drinking (see Table 4). For unambiguous identification, a unique ID was assigned to each elementary task, with a total of 99 elementary tasks in experiment A and 79 in B (178 altogether). All the elementary tasks involved grasping or manipulating a product or element with the hands, except for some cases in which the subject moved without handling anything; these were labelled as “Displacement without manipulation”. Each elementary task record covers all time instants from the moment the object was grasped until it was released. In those cases in which the object was released in a specific place or transported to a specific location in the scenario, this location is specified in the description of the elementary task. In all other cases, the release was performed on the surface closest to the subject (table, worktop, etc.).

Table 4 Elementary tasks into which recording R = 106 is divided. Columns contain R (ID of the recording), ID (ID of the task), OBJ (IDs of the objects used during the task), SCEN (ID of the scenario where the task is performed) and S (marked with an “x” when the task was performed sitting).

Active range of motion (AROM)

After performing all the experiments, subjects were asked to adopt a set of postures25 in order to measure the AROM of the joints of both hands; these values are provided in the .mat file together with the rest of the subject information.

Signal processing

Angles calculation

Joint angles were calculated from the raw data collected, according to the calibration protocol proposed in previous work23. This protocol includes the determination of gains, as well as corrections for cross-coupling effects on specific anatomical angles. The anatomical angles obtained according to the protocol are those shown in Fig. 3.
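The exact calibration equations are given in ref. 23; as a rough illustration of this kind of gain-based mapping with a cross-coupling correction, a minimal Python sketch follows (the linear form, parameter names and additive correction are assumptions for illustration, not the published protocol):

```python
import numpy as np

def sensor_to_angle(raw, gain, offset, coupling=None, coupled_angles=None):
    """Map one raw CyberGlove sensor reading to an anatomical angle (deg).

    Assumed linear model: angle = gain * (raw - offset), optionally
    corrected by subtracting a linear combination of the angles that
    mechanically couple into this sensor. Gains, offsets and coupling
    coefficients would come from the calibration protocol (ref. 23);
    nothing here reproduces the published values.
    """
    angle = gain * (raw - offset)
    if coupling is not None and coupled_angles is not None:
        angle -= float(np.dot(coupling, coupled_angles))
    return angle
```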

Fig. 3
figure 3

List of recorded anatomical angles. Nomenclature: _F for flexion (in blue), _A for abduction (in yellow); 1 to 5, digits. Joints: IP for interphalangeal joint, PIP for proximal interphalangeal joints, MCP for metacarpophalangeal joints, CMC for carpometacarpal joints, PalmArch for palmar arch resulting from flexion/extension of carpometacarpal joints of ring and little fingers, WR for wrist.

Data cutting and splitting

The initial and final instants of each record, during which the hands were static, were trimmed. The records were then separated into the different elementary tasks, as detailed in the dataset guide, using the labels created by the operator while recording the data. In the few cases in which labels were missing, labelling was performed using the video recordings.
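The trimming criterion is not specified beyond the hands being static; a minimal sketch, assuming a simple joint-velocity threshold (both the criterion and the threshold value are illustrative):

```python
import numpy as np

def trim_static_ends(angles, fs=100.0, vel_thresh=1.0):
    """Remove leading/trailing static samples from an (n_samples, n_joints)
    array of joint angles. A sample counts as 'moving' when any joint
    velocity (deg/s, by finite differences) exceeds vel_thresh; both the
    criterion and the threshold are assumptions for illustration."""
    vel = np.abs(np.diff(angles, axis=0)) * fs            # deg/s per joint
    moving = np.flatnonzero(np.any(vel > vel_thresh, axis=1))
    if moving.size == 0:                                  # fully static record
        return angles
    return angles[moving[0]: moving[-1] + 2]              # keep the moving span
```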

Filtering

All data were filtered with a 2nd-order two-way low-pass Butterworth filter with a cut-off frequency of 5 Hz.
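In Python terms, this filtering step corresponds to the following sketch (using SciPy; the original processing was presumably done in Matlab, and “two-way” is taken to mean zero-phase forward-backward filtering, as implemented by filtfilt):

```python
from scipy.signal import butter, filtfilt

fs = 100.0   # sampling frequency (Hz), as stated above
fc = 5.0     # cut-off frequency (Hz)

# 2nd-order low-pass Butterworth; filtfilt runs it forward and backward,
# which removes the phase lag (the "two-way" filtering described above)
b, a = butter(2, fc / (fs / 2), btype='low')

def filter_record(angles):
    """Filter an (n_samples, n_joints) array of joint angles column-wise."""
    return filtfilt(b, a, angles, axis=0)
```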

Data Records

Volume of data collected

A total of 3560 elementary tasks were recorded across all the subjects and experiments (178 elementary tasks × 20 subjects), with a total recording duration of 7 h, 30 min and 43 s.

Data files

Data are presented as a single Matlab data structure (BE_UJI_DATASET.mat)21, which is composed of two secondary structures (KINEMATIC_DATA and SUBJECT_DATA). KINEMATIC_DATA contains all the kinematic data recorded, classified by experiment, record, part and subject, while SUBJECT_DATA contains data on the subjects recruited (age, gender, laterality, weight, height, hand length, hand width and measured AROM). This structure is accompanied by a guide (.pdf), which provides detailed information regarding the data series as well as the environment, tasks, objects and data acquisition system.
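For users working outside Matlab/GNU Octave, the .mat file can be read with SciPy; a minimal loading sketch follows (the two structure names are those given above, but whether they appear as top-level variables or as fields of a single parent struct, as well as the deeper field names, should be checked against the dataset guide):

```python
from scipy.io import loadmat

# struct_as_record=False / squeeze_me=True expose Matlab structs as
# attribute-style objects, which is convenient for nested data
mat = loadmat('BE_UJI_DATASET.mat', struct_as_record=False, squeeze_me=True)

# List the variables stored in the file (ignoring loadmat metadata)
print([k for k in mat if not k.startswith('__')])

kin = mat['KINEMATIC_DATA']    # kinematic recordings, classified by
                               # experiment, record, part and subject
subj = mat['SUBJECT_DATA']     # demographics, anthropometry and AROM

# Inspect the available fields before drilling down
print([f for f in dir(subj) if not f.startswith('_')])
```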

Sign criteria

The sign criteria used on each joint movement were defined as follows:

PIP(2–5)_F, IP1_F, MCP(1–5)_F, WR_F: Flexion+/Extension−

MCP(2–3, 3–4, 4–5)_A: Fingers separated+/Fingers together−

PalmArch: Flexion+/Extension−

WR_A: Ulnar deviation+/Radial deviation−

CMC1_F: Flexion+/Extension− (See Fig. 4)

Fig. 4
figure 4

Movements of the carpometacarpal joint.

CMC1_A: Abduction+/Adduction− (See Fig. 4)

Note that the movement of the thumb CMC joint is complex, and the nomenclature used in the literature to define these movements varies26,27. We adopted the nomenclature used by Brand and Hollister27.

Technical Validation

Data acquisition

Before and after carrying out each experiment, the subjects were asked to perform movements such as closing their hands or simply moving them randomly, in order to verify on the virtual hand model of the CyberGlove software that all the gauges were working.

Furthermore, all tasks recorded were checked in order to ensure that the number of labels used to divide them into elementary tasks was correct and that no labels were missing.

To mitigate possible unexpected signal values, all the data collected were filtered using a 2nd-order two-way low-pass Butterworth filter with a cut-off frequency of 5 Hz, as explained in the previous sections.

Comparison of active and functional range of motion for each subject and experiment

For each subject and experiment, a subject-specific functional range of motion (FROM) was computed for each hand joint angle as the P5 and P95 percentiles of all of his/her recordings, therefore representing the angles of 90% of the postures performed by the subject during the experiment. These FROMs were compared with the AROMs measured for each subject. Almost all the FROMs were inside the AROMs, except in some cases where the extension of the thumb interphalangeal and metacarpophalangeal joints and the extension of the index metacarpophalangeal joint were higher than the AROM measured (the maximum difference between FROM and AROM was approximately 25°). This may be attributable to activities that implied a passive extension of these joints while manipulating objects (e.g. cutting with a knife involves a precision grasp with a forced extension of the thumb joints and index interphalangeal joint beyond the achievable active extension).
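In code, the subject-specific FROM computation reduces to a percentile call over the pooled samples (a sketch; the function and variable names are illustrative):

```python
import numpy as np

def functional_rom(recordings):
    """Subject-specific FROM for one joint angle in one experiment:
    the P5 and P95 percentiles of all samples pooled over the subject's
    recordings, covering 90% of the observed postures.

    recordings: iterable of 1-D arrays of angle samples (deg).
    """
    pooled = np.concatenate([np.asarray(r) for r in recordings])
    p5, p95 = np.percentile(pooled, [5, 95])
    return p5, p95
```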

Statistical descriptive analysis of all data collected

With all the data collected, box and whisker plots were drawn and general FROMs were calculated. The extreme values of the individual subjects’ AROMs were then taken as the general AROMs. When general FROMs and AROMs were compared, most FROM values lay between the AROM limits, which supports the validity of the data. Nevertheless, some outliers exceeded those values (Fig. 5), especially in extension of CMC1, MCP2, PIP2 and PIP3 and in flexion of right PIP2 to PIP5. As mentioned before, this can also be attributed to activities that implied a passive flexion/extension of the joints during manipulation. It should be emphasized that the FROMs of PIP2 to PIP5 exceeded the AROMs only for the right hand, which is the dominant hand of most subjects.
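A comparison plot of this kind could be reproduced along the following lines (a matplotlib sketch; the data containers and AROM limits are hypothetical placeholders, not part of the dataset API):

```python
import matplotlib.pyplot as plt

def plot_from_vs_arom(samples_by_joint, arom_by_joint):
    """Box-and-whisker plot of pooled joint angles per joint, with the
    general AROM limits overlaid as green lines (cf. Fig. 5).

    samples_by_joint: dict {joint label: 1-D array of angle samples (deg)}
    arom_by_joint:    dict {joint label: (min, max) general AROM (deg)}
    """
    labels = list(samples_by_joint)
    positions = list(range(1, len(labels) + 1))
    fig, ax = plt.subplots(figsize=(12, 4))
    ax.boxplot([samples_by_joint[j] for j in labels], positions=positions)
    for x, joint in zip(positions, labels):
        lo, hi = arom_by_joint[joint]
        ax.hlines([lo, hi], x - 0.3, x + 0.3, colors='green')
    ax.set_xticks(positions)
    ax.set_xticklabels(labels, rotation=90)
    ax.set_ylabel('angle (deg)')
    fig.tight_layout()
    plt.show()
```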

Fig. 5
figure 5

Comparison of maximum AROMs and FROMs. Box and whisker plots show the general FROMs; general AROMs are marked with green lines. Unmarked AROMs were not measured. Joints and movements are labelled as explained in Fig. 3.

Limitations

The use of instrumented gloves may imply some loss of dexterity during the performance of fine manipulation tasks. Nevertheless, this loss of dexterity may not have a significant effect on the ranges of motion, mean postures or movement synergies.

Usage Notes

These data can be used for several applications, from machine learning purposes to product design. The main strengths of this dataset for these potential uses are the motion capture characteristics (validity of the motion capture system, anatomical joints measured and frequency of acquisition), the structure of the data presented (.mat, which allows easy data handling), the variety of objects used (different shapes and weights) and the wide range of cooking/feeding tasks considered.

It has to be taken into account that real food and drinks were not used to perform the tasks, in order to prevent the gloves from getting stained or wet (each substituted product is appropriately tagged with the corresponding substitute material in the dataset guide). Therefore, tasks involving these elements were simulated and might have been performed in a slightly different way than with real food or drink.

Even though the tasks and products in this dataset were selected to be representative of different cooking and feeding tasks, some specific tasks or objects involving fine motor skills were discarded because of the loss of manipulation dexterity implied by the use of instrumented gloves (e.g. opening the thermally sealed plastic layer of precooked food packaging). Some wrist angles are also missing because of improper fitting of the wrist sensors to some subjects’ hands.

Finally, the velocity of task performance might be slightly affected by the loss of dexterity and touch sensitivity resulting from the use of the instrumented gloves.