Abstract

We present a multi-sensor dataset of bimanual human-to-human object handovers. The dataset consists of 240 recordings obtained from 12 pairs of participants performing bimanual object handovers with 10 objects, and 120 recordings obtained from the same 12 pairs performing unimanual handovers with 5 of those objects. Each recording includes the position and orientation trajectories of 13 upper-body bones for both the giver and the receiver, the position trajectories of 27 markers placed on their upper bodies, the position and orientation trajectories of the object, and two RGB-D data streams. The motion trajectories are recorded at 120 Hz and the RGB-D streams at 30 Hz. The recordings are annotated with the three handover phases: reach, transfer, and retreat. The dataset also includes four anthropometric measurements of each participant: height, waistline height, arm span, and weight. Our dataset can support investigations of the bimanual reaching motions and grasps that humans use while performing handovers, and can be used to train robots to perform bimanual object handovers with humans.