This dataset contains 70 sequences (30 falls + 40 activities of daily living, ADL). Fall events are recorded with two Microsoft Kinect cameras and corresponding accelerometric data. ADL events are recorded with only one device (camera 0) and an accelerometer. Sensor data was collected using PS Move (60 Hz) and x-IMU (256 Hz) devices.
The dataset is organized as follows. Each row contains a sequence of depth and RGB images for camera 0 and camera 1 (parallel to the floor and ceiling-mounted, respectively), synchronization data, and raw accelerometer data.
Each video stream is stored in a separate ZIP archive as a PNG image sequence. Depth data is stored in PNG16 format and should be rescaled as d(x, y) = s_i · p(x, y), where d(x, y) is the depth in millimeters, s_i is the scale ratio for the i-th camera, and p(x, y) is the pixel value at position (x, y) of the PNG16 image. Fall sequences: s_0 = … and s_1 = …. ADLs: s_0 = ….
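A minimal sketch of the depth rescaling described above, in Python. The file path and the scale ratio are placeholders (the actual s_i values are those published with the dataset for each camera), and reading the PNG16 frame with OpenCV is an assumption of convenience, not part of the dataset itself.

```python
import cv2
import numpy as np

def load_depth_mm(png_path: str, scale_ratio: float) -> np.ndarray:
    """Read a PNG16 depth frame and convert pixel values to millimeters."""
    # IMREAD_UNCHANGED preserves the 16-bit values stored in the PNG.
    raw = cv2.imread(png_path, cv2.IMREAD_UNCHANGED)
    if raw is None:
        raise FileNotFoundError(png_path)
    # d(x, y) = s_i * p(x, y)
    return raw.astype(np.float32) * scale_ratio

# Example (hypothetical path; s_0 is the scale ratio for camera 0):
# depth_mm = load_depth_mm("fall-01-cam0-d/fall-01-cam0-d-001.png", s_0)
```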
Synchronization data contains: the frame number, the time in milliseconds since the sequence start, and interpolated accelerometric data corresponding to the image frame. Note that the cameras are recorded independently, so they are not strictly synchronized (synchronization is based on the nearest timestamp value).
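The sketch below illustrates nearest-timestamp matching between the two independently recorded camera streams. It assumes each synchronization file is a CSV whose first two columns are the frame number and the time in milliseconds since the sequence start; the file names and column order are assumptions and should be checked against the dataset files.

```python
import csv

def read_sync(path: str):
    """Return a list of (frame, time_ms) tuples from a synchronization file."""
    rows = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            rows.append((int(row[0]), float(row[1])))
    return rows

def nearest_frame(sync_rows, t_ms: float) -> int:
    """Frame whose timestamp is closest to t_ms (nearest-timestamp matching)."""
    return min(sync_rows, key=lambda r: abs(r[1] - t_ms))[0]

# Example: find the camera-1 frame closest in time to camera-0 frame i.
# cam0 = read_sync("fall-01-cam0-sync.csv")   # hypothetical file names
# cam1 = read_sync("fall-01-cam1-sync.csv")
# frame1 = nearest_frame(cam1, cam0[i][1])
```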
Raw accelerometric data contains the time in milliseconds since the sequence start and the accelerometer readings A_x, A_y, A_z. All accelerometer data are in gravity units (g). The total sum vector is calculated as follows: SV_total = sqrt(A_x^2 + A_y^2 + A_z^2).
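A short sketch of the total sum vector computation, assuming the raw accelerometer file is a CSV with columns (time_ms, A_x, A_y, A_z) in gravity units (g); the column order is an assumption and should be verified against the files in the dataset.

```python
import csv
import math

def sv_total(ax: float, ay: float, az: float) -> float:
    """SV_total = sqrt(A_x^2 + A_y^2 + A_z^2), in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def read_raw_acc(path: str):
    """Yield (time_ms, SV_total) for every sample in a raw accelerometer file."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            t, ax, ay, az = (float(v) for v in row[:4])
            yield t, sv_total(ax, ay, az)
```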
If you use the dataset, please cite the following work:
Bogdan Kwolek, Michal Kepski, Human fall detection on embedded platform using depth maps and wireless accelerometer, Computer Methods and Programs in Biomedicine, Volume 117, Issue 3, December 2014, Pages 489-501, ISSN 0169-2607
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License and is intended for non-commercial academic use. If you are interested in using the dataset for commercial purposes, please contact us.