Tayyub, J., Hawasly, M., Hogg, D. C. orcid.org/0000-0002-6125-9564 et al. (1 more author) (2017) CLAD: A Complex and Long Activities Dataset with Rich Crowdsourced Annotations. [Preprint - arXiv]
Abstract
This paper introduces a novel activity dataset which exhibits real-life and diverse scenarios of complex, temporally-extended human activities and actions. The dataset presents a set of videos of actors performing everyday activities in a natural and unscripted manner. The dataset was recorded using a static Kinect 2 sensor, which is commonly used on many robotic platforms. The dataset comprises RGB-D images, point cloud data, and automatically generated skeleton tracks, in addition to crowdsourced annotations. Furthermore, we also describe the methodology used to acquire annotations through crowdsourcing. Finally, some activity recognition benchmarks are presented using current state-of-the-art techniques. We believe that this dataset is particularly suitable as a testbed for activity recognition research, but it is also applicable to other common tasks in robotics/computer vision research such as object detection and human skeleton tracking.
Metadata
Item Type: | Preprint |
---|---|
Authors/Creators: | Tayyub, J., Hawasly, M., Hogg, D. C. et al. (1 more author) |
Keywords: | Activity Dataset, Crowdsourcing |
Dates: | 2017 |
Institution: | The University of Leeds |
Academic Units: | The University of Leeds > Faculty of Engineering & Physical Sciences (Leeds) > School of Computing (Leeds) |
Depositing User: | Symplectic Publications |
Date Deposited: | 21 Nov 2024 16:04 |
Last Modified: | 21 Nov 2024 16:04 |
Identification Number: | 10.48550/arXiv.1709.03456 |
Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:124339 |