Gesture imitation and recognition using Kinect sensor and extreme learning machines
dc.contributor.author | Yavsan, Emrehan | |
dc.contributor.author | Ucar, Aysegul | |
dc.date.accessioned | 2024-02-23T14:13:10Z | |
dc.date.available | 2024-02-23T14:13:10Z | |
dc.date.issued | 2016 | |
dc.department | NEÜ | en_US |
dc.description.abstract | This study presents a framework that recognizes and imitates human upper-body motions in real time. The framework consists of two parts. In the first part, a transformation algorithm converts 3D human motion data captured by a Kinect sensor into the robot's joint angles, and the NAO humanoid robot successfully imitates the human upper-body motions in real time. In the second part, a human action recognition algorithm is implemented for upper-body gestures. A human action dataset of upper-body movements is also created: twenty-four users perform each action ten times, and the collected joint angles are divided into six action classes. Extreme Learning Machines (ELMs) are used to classify the human actions, and Feed-Forward Neural Network (FNN) and K-Nearest Neighbor (K-NN) classifiers are used for comparison. According to the comparative results, ELMs achieve good human action recognition performance. (C) 2016 Elsevier Ltd. All rights reserved. | en_US |
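The abstract describes classifying joint-angle data with Extreme Learning Machines. Below is a minimal illustrative sketch of a single-hidden-layer ELM classifier, not the authors' implementation: the feature layout, number of hidden neurons, and sigmoid activation are assumptions, since the record does not specify the paper's exact configuration. The defining trait of an ELM is shown: random, untrained input weights and a closed-form solution for the output weights via the Moore-Penrose pseudoinverse.

```python
import numpy as np


class ELMClassifier:
    """Minimal single-hidden-layer Extreme Learning Machine (illustrative sketch).

    Input weights are drawn randomly and never trained; only the output
    weights are computed, in closed form, with a pseudoinverse.
    """

    def __init__(self, n_hidden=100, rng_seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(rng_seed)

    def _hidden(self, X):
        # Random projection followed by a sigmoid activation.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        # X: (n_samples, n_features) joint-angle feature vectors (hypothetical layout)
        # y: (n_samples,) integer action labels, e.g. 0..5 for six action classes
        n_features = X.shape[1]
        n_classes = int(y.max()) + 1
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        T = np.eye(n_classes)[y]            # one-hot target matrix
        self.beta = np.linalg.pinv(H) @ T   # closed-form output weights
        return self

    def predict(self, X):
        # Class with the largest output activation wins.
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

A call such as `ELMClassifier(n_hidden=200, rng_seed=1).fit(X_train, y_train).predict(X_test)` would classify held-out joint-angle samples; because only the output weights are solved for, training reduces to one matrix factorization, which is why ELMs are attractive for real-time gesture recognition pipelines like the one described.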
dc.description.sponsorship | Firat University Scientific Research Projects Foundation [MF.12.33] | en_US |
dc.description.sponsorship | This paper was supported by the Firat University Scientific Research Projects Foundation (no. MF.12.33). | en_US |
dc.identifier.doi | 10.1016/j.measurement.2016.09.026 | |
dc.identifier.endpage | 861 | en_US |
dc.identifier.issn | 0263-2241 | |
dc.identifier.issn | 1873-412X | |
dc.identifier.scopus | 2-s2.0-84988699181 | en_US |
dc.identifier.scopusquality | Q1 | en_US |
dc.identifier.startpage | 852 | en_US |
dc.identifier.uri | https://doi.org/10.1016/j.measurement.2016.09.026 | |
dc.identifier.uri | https://hdl.handle.net/20.500.12452/12334 | |
dc.identifier.volume | 94 | en_US |
dc.identifier.wos | WOS:000390512100092 | en_US |
dc.identifier.wosquality | Q1 | en_US |
dc.indekslendigikaynak | Web of Science | en_US |
dc.indekslendigikaynak | Scopus | en_US |
dc.language.iso | en | en_US |
dc.publisher | Elsevier Sci Ltd | en_US |
dc.relation.ispartof | Measurement | en_US |
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.subject | Human Action Recognition | en_US |
dc.subject | Nao Humanoid Robot | en_US |
dc.subject | Xbox 360 Kinect | en_US |
dc.subject | Extreme Learning Machines | en_US |
dc.title | Gesture imitation and recognition using Kinect sensor and extreme learning machines | en_US |
dc.type | Article | en_US |