Gesture imitation and recognition using Kinect sensor and extreme learning machines


Date

2016

Journal Title

Journal ISSN

Volume Title

Publisher

Elsevier Sci Ltd

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

This study presents a framework that recognizes and imitates human upper-body motions in real time. The framework consists of two parts. In the first part, a transformation algorithm converts 3D human motion data captured by a Kinect sensor into the robot's joint angles, and the NAO humanoid robot successfully imitates the human upper-body motions in real time. In the second part, a human action recognition algorithm is implemented for upper-body gestures. A human action dataset is also created for the upper-body movements: each action is performed 10 times by twenty-four users, and the collected joint angles are divided into six action classes. Extreme Learning Machines (ELMs) are used to classify the human actions, with Feed-Forward Neural Network (FNN) and K-Nearest Neighbor (K-NN) classifiers used for comparison. According to the comparative results, ELMs deliver good human action recognition performance. (C) 2016 Elsevier Ltd. All rights reserved.
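The abstract names ELMs as the classifier: hidden-layer weights are drawn at random and only the output weights are solved for analytically via the Moore-Penrose pseudoinverse. A minimal sketch of that training rule, on hypothetical toy data standing in for the paper's joint-angle features (the actual 24-user, six-class dataset is not reproduced here), might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, y_onehot, n_hidden=50):
    """ELM training: random fixed input weights, least-squares output weights."""
    n_features = X.shape[1]
    W = rng.normal(size=(n_features, n_hidden))  # random input weights (never trained)
    b = rng.normal(size=n_hidden)                # random hidden biases (never trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))       # sigmoid hidden-layer activations
    beta = np.linalg.pinv(H) @ y_onehot          # output weights via pseudoinverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)           # class with the largest output

# Toy example: two well-separated Gaussian blobs as stand-ins for two
# action classes (hypothetical data, not the paper's Kinect recordings).
X0 = rng.normal(loc=-2.0, size=(50, 4))
X1 = rng.normal(loc=+2.0, size=(50, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)
Y = np.eye(2)[y]                                 # one-hot targets

W, b, beta = train_elm(X, Y, n_hidden=30)
pred = predict_elm(X, W, b, beta)
accuracy = (pred == y).mean()
```

Because only `beta` is fitted, and by a closed-form least-squares solve, training is fast compared with the iteratively trained FNN baseline the paper compares against.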

Description

Keywords

Human Action Recognition, NAO Humanoid Robot, Xbox 360 Kinect, Extreme Learning Machines

Source

Measurement

WoS Q Value

Q1

Scopus Q Value

Q1

Volume

94

Issue

Citation