Article Users Activity Gesture Recognition on Kinect Sensor Using Convolutional Neural Networks and FastDTW for Controlling Movements of a Mobile Robot

  • Miguel Pfitscher
  • Daniel Welfer
  • Evaristo José do Nascimento
  • Marco Antonio de Souza Leite Cuadros
  • Daniel Fernando Tello Gamarra, Universidade Federal Santa Maria
Keywords: Human gestures recognition, convolutional neural networks, Microsoft Kinect, MSRC-12 dataset, Mobile robot.

Abstract

In this paper, we use data from the Microsoft Kinect sensor, which processes the captured image
of a person and extracts the joint information in every frame. We then propose the creation of an
image derived from all the sequential frames of a gesture, which facilitates training in a
convolutional neural network (CNN). We trained a CNN using two strategies: combined training and
individual training. The strategies were tested on the MSRC-12 dataset, obtaining an accuracy rate
of 86.67% in combined training and 90.78% in individual training. The trained network was then used
to classify data obtained from the Kinect with a person, obtaining an accuracy rate of 72.08% in
combined training and 81.25% in individual training. Finally, we use the system to send commands
to a mobile robot in order to control it.
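The Kinect skeleton used by the MSRC-12 dataset reports 20 joints per frame, each with (x, y, z) coordinates. The frame-stacking idea described above can be sketched as follows; the exact image encoding and normalization used by the authors are not given here, so `gesture_to_image` is an illustrative assumption, not their implementation:

```python
import numpy as np

def gesture_to_image(frames, n_joints=20):
    """Stack per-frame joint coordinates into a single 2D array.

    frames: sequence of per-frame arrays, each of shape (n_joints, 3)
            holding the (x, y, z) position of every skeleton joint.
    Returns an array of shape (len(frames), n_joints * 3): one row per
    frame, which a CNN can consume like a grayscale image.
    """
    rows = [np.asarray(f, dtype=np.float32).reshape(n_joints * 3) for f in frames]
    img = np.stack(rows)  # shape: (n_frames, n_joints * 3)
    # Scale intensities to [0, 1] so images are comparable across gestures
    # (an assumed preprocessing step, not taken from the paper).
    img -= img.min()
    rng = img.max()
    if rng > 0:
        img /= rng
    return img

# Example: a 30-frame gesture with 20 joints per frame.
gesture = [np.random.rand(20, 3) for _ in range(30)]
image = gesture_to_image(gesture)
print(image.shape)  # (30, 60)
```

In this sketch, time runs along one image axis and the flattened joint coordinates along the other, so the whole gesture becomes a fixed-layout 2D input for the convolutional network.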


Author Biography

Daniel Fernando Tello Gamarra, Universidade Federal Santa Maria

Professor, Department of Control Engineering and Automation

Published
2019-04-04
How to Cite
Pfitscher, M., Welfer, D., do Nascimento, E., Cuadros, M. A., & Gamarra, D. F. (2019). Article Users Activity Gesture Recognition on Kinect Sensor Using Convolutional Neural Networks and FastDTW for Controlling Movements of a Mobile Robot. Inteligencia Artificial, 22(63), 121-134. https://doi.org/10.4114/intartif.vol22iss63pp121-134