Automated Guided Vehicle Using Kinect Sensor
In this project, we present a gesture-based human action recognition system for the development of a human-robot interaction (HRI) interface. A framework combining Kinect cameras with other sensors is used to track the human skeleton in real time. For different gestures performed by different persons, quaternions of joint angles are first extracted as robust and discriminative features. Next, neural network (NN) classifiers are trained to recognize the different gestures. This work addresses several challenging tasks, such as the real-time implementation of the gesture recognition system and the temporal segmentation of gestures. The HRI interface developed in this work includes three Kinect sensors and a robot that can be remotely controlled by one operator standing in front of one of the Kinect sensors. Moreover, the system uses the recognized human actions to control the robot.
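The feature-extraction step described above, converting joint angles to quaternions before feeding an NN classifier, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Euler-angle (roll, pitch, yaw) convention and the per-joint angle representation are assumptions, since the abstract does not specify how the Kinect joint orientations are parameterized.

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Convert one joint's Euler angles (radians, ZYX convention assumed)
    to a unit quaternion (w, x, y, z)."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)

def skeleton_to_features(joint_angles):
    """Flatten per-joint quaternions into a single feature vector,
    suitable as input to an NN gesture classifier."""
    features = []
    for roll, pitch, yaw in joint_angles:
        features.extend(euler_to_quaternion(roll, pitch, yaw))
    return features
```

For a Kinect skeleton with 20 tracked joints, this yields an 80-dimensional feature vector (4 quaternion components per joint) per frame, which a sequence of frames then turns into the input for gesture classification.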
Copyright (c) 2020 Pavan V, Kotresh H M, Shabarish C Y, LakshmiBai A, Vivek K
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.