Teaching human gestures to humanoid robots by using Kinect sensor

In this study, a novel algorithm is developed to recognize human actions and to reproduce them on a humanoid robot. The study consists of two parts. In the first part, a real-time human imitation system is realized. The three-dimensional skeleton joint positions are obtained from an Xbox 360 Kinect sensor. These positions are transformed into joint angles of the robot arms via a transformation algorithm, and the resulting angles are transferred to a NAO robot. The human upper-body movements are thus successfully imitated by the NAO robot in real time. In the second part, an algorithm is developed for the recognition of human actions. Extreme Learning Machines (ELMs) and Feed-Forward Neural Networks (FNNs) trained with the back-propagation algorithm are used to classify the actions. According to the comparative results, ELMs produce better recognition performance.