Recognition of Complex Human Actions Based on Skeletal Pose Sequence Analysis
Abstract
One of the priority areas in the development of computer vision technology is the extraction of skeletal data from images of people and the subsequent use of these data to solve a wide range of applied problems. The paper gives a brief overview of existing technologies for human action recognition, highlights the main approaches, and describes their limitations, advantages, and disadvantages. The authors propose a new approach to the recognition of complex human actions based on the analysis of skeletal data dynamics and the application of a state machine. The approach is multi-stage and combines the sequential use of the MoveNet neural network model for human pose detection, a custom enhanced feature extraction layer (PoseEnhancementLayer), and an algorithm that detects a performed action by analyzing poses at the action's bifurcation points. The proposed solution allows actions to be detected without additional model training, which provides flexibility and scalability. Testing on open datasets showed high accuracy of human pose classification and robustness to incomplete or noisy sequences. The results are relevant for applications in sports analytics, interactive learning, rehabilitation, and medical monitoring.
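To make the described pipeline concrete, the following is a minimal sketch, assuming MoveNet is loaded from TensorFlow Hub. The abstract does not specify the internals of the PoseEnhancementLayer or of the bifurcation-point state machine, so simple joint-angle features and a threshold-based key-pose matcher are used here as hypothetical stand-ins rather than the authors' actual implementation.

```python
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

# Load the single-pose MoveNet Lightning model from TensorFlow Hub.
movenet = hub.load("https://tfhub.dev/google/movenet/singlepose/lightning/4")

def detect_pose(frame_rgb):
    """Run MoveNet on one RGB frame; returns 17 keypoints as (y, x, score)."""
    inp = tf.image.resize_with_pad(tf.expand_dims(frame_rgb, axis=0), 192, 192)
    outputs = movenet.signatures["serving_default"](tf.cast(inp, tf.int32))
    return outputs["output_0"].numpy()[0, 0]  # shape (17, 3)

def joint_angle(a, b, c):
    """Angle at joint b formed by points a-b-c, in radians."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def enhance_pose(keypoints):
    """Hypothetical stand-in for PoseEnhancementLayer: elbow and knee angles."""
    pts = keypoints[:, :2]
    # MoveNet indices: 5/7/9 L shoulder/elbow/wrist, 6/8/10 R;
    # 11/13/15 L hip/knee/ankle, 12/14/16 R.
    return np.array([
        joint_angle(pts[5], pts[7], pts[9]),     # left elbow
        joint_angle(pts[6], pts[8], pts[10]),    # right elbow
        joint_angle(pts[11], pts[13], pts[15]),  # left knee
        joint_angle(pts[12], pts[14], pts[16]),  # right knee
    ])

class ActionStateMachine:
    """Recognizes an action when its key (bifurcation-point) poses occur in order."""

    def __init__(self, key_poses, tolerance=0.15):
        self.key_poses = key_poses   # ordered list of reference feature vectors
        self.tolerance = tolerance
        self.state = 0               # index of the next key pose to match

    def update(self, features):
        """Advance the machine on a new frame; return True when the action completes."""
        if np.linalg.norm(features - self.key_poses[self.state]) < self.tolerance:
            self.state += 1
            if self.state == len(self.key_poses):
                self.state = 0
                return True
        return False
```

In use, each video frame would pass through detect_pose and enhance_pose, and the resulting feature vector would be fed to one ActionStateMachine per action of interest; since the key poses are reference vectors rather than learned weights, new actions can be added without retraining, which is the flexibility the abstract refers to.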
