Abstract
The intelligent training and assessment of gymnastics movements require studying motion trajectories and reconstructing character animation. Microsoft Kinect is widely used for this purpose because of its low price and high frame rate, but as an optical sensor it is inevitably affected by illumination and occlusion, so its data must be denoised by dedicated algorithms. Most existing research focuses on local motion and neglects the human skeleton as a whole. Based on an analysis of the spatial characteristics of gymnastics and the movement principles of the human body, this paper proposes a dynamic and static two-dimensional regression compensation algorithm. First, the constraint characteristics of human skeleton motion are analyzed, and a maximum-constraint table and a Mesh Collider are established. Then, the dynamic acceleration of skeleton motion and the spatial characteristics of static limb motion are calculated from the adjacent valid skeleton frames before and after the collision. Finally, least-squares polynomial fitting compensates and corrects the lost skeleton coordinate data, yielding smooth and physically plausible skeleton animation. The results of two experiments show that the reconstructed skeleton points solve the data loss caused by optical occlusion of the Kinect: compensating one occluded skeleton point takes as little as 180 ms, with an average error of about 0.1 mm, demonstrating good compensation performance for motion-data acquisition and animation reconstruction.
Measurement Science Review – de Gruyter
Published: Dec 1, 2022
Keywords: Azure Kinect; motion capture; motion tracking; motion compensation
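The core compensation step described in the abstract — fitting a least-squares polynomial through the valid skeleton frames on either side of an occlusion and evaluating it at the lost frames — can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the function name, the per-axis fitting strategy, and the default polynomial degree of 2 are assumptions; the paper does not state the degree it uses.

```python
import numpy as np

def compensate_joint(t_valid, p_valid, t_lost, degree=2):
    """Estimate occluded 3-D joint positions by least-squares polynomial fitting.

    t_valid : times of the valid skeleton frames around the occlusion
    p_valid : (n, 3) joint positions at those times
    t_lost  : times of the frames where the joint was lost
    degree  : polynomial degree (assumed; the paper does not specify it)
    Returns an (m, 3) array of compensated positions.
    """
    t_valid = np.asarray(t_valid, dtype=float)
    p_valid = np.asarray(p_valid, dtype=float)
    t_lost = np.atleast_1d(np.asarray(t_lost, dtype=float))
    out = np.empty((t_lost.size, 3))
    # Fit each coordinate axis independently against time, then
    # evaluate the fitted polynomial at the lost timestamps.
    for axis in range(3):
        coeffs = np.polyfit(t_valid, p_valid[:, axis], degree)
        out[:, axis] = np.polyval(coeffs, t_lost)
    return out
```

For a joint moving along a smooth (locally polynomial) trajectory, the fit reconstructs the missing samples from the surrounding frames; in practice the number of neighboring valid frames used on each side of the gap would be bounded to keep the fit local.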