Real-time attitude estimation motion analysis method and system, computer equipment and storage medium

A motion analysis and pose estimation technology, applied in the field of computer vision, that addresses problems such as joint connections that cannot be assigned to the correct person, limited human body posture estimation, and the inability to analyze multiple people at the same time.

Pending Publication Date: 2021-01-22
FOSHAN UNIVERSITY

AI Technical Summary

Problems solved by technology

The shortcomings of current algorithms: (1) high-precision sensors are required to collect motion information, which increases the cost of exercising; (2) the algorithms can only complete pose estimation for a single person and cannot analyze multiple people and multiple types of motion at the same time; (3) they cannot satisfy the real-time requirements of motion analysis.

[0004] Traditional sports analysis methods, such as coaching, wearable sports equipment, and instructional-video guidance, on the one hand may increase the cost of exercising, and on the other hand can only perform one-to-one analysis and cannot analyze multiple people at the sa...


Examples


Embodiment 1

[0078] As shown in Figure 1, this embodiment provides a real-time pose estimation motion analysis method, which includes the following steps:

[0079] S101. Acquire a real-time video of a user.

[0080] In this embodiment, the real-time video of the user is acquired through a monocular camera.

[0081] S102. Input the video frames into the trained dual-branch deep network for feature extraction to obtain heat maps of the joint points of the human body and the affinity regions between the joint points.

[0082] The dual-branch deep network of this embodiment adopts the VGG network, specifically the VGG-19 network; its structure is shown in Figure 2. The upper branch of the VGG-19 network collects the positions of the joint points of the human body, the lower branch collects the affinity areas between the joint points, and the prediction results of the previous stage are used for feature fusion of vid...
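The data flow of such a two-branch, multi-stage network (the description matches OpenPose-style architectures) can be sketched as follows. This is an illustrative sketch, not the patented network: the trained convolutional branch heads are replaced by random linear maps, and the channel counts (18 joints, 19 limbs) and stage count are assumptions. Only the fusion step — backbone features concatenated with the previous stage's predictions — follows the text above:

```python
import numpy as np

def refine(features, n_joints=18, n_limbs=19, n_stages=3, rng=None):
    """Sketch of two-branch multi-stage prediction.
    features: (C, H, W) feature map from the shared VGG-19 backbone.
    Returns joint heat maps S (n_joints, H, W) and affinity fields
    L (2*n_limbs, H, W). The 1x1 'convolutions' below are random
    linear maps standing in for the trained branch heads."""
    rng = rng or np.random.default_rng(0)
    S = np.zeros((n_joints,) + features.shape[1:])
    L = np.zeros((2 * n_limbs,) + features.shape[1:])
    for _ in range(n_stages):
        # each stage fuses backbone features with the previous
        # stage's predictions (the "feature fusion" in the text)
        x = np.concatenate([features, S, L], axis=0)
        W_s = rng.standard_normal((n_joints, x.shape[0])) / x.shape[0]
        W_l = rng.standard_normal((2 * n_limbs, x.shape[0])) / x.shape[0]
        # upper branch: joint-point heat maps
        S = np.einsum('jc,chw->jhw', W_s, x)
        # lower branch: affinity areas between joint points
        L = np.einsum('kc,chw->khw', W_l, x)
    return S, L

feats = np.zeros((32, 8, 8))
S, L = refine(feats)
print(S.shape, L.shape)  # (18, 8, 8) (38, 8, 8)
```

Each stage sees both the raw backbone features and the previous stage's heat maps and affinity fields, which is what "the prediction results of the previous stage are used for feature fusion" refers to.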

Embodiment 2

[0133] As shown in Figure 6, this embodiment provides a real-time posture estimation motion analysis system. The system includes a video acquisition module 601, a feature acquisition module 602, a limb connection module 603, a posture correction module 604 and a motion analysis module 605; the specific functions of each module are as follows:

[0134] The video acquisition module 601 is used to acquire the real-time video of the user.

[0135] The feature collection module 602 is used to input the video frames into the trained dual-branch deep network for feature extraction, obtaining the joint point heat maps of the human body and the affinity areas between the joint points.

[0136] The limb connection module 603 is used to apply non-maximum suppression to the multiple peak points of the joint point heat map, select a series of candidate joint points, connect the candidate joint points to form a bipartite graph, and optimize the bipartite graph.
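The non-maximum suppression step performed by module 603 can be sketched as follows: keep only heat-map pixels that exceed a confidence threshold and are not smaller than any 4-neighbour; the surviving peaks are the candidate joint points that are subsequently connected into a bipartite graph. The threshold value and the 4-neighbourhood are illustrative assumptions, not the patented procedure:

```python
import numpy as np

def nms_peaks(heatmap, thresh=0.1):
    """Return (row, col, score) for local maxima of a 2-D joint
    heat map: pixels above `thresh` that dominate their 4-neighbours."""
    h = np.pad(heatmap, 1, constant_values=-np.inf)
    c = h[1:-1, 1:-1]
    is_peak = ((c > thresh) &
               (c >= h[:-2, 1:-1]) & (c >= h[2:, 1:-1]) &
               (c >= h[1:-1, :-2]) & (c >= h[1:-1, 2:]))
    ys, xs = np.nonzero(is_peak)
    return [(int(y), int(x), float(heatmap[y, x])) for y, x in zip(ys, xs)]

hm = np.zeros((5, 5))
hm[1, 1] = 0.9   # one clear peak
hm[3, 4] = 0.6   # second peak on the image border
hm[3, 3] = 0.05  # below threshold, suppressed
print(nms_peaks(hm))  # → [(1, 1, 0.9), (3, 4, 0.6)]
```

Padding the map with negative infinity lets border pixels (such as the peak at (3, 4) above) be tested with the same vectorised comparison as interior pixels.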

[0137] The posture correction...

Embodiment 3

[0141] This embodiment provides a computer device, which can be a computer. As shown in Figure 7, a processor 702, a memory, an input device 703, a display 704 and a network interface 705 are connected through a system bus 701. The processor provides computing and control capabilities. The memory includes a non-volatile storage medium 706 and an internal memory 707: the non-volatile storage medium 706 stores an operating system, a computer program and a database, while the internal memory 707 provides an environment for running the operating system and the computer program in the non-volatile storage medium. When the processor 702 executes the computer program, the real-time pose estimation motion analysis method of Embodiment 1 above is realized, as follows:

[0142] Obtain the user's real-time video;

[0143] Input the video frame into the trained dual-branch deep network for feature collection, and obtain the joint point heat map of the h...



Abstract

The invention discloses a real-time attitude estimation motion analysis method and system, computer equipment and a storage medium. The method comprises the following steps: acquiring a real-time video of a user; inputting the video frames into a trained double-branch deep network for feature extraction to obtain a joint point heat map of the human body and the affinity regions between joint points; performing non-maximum suppression on the multiple peaks of the joint point heat map, selecting a series of candidate joint points, connecting the candidate joint points with one another to form a bipartite graph, and optimizing the bipartite graph; according to the optimized bipartite graph, performing distortion correction on joint pixel points between adjacent video frames while the user moves in real time, calculating limb angle information, and obtaining limb movement data; and, after a consulting instruction sent by the user is received, performing motion analysis on the limb motion data and outputting the motion analysis result. According to the invention, the spatial features of the image can be retained more effectively, and unnecessary influence when finding the optimal connection between the established joint points is eliminated.
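The "limb angle information" mentioned above is typically the interior angle at a joint formed by two adjacent limb segments. A minimal sketch is given below; the joint triple, the 2-D pixel coordinates, and the `atan2` formulation are illustrative assumptions, since the abstract does not specify the exact formula:

```python
import math

def limb_angle(a, b, c):
    """Interior angle in degrees at joint b, formed by the limb
    segments b->a and b->c (e.g. shoulder, elbow, wrist),
    given 2-D joint pixel coordinates."""
    ang = (math.atan2(a[1] - b[1], a[0] - b[0])
           - math.atan2(c[1] - b[1], c[0] - b[0]))
    deg = abs(math.degrees(ang))
    # fold reflex angles back into the [0, 180] degree range
    return 360.0 - deg if deg > 180.0 else deg

# right-angle elbow: shoulder (0,0), elbow (0,1), wrist (1,1)
print(limb_angle((0, 0), (0, 1), (1, 1)))   # ≈ 90.0
# fully extended limb
print(limb_angle((0, 0), (1, 0), (2, 0)))   # ≈ 180.0
```

Comparing such per-joint angles across adjacent frames is one plausible way to turn the corrected joint points into the "limb movement data" the abstract refers to.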

Description

Technical Field

[0001] The invention relates to a real-time attitude estimation motion analysis method, system, computer equipment and storage medium, belonging to the field of computer vision.

Background Technique

[0002] With the implementation of the national fitness program and more and more people participating in various sports, the analysis of human body movement has become particularly important. In recent years, with the rapid development of big data and artificial intelligence technology, more and more people use mobile phone videos, applications and the like for guidance, and wear various expensive devices to help correct their exercise posture. Such analysis cannot adequately express the quality of motion, so the method of pose estimation is proposed to analyze how standard the motion in a video is, which provides more space and possibility for the development of motion analysis.

[0003] Pose estimation belongs to the category of computer vision, and specifically refers to ...

Claims


Application Information

IPC(8): G06T 7/246; G06K 9/00; G06N 3/04; G06N 3/08
CPC: G06T 7/246; G06N 3/08; G06T 2207/10016; G06T 2207/20081; G06T 2207/20084; G06T 2207/30196; G06V 40/23; G06N 3/045
Inventors: 曾凡智, 陈嘉文, 周燕, 刘紫琴, 邹磊
Owner: FOSHAN UNIVERSITY