
Human body three-dimensional joint point prediction method based on grouping regression model

A regression-model-based prediction method, applied in character and pattern recognition, instruments, computation, etc. It addresses problems such as high computation and storage costs, failure to reflect the actual movement characteristics of human limbs, and final results that are neither realistic nor reliable, and achieves the effects of avoiding internal confusion, deepening the mutual influence among joints, and improving robustness.

Active Publication Date: 2019-08-30
ANHUI UNIVERSITY
Cites: 2 · Cited by: 8

AI Technical Summary

Problems solved by technology

[0007] In summary, existing technical solutions do not conform to the actual movement characteristics of human limbs; their processing requires computing and storing large amounts of data, which is costly and time-consuming; and the final results are neither realistic nor reliable.


Examples


Embodiment 1

[0075] In this embodiment, a 2D joint detector is first used to obtain the positions of the main joints of the human body in the image, and the two-dimensional joint positions are then used to recover the three-dimensional posture of the human body. The specific process is shown in Figure 1. This embodiment adopts a refined 2d-to-3d regression model implemented in TensorFlow; one forward+backward pass takes 45 ms on a GTX 1080 graphics card. The model is evaluated on two large human pose datasets, Human3.6M and MPII.
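This excerpt does not reproduce the regression network itself, so the following is only a minimal, hypothetical TensorFlow sketch of a fully connected 2d-to-3d regressor of the kind referred to here, together with a timing of one forward+backward pass. The joint count, layer widths, batch size, and all names are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical sketch (not the patented implementation): a compact fully connected
# 2d-to-3d regression network and a timing of one forward+backward pass.
import time
import tensorflow as tf
from tensorflow.keras import layers

NUM_JOINTS = 16  # assumed joint count; Human3.6M protocols commonly use 16 or 17

model = tf.keras.Sequential([
    tf.keras.Input(shape=(NUM_JOINTS * 2,)),   # flattened 2d joint coordinates
    layers.Dense(1024, activation="relu"),
    layers.Dense(1024, activation="relu"),
    layers.Dense(NUM_JOINTS * 3),               # flattened 3d joint coordinates
])
optimizer = tf.keras.optimizers.Adam(1e-3)

@tf.function
def train_step(x, y):
    """One forward+backward pass with a mean-squared regression loss."""
    with tf.GradientTape() as tape:
        pred = model(x, training=True)
        loss = tf.reduce_mean(tf.square(pred - y))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

x = tf.random.normal((64, NUM_JOINTS * 2))      # dummy batch of 2d detections
y = tf.random.normal((64, NUM_JOINTS * 3))      # dummy 3d ground truth
train_step(x, y)                                 # first call traces the graph
start = time.perf_counter()
train_step(x, y)
print(f"forward+backward pass: {(time.perf_counter() - start) * 1000:.1f} ms")
```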

[0076] Human3.6M is currently the largest public dataset for 3D human pose estimation. It consists of 3.6 million images in which professional actors perform 15 kinds of daily activities, such as walking, eating, sitting, making phone calls, and taking part in discussions, and it provides 2D and 3D ground-truth data for the human joints.

...


Abstract

The invention discloses a human body three-dimensional joint point prediction method based on a grouping regression model. The method comprises the following steps: collecting human body 2d joint point detection data; inputting the 2d joint point coordinates of each group into regression networks with the same structure to obtain the 3d joint positions of the different groups, and merging the obtained key three-dimensional positions into an overall joint vector; constructing a joint point self-constraint network and a joint group self-constraint network with BiLSTM, and accumulating the 3d joint points output by the two self-constraint networks to obtain the fine-tuned 3d predicted joints; and calculating the Euclidean distance between the 3d predicted joints and the ground-truth 3d joints through a loss function. The invention adopts a grouping regression structure that exploits the movement independence of the human limb joints: the four limbs and the trunk are divided into different joint groups, and the 3d positions of the joint points in each group are predicted separately. To bring the prediction closer to the real human posture, a human joint self-constraint network built with BiLSTM adjusts the prediction result, thereby improving accuracy.
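To make the pipeline summarized above concrete, here is a minimal, hypothetical TensorFlow (Keras) sketch of the described steps: identical per-group regressors, merging into an overall joint vector, BiLSTM joint-point and joint-group self-constraint branches whose outputs are accumulated onto the prediction, and a Euclidean-distance loss. The joint indices, group composition, layer sizes, and the exact form of the group-level constraint are illustrative assumptions rather than the patented implementation.

```python
# Hypothetical sketch of grouped 2d-to-3d regression with BiLSTM self-constraint refinement.
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_JOINTS = 16
# Assumed grouping into four limbs plus trunk; the joint indices are illustrative.
GROUPS = [[10, 11, 12], [13, 14, 15], [1, 2, 3], [4, 5, 6], [0, 7, 8, 9]]
ORDER = [j for g in GROUPS for j in g]
INVERSE = [ORDER.index(j) for j in range(NUM_JOINTS)]          # restore natural joint order
GROUP_OF_JOINT = [next(i for i, g in enumerate(GROUPS) if j in g)
                  for j in range(NUM_JOINTS)]

def group_regressor(num_group_joints):
    """Dense regressor (same structure for every group) mapping 2d joints to 3d joints."""
    inp = tf.keras.Input(shape=(num_group_joints * 2,))
    x = layers.Dense(256, activation="relu")(inp)
    x = layers.Dense(256, activation="relu")(x)
    out = layers.Dense(num_group_joints * 3)(x)
    return Model(inp, out)

def build_model():
    joints_2d = tf.keras.Input(shape=(NUM_JOINTS, 2))

    # 1) Per-group regression, then merge into one overall joint vector.
    group_preds = []
    for g in GROUPS:
        flat = layers.Flatten()(
            layers.Lambda(lambda t, idx=tuple(g): tf.gather(t, idx, axis=1))(joints_2d))
        pred = group_regressor(len(g))(flat)
        group_preds.append(layers.Reshape((len(g), 3))(pred))
    merged = layers.Concatenate(axis=1)(group_preds)
    merged = layers.Lambda(lambda t: tf.gather(t, INVERSE, axis=1))(merged)

    # 2) Joint-point self-constraint: BiLSTM over the joint sequence predicts a residual.
    joint_res = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(merged)
    joint_res = layers.TimeDistributed(layers.Dense(3))(joint_res)

    # 3) Joint-group self-constraint: BiLSTM over per-group mean features, broadcast
    #    back to the joints of each group (a simplified interpretation).
    group_feats = layers.Lambda(
        lambda t: tf.stack([tf.reduce_mean(tf.gather(t, g, axis=1), axis=1)
                            for g in GROUPS], axis=1))(merged)
    group_res = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(group_feats)
    group_res = layers.TimeDistributed(layers.Dense(3))(group_res)
    group_res = layers.Lambda(lambda t: tf.gather(t, GROUP_OF_JOINT, axis=1))(group_res)

    # 4) Accumulate both self-constraint outputs onto the merged joints.
    refined = layers.Add()([merged, joint_res, group_res])
    return Model(joints_2d, refined)

def euclidean_loss(y_true, y_pred):
    """Mean per-joint Euclidean distance between predicted and ground-truth 3d joints."""
    return tf.reduce_mean(tf.norm(y_true - y_pred, axis=-1))

model = build_model()
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss=euclidean_loss)
```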

Description

Technical field

[0001] The invention belongs to the field of human body posture estimation, and in particular relates to a method for predicting the three-dimensional joint points of a human body based on a grouping regression model.

Background technique

[0002] 3D human pose estimation is a challenging topic in computer vision, with applications in fields such as virtual reality, action recognition, and human-computer interaction. The difficulty lies in the fact that the captured image is a two-dimensional signal, from which it is hard to extract the depth information of the human body. Early methods used invariant features such as contours, shapes, SIFT, and HOG to reconstruct human poses; although easy to implement, they suffer from high algorithmic complexity and low efficiency. In recent years, deep learning has been applied to 3D human pose estimation with good results. There are mainly two approaches:

[0003] 1. Learning an end...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62
CPC: G06V40/20; G06F18/25
Inventor: 王华彬何学胜贺莹秦愿徐晗张首平李宁森陶亮
Owner: ANHUI UNIVERSITY