
3D human body posture estimation model training method

A method for training a 3D human body pose estimation model, applicable to biological neural network models, computation, computer components, etc. It addresses problems such as high time complexity and the difficulty of obtaining high-precision 3D coordinates of human joint points.

Pending Publication Date: 2020-08-11
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] To overcome the difficulty that existing 3D human body pose estimation methods have in obtaining high-precision 3D coordinates of human joint points, as well as their high time complexity, the present invention proposes a 3D human body pose estimation model training method that uses spatial anchor points to achieve high accuracy with a reduced amount of computation.




Embodiment Construction

[0045] The specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings. The following examples illustrate the present invention but are not intended to limit its scope.

[0046] As shown in Figure 1, a 3D human body pose estimation model training method comprises the following steps:

[0047] Step 1: Acquire a 3D image of the target human body, input the image into a spatial transformer network, and output an augmented image;
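The patent does not disclose the spatial transformer's internals, so the sketch below stands in for that step with a generic nearest-neighbor affine warp of a 3D volume. The function name `affine_warp` and the identity-transform usage are hypothetical, purely for illustration; a real spatial transformer network would learn the transform parameters.

```python
import numpy as np

def affine_warp(volume, matrix, offset):
    """Nearest-neighbor affine warp of a 3D volume: out[p] = in[matrix @ p + offset].

    volume: 3D ndarray; matrix: (3, 3); offset: (3,).
    Out-of-bounds source voxels are filled with zeros.
    """
    coords = np.indices(volume.shape).reshape(3, -1)            # (3, N) output coords
    src = (matrix @ coords + offset[:, None]).round().astype(int)
    in_bounds = np.all(
        (src >= 0) & (src < np.array(volume.shape)[:, None]), axis=0
    )
    out = np.zeros_like(volume)
    out_flat = out.reshape(-1)
    out_flat[in_bounds] = volume[src[0, in_bounds], src[1, in_bounds], src[2, in_bounds]]
    return out
```

With the identity matrix and zero offset the warp returns the input unchanged; augmentation would sample small random rotations, scalings, and translations instead.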

[0048] Step 2: Set anchor points in the augmented image space at a fixed interval, and input the result to the feature extraction layer of the 3D pose estimation network to obtain sample features;
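Setting anchor points "at a fixed interval" amounts to laying a regular grid over the voxel space. A minimal sketch, assuming a cubic volume; the function name `make_anchor_grid` and the default sizes are hypothetical choices, not values from the patent:

```python
import numpy as np

def make_anchor_grid(volume_size=(64, 64, 64), stride=8):
    """Place anchor points at a fixed interval (stride) over a voxel volume.

    Anchors sit at stride/2, stride/2 + stride, ... along each axis,
    so they are centered within stride-sized cells.
    Returns an (num_anchors, 3) array of voxel coordinates.
    """
    axes = [np.arange(stride // 2, size, stride) for size in volume_size]
    gx, gy, gz = np.meshgrid(*axes, indexing="ij")
    return np.stack([gx, gy, gz], axis=-1).reshape(-1, 3).astype(np.float32)

anchors = make_anchor_grid()
print(anchors.shape)  # (512, 3): 8 anchors per axis for a 64^3 volume, stride 8
```

The stride trades accuracy against computation: a denser grid gives each joint more nearby anchors to vote from, but increases the number of predictions in Step 3.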

[0049] Step 3: Use a predictor on the sample feature map to obtain, for each anchor point in the sample spatial feature map, the voxel-coordinate offsets and confidences with respect to all human joint points, and use the confi...
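The text is truncated after "use the confi...", but anchor-based pose estimators commonly combine per-anchor offset predictions into final joint coordinates via confidence-weighted averaging. The sketch below assumes that is what the truncated passage describes; the function name `aggregate_joints` and the softmax weighting are assumptions, not confirmed details of the patent:

```python
import numpy as np

def aggregate_joints(anchors, offsets, logits):
    """Confidence-weighted aggregation of per-anchor joint estimates.

    anchors: (A, 3) anchor voxel coordinates
    offsets: (A, J, 3) predicted offset from each anchor to each of J joints
    logits:  (A, J) per-anchor confidence logits for each joint
    Returns: (J, 3) estimated joint coordinates.
    """
    # Softmax over anchors, computed per joint (numerically stabilized).
    w = np.exp(logits - logits.max(axis=0, keepdims=True))
    w = w / w.sum(axis=0, keepdims=True)                 # (A, J)
    per_anchor = anchors[:, None, :] + offsets           # (A, J, 3) joint estimates
    return (w[..., None] * per_anchor).sum(axis=0)       # weighted average over anchors
```

Because every anchor contributes a weighted vote for every joint, this single weighted sum replaces a dense per-voxel heatmap regression, which is consistent with the patent's claim of reduced computation.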


Abstract

The invention discloses a 3D human body pose estimation model training method, belonging to the technical field at the intersection of digital image processing and machine learning. The method comprises the following steps: constructing a pose estimation network model; acquiring a target 3D sample image; setting anchor points on the sample image at a fixed interval; inputting the image matrix into the network model; obtaining a trained pose estimation model after multiple rounds of iterative training; and using the trained network model to perform pose estimation on images containing human body poses. By predicting the coordinates of human joint points with spatially placed anchor points, the method achieves very high accuracy while requiring far less computation than traditional 3D pose estimation, and therefore has promising application prospects.

Description

Technical field [0001] The invention belongs to the technical field at the intersection of digital image processing and machine learning, and more specifically relates to a 3D human body pose estimation model training method. Background technique [0002] With the development of computer vision technology, pose estimation on depth maps and RGB images has advanced rapidly. As a foundational technology, pose estimation is widely used in human-computer interaction, augmented reality, human behavior analysis, medical rehabilitation, games, and other fields. The commonly used pose estimation algorithms are as follows: [0003] Existing human pose estimation methods generally fall into three categories: model-fitting methods, discriminative methods, and hybrid methods. Model-fitting methods use an optimization procedure to fit a predefined hand model to the input depth image. Discriminative methods are completely data-driven; their goal is to learn a regressor through labe...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00, G06K 9/62, G06N 3/04
CPC: G06V 40/20, G06N 3/045, G06F 18/214
Inventors: 吴哲夫, 肖新宇, 章莹婷, 李玮毅, 吕晓哲
Owner: ZHEJIANG UNIV OF TECH