Human body three-dimensional joint point estimation framework based on a deep network, and positioning method thereof

A deep-network and positioning-method technology, applied in computing, image data processing, instruments, etc. It addresses problems such as camera-sensor-based solutions being difficult to port and weak generalization ability, achieving high stability, robustness, high portability, and improved accuracy.

Active Publication Date: 2018-12-25
SUN YAT SEN UNIV


Problems solved by technology

[0007] (3) To preserve the constraint relationships between the bones and joints of the human body, extremely complex models are designed, resulting in poor scalability and generalization capability.
[0008] Most of the existing 3D joint point posi…




Example Embodiment

[0046] The following describes the implementation of the present invention through specific examples in conjunction with the accompanying drawings. Those skilled in the art can easily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention may also be implemented or applied through other specific examples, and the details in this specification may be modified or changed in various ways based on different viewpoints and applications without departing from the spirit of the present invention.

[0047] Figure 1 is a schematic diagram of the architecture of the human body 3D joint point estimation framework based on a deep network of the present invention. As shown in Figure 1, the framework includes:

[0048] The two-dimensional posture sub-network 101 is pre-trained on a la...
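As a rough illustration of the data flow between the modules named above, the pipeline can be sketched with simple stand-in functions. This is not the patent's implementation: the joint count, the constant depth, and the unit-focal-length pinhole camera are all assumptions for the sketch, and the real modules are deep networks.

```python
import numpy as np

J = 17  # assumed number of body joints

def lift_2d_to_3d(pose_2d, depth_guess=2.0):
    """Stand-in for the 2D-to-3D conversion module: attach a coarse,
    constant depth to each joint (a real system would use a deep network)."""
    depth = np.full((J, 1), depth_guess)
    return np.hstack([pose_2d * depth_guess, depth])  # shape (J, 3)

def project_3d_to_2d(pose_3d):
    """Stand-in for the 3D-to-2D projection module: pinhole camera
    with unit focal length."""
    return pose_3d[:, :2] / pose_3d[:, 2:3]  # shape (J, 2)

pose_2d = np.full((J, 2), 0.1)        # toy 2D predicted posture
pose_3d = lift_2d_to_3d(pose_2d)      # coarse 3D estimate
reproj  = project_3d_to_2d(pose_3d)   # 2D projected posture
# With this toy lifting, the reprojection reproduces the 2D input exactly.
```

In the patent's framework, the disagreement between `reproj` and `pose_2d` is the signal used to correct the 3D estimate; with the toy lifting above they coincide by construction.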



Abstract

The invention discloses a human body three-dimensional joint point estimation framework based on a deep network, and a positioning method thereof. The framework comprises: a two-dimensional posture sub-network, pre-trained on a two-dimensional posture data set, which extracts two-dimensional posture features, transmits them to a two-dimensional-to-three-dimensional conversion module, and generates a precise two-dimensional predicted posture; the two-dimensional-to-three-dimensional conversion module, configured to receive the two-dimensional posture features extracted by the two-dimensional posture sub-network, convert them into a three-dimensional posture feature space, and generate a temporally consistent rough three-dimensional posture estimate; and a three-dimensional-to-two-dimensional projection module, configured to project the intermediate rough three-dimensional posture estimated by the conversion module back into two-dimensional space to generate a two-dimensional projected posture. The framework corrects the estimated three-dimensional posture by optimizing the consistency between the two-dimensional projected posture and the two-dimensional predicted posture, and finally outputs an accurate three-dimensional posture estimate with spatio-temporal consistency and two-dimensional/three-dimensional geometric consistency. The invention improves the accuracy of three-dimensional joint point prediction and positioning for the human body.
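The correction step described in the abstract, optimizing agreement between the projected 2D posture and the predicted 2D posture, can be sketched as a reprojection-consistency loss minimized by gradient descent. This is an illustrative toy under stated assumptions (unit-focal pinhole projection, hand-derived gradient, correction of x/y coordinates only with depth held fixed), not the patent's network training procedure:

```python
import numpy as np

def project(pose_3d):
    """Pinhole projection with unit focal length: (J,3) -> (J,2)."""
    return pose_3d[:, :2] / pose_3d[:, 2:3]

def consistency_loss(pose_3d, pose_2d_pred):
    """Mean squared error between projected and predicted 2D postures."""
    diff = project(pose_3d) - pose_2d_pred
    return float(np.mean(diff ** 2))

def correct_pose(pose_3d, pose_2d_pred, lr=10.0, steps=200):
    """Refine the x/y coordinates of the 3D pose by gradient descent
    on the consistency loss (depth is kept fixed in this sketch)."""
    pose = pose_3d.copy()
    J = pose.shape[0]
    for _ in range(steps):
        diff = project(pose) - pose_2d_pred   # (J, 2) reprojection error
        grad_xy = diff / pose[:, 2:3] / J     # d(loss)/d(x, y), derived by hand
        pose[:, :2] -= lr * grad_xy
    return pose

# Toy usage: a rough 3D estimate whose projection is slightly off.
pose_2d_pred = np.array([[0.1, 0.2], [0.3, 0.1], [0.0, 0.4]])
rough_3d = np.hstack([pose_2d_pred * 2.0 + 0.05, np.full((3, 1), 2.0)])
refined = correct_pose(rough_3d, pose_2d_pred)
# The refined pose projects (almost) exactly onto the 2D prediction.
```

The design point the sketch captures is that the 2D prediction acts as a self-supervision signal: no 3D ground truth is consulted during the correction, only the consistency between the two 2D postures.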

Description

Technical field

[0001] The present invention relates to the fields of human body three-dimensional pose estimation, computer vision, and human-computer interaction, and in particular to a human body three-dimensional joint point estimation framework based on a deep network, and a positioning method that improves the positioning accuracy of human body three-dimensional joint points through a self-supervised correction mechanism.

Background technique

[0002] Pose estimation is an important field of computer vision research. Its main task is to enable computers to automatically perceive and understand human behavior. Its applications include intelligent monitoring, patient monitoring, and systems involving human-computer interaction. The goal of human 3D joint point positioning is to automatically infer the specific position of the human body in the real world from images containing people, and to reconstruct the motion of the human body from this information, laying the founda...


Application Information

IPC(8): G06T7/207
CPC: G06T2207/20081; G06T2207/20084; G06T2207/20088; G06T2207/30196; G06T7/207
Inventors: 林倞, 杨猛, 王可泽, 王青
Owner SUN YAT SEN UNIV