
Three-dimensional human body posture estimation method based on monocular camera

A monocular-camera 3D pose estimation technology in the field of computer vision. It addresses the problems of existing methods, including poor robustness, low computational efficiency, and dependence on a pre-built three-dimensional human body model, and achieves wide applicability and strong adaptability and robustness.

Publication status: Inactive
Publication Date: 2018-10-12
Applicant: SHANGHAI UNIV OF ENG SCI
Cites: 2, Cited by: 5

AI Technical Summary

Problems solved by technology

[0006] The invention provides a method for estimating the three-dimensional pose of a human body based on a monocular camera, which addresses the shortcomings of existing methods: excessive reliance on a pre-established three-dimensional human body model, low computational efficiency, and poor robustness.




Embodiment Construction

[0046] The specific embodiments of the present invention are described in detail below in conjunction with the accompanying drawings and preferred embodiments.

[0047] Referring to Figure 1, the present invention provides a method for estimating the three-dimensional pose of a human body based on a monocular camera, comprising the following steps:

[0048] Step 1. Establish a three-dimensional human body model based on the standard proportion data of a normal human body, and obtain the two-dimensional projection of the model using the weak perspective projection method. Referring to Figure 2, the three-dimensional human body model comprises at least six parts: the head, torso, left arm, left palm, right arm, and right palm. It has 40 degrees of freedom, of which 25 are used to adjust the overall scaling, rotation angles, and joint angles of the human body model, and 15 are used to adjust the length and width of ...
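The patent does not give the projection formulas in this excerpt, but a weak perspective camera can be sketched briefly: depth variation across the body is assumed small relative to the distance to the camera, so projection reduces to a uniform scale of the model's x/y coordinates plus an image-plane translation. The joint coordinates, scale, and translation below are illustrative values, not taken from the patent.

```python
import numpy as np

def weak_perspective_project(points_3d, scale=1.0, t=(0.0, 0.0)):
    """Project Nx3 model points to 2D under a weak perspective camera:
    drop the depth coordinate, scale x/y uniformly, then translate
    into image coordinates."""
    pts = np.asarray(points_3d, dtype=float)
    return scale * pts[:, :2] + np.asarray(t, dtype=float)

# Hypothetical model joints (head, shoulder, hand), in model-space units
model = np.array([[0.0, 1.7, 0.1],
                  [0.2, 1.4, 0.1],
                  [0.5, 1.0, 0.1]])

# Scale of 100 px per unit, image centre at (320, 240) -- assumed values
proj = weak_perspective_project(model, scale=100.0, t=(320.0, 240.0))
```

In a full implementation the scale and translation would themselves be among the model's degrees of freedom, optimized together with the joint angles.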



Abstract

The invention belongs to the technical field of computer vision and discloses a three-dimensional human body posture estimation method based on a monocular camera. The method comprises the following steps: 1) a three-dimensional human body model is built on the basis of standard proportion data of a normal human body, and a two-dimensional projection of the three-dimensional human body model is acquired by means of the weak perspective projection method; 2) a video of the human body in motion is acquired with the monocular camera, and the two-dimensional contour feature of each human body image frame of the video is extracted; 3) the two-dimensional contour feature of each frame is compared with the two-dimensional projection of the three-dimensional human body model, cost functions are established, and their optimal solution is calculated, so that the best-matching three-dimensional model is obtained. The method overcomes the high model-generation requirements of general generative-model approaches, can be widely applied to posture and size estimation for human subjects of different postures and body shapes, and offers strong adaptability and robustness.
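Step 3 above compares extracted contour features with the model's projection through a cost function; the patent's exact cost function and optimizer are not given in this excerpt. As an illustrative stand-in, the sketch below scores a candidate projection by the sum of squared nearest-neighbor distances to the contour points, and grid-searches a single weak-perspective scale parameter (a real implementation would optimize all model degrees of freedom jointly).

```python
import numpy as np

def contour_cost(proj_pts, contour_pts):
    """Sum, over projected model points, of the squared distance to the
    nearest extracted contour point. A simple stand-in cost; the patent's
    actual cost function is not specified in this excerpt."""
    d = proj_pts[:, None, :] - contour_pts[None, :, :]   # (N, M, 2)
    return float((d ** 2).sum(axis=2).min(axis=1).sum())

def fit_scale(model_xy, contour_pts, scales):
    """Grid-search the weak-perspective scale minimizing the contour cost."""
    costs = [contour_cost(s * model_xy, contour_pts) for s in scales]
    return float(scales[int(np.argmin(costs))])

# Toy example: the "extracted contour" is the model silhouette scaled by 2,
# so the search should recover a scale of 2.0.
model_xy = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
contour = 2.0 * model_xy
best = fit_scale(model_xy, contour, scales=np.linspace(0.5, 3.0, 26))
```

Grid search is used here only to keep the sketch dependency-free; a gradient-based or sampling-based optimizer over the full 40-dimensional parameter vector would be the natural choice in practice.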

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a method for estimating the three-dimensional posture of a human body based on a monocular camera.

Background Technique

[0002] Vision-based human pose estimation has gradually become a research hotspot in the field of computer vision. Accurate pose estimation can provide a better data basis and guidance for action recognition, behavior analysis, and behavior understanding.

[0003] At present, vision-based methods for 3D human pose estimation fall into two main categories: discriminative models and generative models. A discriminative model requires a large amount of training data to obtain accurate pose estimates, but processing that much data reduces the efficiency of the algorithm, and when the real image differs substantially from the training samples, the discriminative model performs very poorly. ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/13; G06T7/11; G06T7/136; G06T7/246; G06T7/62; G06T7/90
CPC: G06T2207/10016; G06T2207/10024; G06T2207/30196; G06T7/11; G06T7/13; G06T7/136; G06T7/246; G06T7/62; G06T7/90
Inventors: 吴晨谋, 方志军, 黄正能, 钱小瑞
Owner: SHANGHAI UNIV OF ENG SCI