
Human body posture estimation method based on relation analysis network

A human pose estimation technology based on a relation analysis network, applied in the field of human-object interaction detection; it addresses the loss of detailed information in existing methods and improves detection accuracy.

Inactive Publication Date: 2021-01-12
北京北斗天巡科技有限公司

AI Technical Summary

Problems solved by technology

[0003] The purpose of the present invention is to provide a human body posture estimation method based on a relation analysis network, aiming to solve the problem that current human-object interaction detection methods directly use uniformly attended whole-body features, which may lose detailed information.



Examples


Embodiment

[0100] The human body posture estimation method based on the relation analysis network of this embodiment specifically comprises the following steps:

[0101] Step 1. Given an input image, use Mask R-CNN to detect the bounding boxes b_h of all human bodies, the object bounding boxes b_o, and the human body keypoints kp_h; then use the detected keypoints kp_h to construct the body part boxes b_{p,h}.
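The keypoint-to-part-box construction in Step 1 can be sketched as follows. The detector itself is out of scope here; this minimal sketch assumes COCO-style 17-point keypoints (the layout produced by Mask R-CNN keypoint heads) and a fixed square box per part. Both the part selection and the box size are assumptions, since the excerpt does not specify how b_{p,h} is sized.

```python
# Minimal sketch of Step 1's body part box construction (assumption:
# COCO-style keypoints and a fixed-size square box per part; the patent
# excerpt does not specify the actual box-sizing rule).

# A few COCO keypoint indices (17-point layout used by Mask R-CNN).
PART_KEYPOINTS = {
    "left_wrist": 9,
    "right_wrist": 10,
    "left_ankle": 15,
    "right_ankle": 16,
}

def build_part_boxes(kp_h, half_size=32.0):
    """Turn detected keypoints kp_h into body part boxes b_{p,h}.

    kp_h: list of (x, y, visibility) triples, one per COCO keypoint.
    Returns a dict mapping part name -> (x1, y1, x2, y2); keypoints with
    visibility == 0 are skipped.
    """
    boxes = {}
    for name, idx in PART_KEYPOINTS.items():
        x, y, vis = kp_h[idx]
        if vis == 0:
            continue
        boxes[name] = (x - half_size, y - half_size,
                       x + half_size, y + half_size)
    return boxes

# Example: fake keypoints with only the right wrist visible.
kp_h = [(0.0, 0.0, 0)] * 17
kp_h[10] = (100.0, 200.0, 2)  # right wrist at (100, 200)
print(build_part_boxes(kp_h))
# {'right_wrist': (68.0, 168.0, 132.0, 232.0)}
```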

[0102] Step 2. Extract features from the shared ResNet-50 C4 feature map through the ROI Align operation and feed them into ResNet-50 C5 to obtain the human body feature f_h, the object feature f_o, and the body part features f_{p,h}. The scene feature f_s is obtained in the same way, by extracting features from the shared ResNet-50 C4 feature map and feeding them into ResNet-50 C5, followed by adaptive average pooling.
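The data flow of Step 2 can be sketched with a simplified stand-in for ROI Align: crop a box out of the shared feature map and adaptively average-pool it to a fixed spatial size. Real ROI Align samples at fractional positions with bilinear interpolation; the integer-crop version below, and the 14x14 C4 map and 7x7 output size, are illustrative assumptions only.

```python
import numpy as np

def crop_and_pool(feature_map, box, out_size=7):
    """Simplified stand-in for ROI Align (Step 2): crop a box from a
    shared feature map and adaptively average-pool it to
    out_size x out_size. True ROI Align uses bilinear sampling at
    fractional positions; this integer crop only illustrates the flow.

    feature_map: (C, H, W) array; box: (x1, y1, x2, y2) in feature coords.
    """
    x1, y1, x2, y2 = [int(round(v)) for v in box]
    crop = feature_map[:, y1:y2, x1:x2]
    c, h, w = crop.shape
    # Adaptive average pooling: split height and width into out_size bins.
    ys = np.linspace(0, h, out_size + 1).astype(int)
    xs = np.linspace(0, w, out_size + 1).astype(int)
    out = np.empty((c, out_size, out_size), dtype=feature_map.dtype)
    for i in range(out_size):
        for j in range(out_size):
            out[:, i, j] = crop[:, ys[i]:ys[i + 1],
                                   xs[j]:xs[j + 1]].mean(axis=(1, 2))
    return out

# Example: pool a human box out of a stand-in ResNet-50 C4 map.
fmap = np.ones((1024, 14, 14))      # 1024 channels is the real C4 width
f_h = crop_and_pool(fmap, (0, 0, 14, 14))
print(f_h.shape)  # (1024, 7, 7)
```

In the actual pipeline the pooled features would then be fed through the C5 stage; that network forward pass is omitted here.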

[0103] Step 3. For each person's HOI detection, an Object-Bodypart graph is composed of the body parts and the objects. In most cases, a...
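The Object-Bodypart graph of Step 3 can be sketched as a bipartite edge set connecting each detected body part to each detected object. Since the excerpt is truncated, what each edge carries is not stated; the normalized center offset used below is purely an illustrative assumption.

```python
# Minimal sketch of Step 3's Object-Bodypart graph: a bipartite edge set
# between body part boxes and object boxes. The per-edge spatial feature
# (center offset normalized by image size) is an assumption; the
# truncated excerpt does not say what each edge actually encodes.

def center(box):
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def build_object_bodypart_graph(part_boxes, object_boxes, img_w, img_h):
    """part_boxes / object_boxes: dicts mapping name -> (x1, y1, x2, y2).
    Returns edges as (part_name, object_name, (dx, dy)), where (dx, dy)
    is the object-minus-part center offset normalized by image size."""
    edges = []
    for p_name, p_box in part_boxes.items():
        px, py = center(p_box)
        for o_name, o_box in object_boxes.items():
            ox, oy = center(o_box)
            edges.append((p_name, o_name,
                          ((ox - px) / img_w, (oy - py) / img_h)))
    return edges

# Example: one wrist box and one object box in a 640x480 image.
parts = {"right_wrist": (68, 168, 132, 232)}
objects = {"cup": (150, 150, 200, 250)}
print(build_object_bodypart_graph(parts, objects, 640, 480))
# [('right_wrist', 'cup', (0.1171875, 0.0))]
```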



Abstract

The invention discloses a human body posture estimation method based on a relation analysis network, and belongs to the fields of human body posture estimation and human-object interaction detection. To solve the problem that existing human-object interaction detection methods may lose detailed information by directly using uniformly attended whole-body features, the method comprises the following steps: first, given an input image, detect the bounding boxes b_h of all human bodies, the bounding boxes b_o of objects, and the human body keypoints kp_h, and construct the body part boxes b_{p,h} from the detected keypoints kp_h; second, obtain the human body feature f_h, the scene feature f_s, the object feature f_o, and the body part features f_{p,h}; then detect the HOI of each person; then predict the action; and finally compute the HOI score. The method pays closer attention to the relationship between body parts and objects, and improves detection accuracy by selecting the body part and object pair with the highest HOI score.
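The final step of the pipeline above, computing the HOI score, is commonly realized in this line of work (e.g. iCAN, InteractNet) as a product of the human detection confidence, the object detection confidence, and the predicted action probability. The sketch below uses that multiplicative form as an assumption; the excerpt does not give the patent's exact formula.

```python
def hoi_score(s_h, s_o, s_action):
    """Assumed multiplicative HOI score: detection confidences of the
    human and the object, gated by the predicted action probability.
    This mirrors common HOI pipelines (iCAN, InteractNet), not
    necessarily the patent's exact formula."""
    return s_h * s_o * s_action

# Example: a confident human and object detection with a 0.5 action
# probability; rounding hides floating-point noise in the product.
print(round(hoi_score(0.9, 0.8, 0.5), 4))  # 0.36
```

With per-part scores, the body part and object pair with the highest such score would be the one the method selects.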

Description

Technical field

[0001] The invention relates to the fields of human body pose estimation and human-object interaction detection, and in particular to a human body posture estimation method based on a relation analysis network.

Background technique

[0002] Most existing methods, such as iCAN, HO-RCNN, InteractNet, and GPNN, obtain detection results by considering a pair of human and object features and combining spatial relationships. However, directly using uniformly attended whole-body features may lose detailed information. Since humans interact with objects through body parts, body parts are most important for HOI detection.

Contents of the invention

[0003] The purpose of the present invention is to provide a human body posture estimation method based on a relation analysis network, aiming to solve the problem that current human-object interaction detection methods directly use uniformly attended whole-body features, which may lose detailed information.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/10; G06V10/40; G06N3/047; G06N3/048; G06N3/045; G06F18/2415; G06F18/241
Inventors: 刘超, 池明旻, 周鹏豪, 张文琦
Owner: 北京北斗天巡科技有限公司