
Single-person pose estimation method based on a multistage prediction feature-enhanced convolutional neural network

A convolutional neural network with feature enhancement, applied in the field of computer vision. It addresses a sharp increase in the number of model parameters and the failure to distinguish easy from hard human-pose skeleton points, which leaves difficult points with too few learning opportunities and makes their feature representation inaccurate. The method yields finer skeleton-point features and improves both the accuracy and the speed of single-person pose estimation.

Active Publication Date: 2020-06-09
XIDIAN UNIV

Problems solved by technology

Existing schemes correct the pose-estimation result at each stage by cascading multiple stacked convolutional layers that fuse input-image features into each stage's output. During model training, the human-pose skeleton points are not distinguished by detection difficulty: easy skeleton points occupy too many learning opportunities, which is not conducive to learning the difficult skeleton points. The feature representation of the skeleton points therefore becomes inaccurate, resulting in low accuracy of single-person pose estimation.
In addition, existing single-person pose estimation algorithms need at least six stages of feature fusion to obtain accurate human-pose results, so the number of model parameters increases dramatically and single-person pose estimation slows down.
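The easy/hard imbalance described above is commonly handled by computing the training loss only over the hardest skeleton points, so easy points cannot dominate the gradient. The patent does not publish its exact loss, so the following NumPy sketch is an illustrative assumption (the function name and `top_k` value are hypothetical):

```python
import numpy as np

def hard_keypoint_loss(pred, target, top_k=7):
    """Per-keypoint MSE over predicted heatmaps, averaged over only the
    top_k hardest (highest-loss) skeleton points.

    pred, target: arrays of shape (P, H, W), one heatmap per skeleton point.
    top_k: how many hard points to keep; 7 is an illustrative choice,
           not a value taken from the patent.
    """
    # Mean squared error per skeleton point, shape (P,)
    per_point = ((pred - target) ** 2).reshape(pred.shape[0], -1).mean(axis=1)
    # Keep only the top_k largest per-point losses and average them
    hardest = np.sort(per_point)[-top_k:]
    return hardest.mean()
```

With `top_k` equal to the total number of points P, this reduces to the plain mean loss; smaller values concentrate learning on the points the network currently localises worst.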

Method used




Embodiment Construction

[0028] The present invention is described in further detail below in conjunction with the accompanying drawings and a specific embodiment.

[0029] Referring to Figure 1, the present invention comprises the following steps:

[0030] (1) Obtain training set and test set:

[0031] Randomly select M image samples with ground-truth labels from a single-person pose estimation dataset to form the training set, and select N image samples with ground-truth labels to form the test set, where the number of human skeleton-point categories contained in each label is P and the number of skeleton points per category is 1; M=2000, N=10000, P=14;
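Step (1) above can be sketched as a disjoint random split of a labeled dataset. The helper name and the `(image, label)` sample representation below are illustrative assumptions, not from the patent:

```python
import random

def make_splits(samples, m=2000, n=10000, seed=0):
    """Randomly draw M training samples and N disjoint test samples
    from a labeled pose dataset.

    samples: list of labeled samples, e.g. (image, label) pairs.
    Returns (training_set, test_set); the two sets do not overlap.
    """
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = samples[:]              # copy so the input is untouched
    rng.shuffle(shuffled)
    return shuffled[:m], shuffled[m:m + n]
```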

[0032] (2) Grade the human skeleton points by detection difficulty:

[0033] (2a) The test set is used as input to a multi-stage feature-fusion single-person pose estimation model. In this embodiment, the Hourglass model, which offers high accuracy at this stage, is used to predict the position of each category of human skeleton point for each image...
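One plausible way to grade skeleton-point categories from such predictions is to compare the pre-trained model's predicted coordinates with the ground truth and label categories with high mean localisation error as "hard". The patent text is truncated before its exact criterion, so the threshold and function below are illustrative assumptions:

```python
import numpy as np

def grade_skeleton_points(pred_coords, true_coords, threshold=5.0):
    """Grade each of the P skeleton-point categories as 'easy' or 'hard'
    from a pre-trained model's mean localisation error.

    pred_coords, true_coords: arrays of shape (num_images, P, 2) holding
    (x, y) positions. threshold is in pixels and is a hypothetical value.
    """
    # Euclidean error per image and per point, shape (num_images, P)
    err = np.linalg.norm(pred_coords - true_coords, axis=2)
    mean_err = err.mean(axis=0)        # mean error per category, shape (P,)
    return ["easy" if e <= threshold else "hard" for e in mean_err]
```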



Abstract

The invention provides a single-person pose estimation method based on a multistage prediction feature-enhanced convolutional neural network. The method comprises the following steps: obtaining a training set and a test set; grading the human skeleton points; constructing a multi-stage prediction feature-enhanced convolutional neural network; training the network; and obtaining a single-person pose estimation result from the trained network. The method distinguishes the prediction difficulty of human-pose skeleton points and adopts a parameter-free feature enhancement module, so that the extracted skeleton-point features are finer. Meanwhile, the parameter count of the model is greatly reduced, and the accuracy and speed of single-person pose estimation are effectively improved.
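A "parameter-free feature enhancement module" generally means a feature-reweighting operation that adds no learnable weights. The patent does not disclose its exact module here, so the sketch below is only one common pattern under that constraint: reweighting feature maps with a spatial softmax over the predicted heatmaps, plus a residual connection (all names and shapes are illustrative assumptions):

```python
import numpy as np

def parameter_free_enhance(features, heatmaps):
    """Illustrative parameter-free enhancement: scale feature maps by a
    spatial attention map derived from predicted keypoint heatmaps.
    No learnable parameters are introduced.

    features: (C, H, W) feature maps; heatmaps: (P, H, W) predictions.
    """
    attn = heatmaps.max(axis=0)              # (H, W) keypoint saliency
    attn = np.exp(attn - attn.max())         # numerically stable softmax
    attn = attn / attn.sum()                 # normalise over all pixels
    # Residual reweighting: emphasise salient locations, keep originals
    return features * attn[None, :, :] + features
```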

Description

technical field [0001] The invention belongs to the technical field of computer vision and relates to a single-person pose estimation method, in particular to one based on a multi-level prediction feature-enhanced neural network, which can be used in various fields of human-behavior analysis. Background technique [0002] With the rapid development of modern information technology, using human-behavior analysis to quickly and accurately analyze human behavior in big data has very wide applications, such as security monitoring, motion analysis, and human-computer interaction. Accurate acquisition of human-pose skeleton points is the basis of human-behavior analysis. Compared with using a Kinect depth camera to obtain human-pose information, a human-pose estimation algorithm can directly obtain human skeleton points from images captured by RGB cameras, which is e...

Claims


Application Information

IPC(8): G06K9/00 G06N3/04 G06N3/08
CPC: G06N3/08 G06V40/20 G06N3/045 Y02T10/40
Inventors: 谢雪梅, 马丽华, 柴维路
Owner XIDIAN UNIV