
Human body posture estimation method and recognition device

A human body posture estimation method and recognition device, applied in the field of human-computer interaction, addressing the loss of effective information, the poor real-time performance of posture estimation algorithms, and the low degree of visualization of recognition results.

Pending Publication Date: 2021-12-17
HARBIN UNIV OF SCI & TECH


Problems solved by technology

[0005] This application provides a human body pose estimation method and recognition device, which can solve the problems of effective-information loss during resolution restoration, inaccurate recognition of occluded images, poor real-time performance of pose estimation algorithms, and low visualization of recognition results.


Examples


Embodiment 1

[0031] An embodiment of the present disclosure provides a human body pose estimation method.

[0032] Specifically, referring to Figure 1, the human pose estimation method comprises:

[0033] Step 1: Pass the preprocessed image into the high-resolution network and generate feature maps in each branch of the network through downsampling, upsampling, and convolution operations.

[0034] In this embodiment, a high-resolution network model is constructed and divided into a 1/2-resolution layer, a 1/4-resolution layer, a 1/8-resolution layer, and a 1/16-resolution layer. The 1/2-resolution layer is the backbone layer; its resolution is half that of the input image, which allows pose estimation of small-target figures in the image. The other three resolution layers are the branch layers of the network. The 1/2-resolution layer is downsampled to obtain the 1/4-resolution layer, and the 1/4-resolution layer is downsampled to obtain the 1/8-resolu...
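The multi-resolution branch structure described above can be sketched as follows. This is a minimal illustration, not the patent's actual network: learned convolutions are replaced by 2x2 average pooling for downsampling and nearest-neighbour repetition for upsampling, and all function names are illustrative.

```python
import numpy as np

def downsample(x):
    """Halve spatial resolution via 2x2 average pooling (stride 2)."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(x):
    """Double spatial resolution via nearest-neighbour repetition."""
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

def build_branches(feat):
    """Derive the 1/4-, 1/8- and 1/16-resolution branches
    from the 1/2-resolution backbone map."""
    branches = [feat]
    for _ in range(3):
        branches.append(downsample(branches[-1]))
    return branches

def fuse_to_backbone(branches):
    """Upsample every branch back to backbone resolution and sum,
    so information from all resolutions interacts."""
    out = branches[0].copy()
    for i, b in enumerate(branches[1:], start=1):
        up = b
        for _ in range(i):
            up = upsample(up)
        out += up
    return out

feat = np.arange(64.0).reshape(8, 8)  # stand-in for the 1/2-resolution map
branches = build_branches(feat)
fused = fuse_to_backbone(branches)
```

The cross-resolution fusion at the end is what lets coarse branches (robust to occlusion) inform the fine backbone branch (precise localization).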

Embodiment 2

[0044] An embodiment of the present disclosure provides a human body pose estimation and recognition device.

[0045] Specifically, referring to Figure 2, the human body posture estimation and recognition device includes: 1. camera; 11. compensation light source; 12. lifter; 13. lifting platform; 2. receiving antenna; 21. receiver; power indicator light; 33. elevation adjustment knob; 34. angle adjustment knob; 4. main control box; 41. display screen.

[0046] In this embodiment, the camera 1 and the compensation light source 11 are placed immediately above the lifter 12, and the receiver 21 faces the lifter 12. The receiver 21 and the lifter 12 are installed at the left end of the base bracket 3, the main control box 4 is located in the middle, directly above the base bracket 3, and the adjustment knobs are above the right end of the base bracket 3. The camera 1 is used to capture external portrait information and take pictures, and the adjustment knobs are used to adju...



Abstract

The invention provides a human body posture estimation method and a recognition device. The method comprises the following steps: a preprocessed image is transmitted into a high-resolution network, and a feature map is generated at each branch of the network through downsampling, upsampling, and convolution operations; an attention module is connected behind each generated feature map to capture key features; the feature map at the final stage is fused with the feature map at the previous level to generate a new feature map; the new feature map is restored, through a deconvolution operation, to the same resolution as the trunk to obtain a final heatmap; and the position of each joint point is predicted from the heatmap. Because the image is processed by the high-resolution network, loss of effective information can be avoided; adding the attention module allows feature information to be used efficiently; the interaction of information across multiple resolution branches addresses inaccurate pose estimation for occluded images and the difficulty of predicting small targets; and transmitting the output result to the device enables real-time detection and visualization.
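The last step of the pipeline, predicting each joint position from its heatmap, can be sketched as a per-joint argmax. This is a hypothetical minimal decoder for illustration; practical implementations usually add sub-pixel refinement on top of the peak location.

```python
import numpy as np

def decode_heatmaps(heatmaps):
    """Return the (x, y) peak location for each joint heatmap.

    heatmaps: array of shape (J, H, W), one heatmap per joint;
    the hottest pixel in each map is taken as that joint's position.
    """
    coords = []
    for hm in heatmaps:
        y, x = np.unravel_index(np.argmax(hm), hm.shape)
        coords.append((int(x), int(y)))
    return coords

# Two toy 5x5 heatmaps with known peaks.
hms = np.zeros((2, 5, 5))
hms[0, 1, 3] = 1.0  # joint 0 peaks at x=3, y=1
hms[1, 4, 0] = 1.0  # joint 1 peaks at x=0, y=4
joints = decode_heatmaps(hms)
```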

Description

Technical Field

[0001] The present application relates to the technical field of human-computer interaction, and in particular to a human body posture estimation method and a recognition device.

Background Technique

[0002] Human body pose estimation refers to labeling the joint positions of the human body in ordinary images or videos. According to the number of labeled subjects, human pose estimation can be divided into single-person and multi-person pose estimation. The single-person task can be regarded as joint-position detection for a given individual: the portrait is first located in the image, and the joint positions are then determined. Multi-person pose estimation is more complex than the single-person case; existing technology mainly uses two approaches, top-down and bottom-up. The top-down method first detects the portraits in the image and then performs single-pe...
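The top-down approach described above can be sketched as: detect person regions first, then run single-person estimation on each crop. The detector and single-person estimator below are stand-in callables for illustration, not components of the patent:

```python
def top_down_pose(image, detect_people, estimate_single):
    """Top-down multi-person pose estimation: detect each person box
    first, then estimate joints on each cropped region independently."""
    poses = []
    for (x0, y0, x1, y1) in detect_people(image):
        # Crop the detected person region (image is a list of rows).
        crop = [row[x0:x1] for row in image[y0:y1]]
        poses.append(estimate_single(crop))
    return poses

# Toy example: a 4x4 "image", a fake detector returning one box, and a
# fake single-person estimator that just reports the crop size.
image = [[0] * 4 for _ in range(4)]
fake_detector = lambda img: [(1, 1, 3, 3)]
fake_estimator = lambda crop: ("pose", len(crop), len(crop[0]))
result = top_down_pose(image, fake_detector, fake_estimator)
```

The structure makes the trade-off visible: runtime scales with the number of detected people, which is one source of the real-time performance problem the application cites.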

Claims


Application Information

IPC (8): G06K 9/00; G06K 9/62
CPC: G06F 18/253
Inventors: 柳长源; 秦川; 殷运福
Owner: HARBIN UNIV OF SCI & TECH