Method for hand posture estimation from single color image based on attention mechanism

A color-image pose estimation technology, applicable to image enhancement, image analysis, image data processing, and related fields

Active Publication Date: 2019-09-06
NAT UNIV OF DEFENSE TECH
Cites: 11 | Cited by: 5

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a method for hand pose estimation from a single color image based on an attention mechanism, so as to solve the technical problems of existing hand pose estimation methods.



Examples


Embodiment 1

[0042] As shown in the attached Figure 1, the task of the present invention is to take as input an image containing a human hand and, through an end-to-end neural network, obtain the 3D poses of 21 hand joint points in order to estimate the hand pose. In this embodiment, J denotes a hand joint point; the hand has 21 joint points, so J ∈ [1, 21]. W = {w_J = (x, y, z), J ∈ [1, 21]} represents the 3D coordinates of the hand joint points. The input RGB image is I ∈ R^(w×h×3). The segmented hand image is an image containing the hand that is slightly larger than the hand region and smaller than the input image. R = (R_x, R_y, R_z) represents the rotation angle of the camera coordinate system relative to the world coordinate system. (u, v) is the 2D position of each hand joint point. Gaussian noise is added to the 2D joints to obtain heat maps with Gaussian noise; each joint point corresponds to one heat map, so there are 21 heat maps P = p_J(u, v), i.e., ...
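As an illustration of the heat-map notation above, the sketch below renders one Gaussian heat map p_J(u, v) per 2D joint location for the 21 hand joints. This is only one plausible reading of the embodiment's "heat maps with Gaussian noise"; the blob width sigma, the 256×256 crop size, and the NumPy implementation are assumptions, not values taken from the patent.

```python
import numpy as np

NUM_JOINTS = 21  # the patent uses 21 hand joint points, J in [1, 21]

def gaussian_heatmaps(joints_2d, width, height, sigma=2.0):
    """Render one Gaussian heat map p_J(u, v) per 2D joint location (u, v).

    joints_2d: array of shape (21, 2) holding (u, v) pixel coordinates.
    Returns an array of shape (height, width, 21), one channel per joint.
    sigma is an illustrative choice; the patent does not state its value.
    """
    grid_u, grid_v = np.meshgrid(np.arange(width), np.arange(height))
    maps = np.zeros((height, width, NUM_JOINTS), dtype=np.float32)
    for j, (u, v) in enumerate(joints_2d):
        d2 = (grid_u - u) ** 2 + (grid_v - v) ** 2
        maps[:, :, j] = np.exp(-d2 / (2.0 * sigma ** 2))
    return maps

# Usage on a hypothetical 256x256 hand crop with random joint positions.
joints = np.random.uniform(0, 256, size=(NUM_JOINTS, 2))
P = gaussian_heatmaps(joints, width=256, height=256)
print(P.shape)  # (256, 256, 21)
```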

Embodiment 2

[0063] This embodiment verifies CFAM through experiments, running on a 1080 Ti GPU with TensorFlow and a training batch size of 8. The Adam training strategy is adopted, and training is stopped when the loss value fails to decrease over several consecutive checks. In this embodiment, the learning rate is set to (1e-5, 1e-6, 1e-7), changing after 30000 steps and after 60000 steps. Improvements and tests are made on joint heat map detection and gesture estimation. In the results table, some wrist prediction errors appear as 0 because values are rounded to two decimal places; the wrist predictions are more accurate and their errors are below 0.01.
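For concreteness, the training configuration described in paragraph [0063] (Adam, batch size 8, learning rates 1e-5/1e-6/1e-7 switching at 30000 and 60000 steps, stopping when the loss no longer decreases) could be expressed in TensorFlow roughly as follows. The patience value, the loss function, and the model/dataset placeholders are assumptions for illustration only, not details given in the patent.

```python
import tensorflow as tf

# Piecewise-constant learning rate: 1e-5 until step 30000, 1e-6 until
# step 60000, then 1e-7, as described in paragraph [0063].
lr_schedule = tf.keras.optimizers.schedules.PiecewiseConstantDecay(
    boundaries=[30000, 60000],
    values=[1e-5, 1e-6, 1e-7],
)
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)

# "Training is stopped when the loss no longer decreases" is approximated
# here with early stopping on the training loss; the patience of 5 is an
# assumption, not a value from the patent.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="loss", patience=5, restore_best_weights=True
)

# `build_cfam_model`, `train_images`, and `train_targets` are hypothetical
# placeholders; the patent does not spell out the network definition here.
# model = build_cfam_model()
# model.compile(optimizer=optimizer, loss="mse")
# model.fit(train_images, train_targets, batch_size=8,
#           epochs=100, callbacks=[early_stop])
```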

[0064] This embodiment is based on a single labeled RGB image. The commonly used depth image-based gesture estimation datasets MSRA and NYU are not suitable for this embodiment. Therefore, this embodiment selects two ...



Abstract

The invention provides a method for hand posture estimation from a single color image based on an attention mechanism, comprising the following steps: obtaining a single color image containing a hand through an image acquisition device, and cropping a hand region image from the single color image; extracting a 2D joint heat map from the hand region image; cascading the 2D joint heat map and the hand region image, and obtaining the 3D hand joint point positions and the rotation angle of the image acquisition device in the image acquisition device coordinate system according to an attention mechanism; and performing hand state estimation from the 3D hand joint point positions and the rotation angle. The invention fuses the 2D joint points and the features of the RGB image at the channel level: the features of the color image and the 2D joint points are cascaded, their weights are re-planned, and the features of each part are reasonably exploited. A channel attention mechanism is introduced to improve the fusion of different types of feature maps, so that hand postures are accurately estimated through the fused channel attention mechanism.
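To make the channel-level fusion concrete, the sketch below cascades image features with the 21 joint heat maps and re-weights the fused channels with a squeeze-and-excitation style channel attention block. This is a generic interpretation of the channel attention mechanism named in the abstract, not the patent's exact CFAM architecture; the feature-map resolution, channel counts, and reduction ratio are assumed.

```python
import tensorflow as tf
from tensorflow.keras import layers

def channel_attention_fusion(image_feats, heatmap_feats, reduction=8):
    """Cascade image features with 2D joint heat maps at the channel level
    and re-weight the fused channels with a squeeze-and-excitation style
    attention block. Layer sizes and reduction ratio are assumptions."""
    x = layers.Concatenate(axis=-1)([image_feats, heatmap_feats])
    channels = x.shape[-1]
    # Squeeze: summarize each channel with global average pooling.
    w = layers.GlobalAveragePooling2D()(x)
    # Excitation: two dense layers output one weight per channel.
    w = layers.Dense(channels // reduction, activation="relu")(w)
    w = layers.Dense(channels, activation="sigmoid")(w)
    w = layers.Reshape((1, 1, channels))(w)
    # Re-weight the cascaded features channel by channel.
    return layers.Multiply()([x, w])

# Hypothetical shapes: 64-channel image features and 21 joint heat maps
# on a 32x32 grid, one possible intermediate resolution.
image_in = tf.keras.Input(shape=(32, 32, 64))
heatmap_in = tf.keras.Input(shape=(32, 32, 21))
fused = channel_attention_fusion(image_in, heatmap_in)
model = tf.keras.Model([image_in, heatmap_in], fused)
print(model.output_shape)  # (None, 32, 32, 85)
```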

Description

Technical field
[0001] The invention belongs to the field of hand pose estimation methods, and in particular relates to a method for hand pose estimation from a single color image based on an attention mechanism.
Background technique
[0002] Gesture estimation plays an important role in computer science, allowing computers to understand human gestures algorithmically. Gesture estimation based on computer vision enables people to communicate with machines more naturally. Its advantage is that it is less affected by the environment and lets the computer understand human instructions in a timely and accurate manner without any mechanical assistance. Gestures are timely, vivid, intuitive, and flexible in human-computer interaction and can complete the interaction silently, bridging the gap between reality and virtuality.
[0003] With the development of computer vision, gesture estimation no longer relies on traditional wearable devices, but di...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06T7/73
CPC: G06T7/75, G06T2207/10028, G06V20/64, G06V40/113, G06V40/28, G06F18/253
Inventors: 蒋杰, 王翔汉, 郭延明, 高盈盈, 康来, 魏迎梅, 雷军
Owner: NAT UNIV OF DEFENSE TECH