Neural network training method and three-dimensional gesture posture estimation method

A neural network training and gesture-estimation technology, applied in the fields of computer vision and deep learning, which addresses problems such as large detection errors and the strong limitations of existing gesture-detection methods, achieving good training results and reducing the effects of illumination changes and object occlusion.

Status: Inactive; Publication Date: 2018-01-23
Applicant: SHENZHEN INST OF FUTURE MEDIA TECH (+1)

AI Technical Summary

Problems solved by technology

Sensor-based gesture and posture estimation technology fixes sensors such as accelerometers and angular velocity meters (gyroscopes) on specific parts of a person's palm and fingers. The wearable sensor device provides position and motion-state information for those parts, and kinematic methods then solve for the state of the palm and fingers, thereby achieving gesture posture estimation. Because it requires wearing sensor equipment, this method is highly limited in gesture detection, and it is further affected by factors such as the accuracy of the sensors.
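
As a minimal illustration of the kinematic solving step just described, the sketch below recovers a fingertip position from sensor-reported joint angles, assuming a simplified planar two-segment finger; the link lengths, angle names, and 2D model are illustrative assumptions, not the patent's method.

import numpy as np

# Forward kinematics for a simplified planar two-segment finger.
# Link lengths l1, l2 (meters) and the planar model are illustrative.
def fingertip_position(theta1, theta2, l1=0.045, l2=0.025):
    # theta1: base (MCP) joint angle, theta2: middle (PIP) joint angle, radians
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return np.array([x, y])

# Example: a half-curled finger pose
print(fingertip_position(np.deg2rad(30.0), np.deg2rad(45.0)))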




Embodiment Construction

[0037] The present invention will be further described below with reference to the accompanying drawings and in combination with preferred embodiments.

[0038] As shown in Figure 1, the three-dimensional gesture estimation method of the preferred embodiment of the present invention includes the following steps:

[0039] S1: Collect the data set of gesture depth maps, which specifically includes the following steps:

[0040] S11: Use multiple depth cameras to collect gesture depth pictures of different people; for each gesture of each person, collect multiple pictures covering many different angles and gesture variations, and organize the collected pictures into a picture library;

[0041] S12: Label each picture in the picture library. The human hand skeleton contains multiple joint points, and each joint point has a certain number of degrees of freedom; in order to accurately locate detailed information about the position and posture of the gesture joint points, this embodiment marks the ...
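
The labeling passage is truncated above. As a hedged sketch of what a per-image annotation record for step S12 might look like, the example below assumes a 21-joint hand layout (a common hand-pose convention; the patent's exact joint set is not visible here), and all field names are hypothetical.

from dataclasses import dataclass
import numpy as np

NUM_JOINTS = 21  # assumption: wrist plus 4 joints per finger

@dataclass
class GestureAnnotation:
    image_path: str          # depth image file in the picture library
    subject_id: int          # which person performed the gesture
    camera_id: int           # which depth camera captured the image
    joints_xyz: np.ndarray   # (NUM_JOINTS, 3) labeled 3D joint positions, mm

    def validate(self) -> None:
        assert self.joints_xyz.shape == (NUM_JOINTS, 3)

ann = GestureAnnotation("library/person01/cam0/0001.png",
                        subject_id=1, camera_id=0,
                        joints_xyz=np.zeros((NUM_JOINTS, 3)))
ann.validate()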



Abstract

The present invention discloses a neural network training method and a three-dimensional gesture posture estimation method. The neural network training method comprises the steps of (S1) collecting a data set of multiple gesture depth images with a depth camera; (S2) using the data set from step S1 to train a random forest learner; (S3) using the random forest learner to segment the gesture depth images in the data set to obtain gesture sub-images, processing the sub-images to obtain processed images, and randomly dividing the processed images together with the original depth images into a training set and a testing set; and (S4) training a convolutional neural network with the training and testing sets from step S3 to obtain a network model. The three-dimensional gesture posture estimation method uses this network model to estimate the three-dimensional gesture posture from a single depth image, so the concrete positions and states of the palm and fingers in a gesture can be accurately identified.
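
As a hedged sketch of the S2-S4 pipeline summarized above: a per-pixel random forest separates hand pixels from background in a depth map, and a small convolutional network regresses 3D joint coordinates from the segmented crop. The pixel features, network architecture, input size, and hyperparameters below are illustrative assumptions, not the patent's specification.

import numpy as np
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

NUM_JOINTS = 21  # assumed joint count, matching the labeling sketch above

# S2: per-pixel random forest segmenting hand vs. background.
# Features here are just (depth value, normalized x, normalized y);
# the patent's actual features are not specified on this page.
def pixel_features(depth):               # depth: (H, W) array
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.stack([depth.ravel(), xs.ravel() / w, ys.ravel() / h], axis=1)

forest = RandomForestClassifier(n_estimators=50, max_depth=12)
# forest.fit(pixel_features(train_depth), hand_mask.ravel())  # per-pixel labels

# S4: a small CNN regressing NUM_JOINTS 3D positions from a 96x96 depth crop.
class PoseCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),   # 96 -> 48
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),  # 48 -> 24
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 24 -> 12
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 512), nn.ReLU(),
            nn.Linear(512, NUM_JOINTS * 3),
        )

    def forward(self, x):                 # x: (N, 1, 96, 96) segmented crops
        return self.net(x).view(-1, NUM_JOINTS, 3)

model = PoseCNN()
loss_fn = nn.MSELoss()                    # regression loss on joint coordinates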

Description

technical field [0001] The invention relates to the fields of computer vision and deep learning, and in particular to a neural network training method and a three-dimensional gesture estimation method. Background technique [0002] In recent years, with the rapid development of computer vision and deep learning, virtual reality and augmented reality technologies have gradually become popular and still hold immeasurable development prospects. As an important means of human-computer interaction, gesture recognition technology has drawn great attention in the computer vision field. Because the human hand has many joints, complex shapes, high degrees of freedom, and is easily occluded, quickly and accurately recognizing gesture positions and hand movements has always been a difficult problem. [0003] Traditional gesture estimation methods can generally be divided into two types: sensor-based and image-based. Sensor-based gesture and posture estimation technology refers to fixing se...


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
Inventors: Wang Haoqian (王好谦), Li Da (李达), Fang Lu (方璐), Wang Xingzheng (王兴政), Zhang Yongbing (张永兵), Dai Qionghai (戴琼海)
Owner: SHENZHEN INST OF FUTURE MEDIA TECH