Gait recognition method based on depth neural network

A gait recognition method based on a deep neural network, applied in the field of computer vision and pattern recognition. It addresses the problems of insufficient training data and recognition performance that is strongly affected by covariates, and achieves high robustness, faster convergence, and reduced computational resources and training time.

Inactive Publication Date: 2017-10-24
XIAN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0005] In order to solve the problem that the recognition performance of existing gait recognition technology is strongly affected by many covariates, the present invention proposes a gait recognition method based on a deep neural network. The method shows good robustness under complicated background conditions, and the recognition rate is significantly improved in complex scenes and under the influence of multiple covariates. Fine-tuning a pre-trained network effectively solves the problem of insufficient data and saves computation and running time.




Embodiment Construction

[0018] In order to make the objects and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the examples. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0019] As shown in Figures 1-2, an embodiment of the present invention provides a gait recognition method based on a deep neural network, comprising the following steps:

[0020] S1: As shown in Figure 3, the moving background is subtracted from the video sequence to obtain silhouette (contour) maps of human motion. The silhouettes are morphologically processed to reduce noise and fill holes in the image, the gait cycle is then extracted, and the silhouettes are further normalized to a uniform size. The gait Gaussian map is computed over one gait cycle to establish a sample data set of the gait Gaussian ...
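A minimal sketch of the preprocessing in step S1, assuming OpenCV is used for background subtraction and morphology (the patent does not name a library or exact parameters; the MOG2 subtractor, kernel size, and silhouette size below are illustrative choices). Gait-cycle extraction, typically estimated from the periodic variation of silhouette width, is omitted here.

```python
# Illustrative preprocessing for step S1: background subtraction,
# morphological cleanup, and size normalization of silhouettes.
import cv2
import numpy as np

def extract_silhouettes(video_path, size=(88, 128)):
    """Return a list of binary, size-normalized silhouette images."""
    cap = cv2.VideoCapture(video_path)
    bg_sub = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    silhouettes = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Subtract the moving background to get a rough foreground mask.
        mask = bg_sub.apply(frame)
        _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
        # Morphological opening/closing to reduce noise and fill holes.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        # Crop the largest contour (the walking person) and normalize its size.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        silhouettes.append(cv2.resize(mask[y:y + h, x:x + w], size))
    cap.release()
    return silhouettes
```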



Abstract

The invention discloses a gait recognition method based on a deep neural network. The method comprises: preprocessing the original gait video images and extracting gait Gaussian images from them; dividing the sample data set into a training set and a testing set in a 5:1 ratio according to a designed rule; building an eight-layer convolutional neural network model consistent with the AlexNet structure and modifying the number of neurons in the last layer so that the model suits the gait classification task; initializing the parameters of the first seven layers with those of a pre-trained AlexNet model and randomly initializing the last layer; and training the convolutional neural network on the training set so that it can effectively perform gait recognition. The method is highly robust, can recognize a person's identity more effectively under various covariate conditions, and can significantly reduce the computational resources and time needed to train the model.
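As a hedged illustration of the fine-tuning strategy described in the abstract, the sketch below assumes PyTorch/torchvision (the patent does not name a framework): an AlexNet pretrained on ImageNet keeps the parameters of its earlier layers, while the final fully connected layer is replaced, randomly initialized, and sized to the number of gait classes. The class count and learning rates are illustrative.

```python
# Sketch of fine-tuning a pretrained AlexNet for gait classification.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 124  # illustrative: number of subjects in the gallery
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Keep the pretrained weights of the first seven layers and re-initialize
# only the final fully connected layer for the gait classification task.
model.classifier[6] = nn.Linear(4096, num_classes)
nn.init.normal_(model.classifier[6].weight, std=0.01)
nn.init.zeros_(model.classifier[6].bias)

# A common fine-tuning choice: a small learning rate for the pretrained
# layers and a larger one for the newly added classification layer.
optimizer = torch.optim.SGD(
    [
        {"params": model.features.parameters(), "lr": 1e-4},
        {"params": model.classifier[:6].parameters(), "lr": 1e-4},
        {"params": model.classifier[6].parameters(), "lr": 1e-3},
    ],
    momentum=0.9,
)
criterion = nn.CrossEntropyLoss()
```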

Description

Technical field

[0001] The invention relates to computer vision and pattern recognition, and in particular to a gait recognition method based on a deep neural network.

Background technique

[0002] In existing gait recognition technology, the gait energy image, used as a gait feature descriptor, gives a reasonable recognition effect and is simple to compute. The gait energy image has been improved by adding a Gaussian coefficient, yielding the gait Gaussian image, whose recognition effect is significantly better than that of the gait energy image. Using the gait Gaussian image as the gait feature and selecting a suitable classifier can effectively identify a person. However, this kind of recognition rarely achieves the desired effect when the viewing-angle span is large (more than 36 degrees) or when clothing and carrying conditions change significantly.

[0003] A convolutional neural network is a multi-layer perceptron specially designed to recogn...
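For background, a minimal sketch of how a gait energy image and a Gaussian-weighted variant might be computed from the binary silhouettes of one gait cycle. The temporal Gaussian weighting below is an illustrative assumption, not necessarily the gait Gaussian image as defined in the cited prior work or in this patent.

```python
# Illustrative gait energy image (GEI) and a Gaussian-weighted variant.
import numpy as np

def gait_energy_image(silhouettes):
    """Pixel-wise mean of aligned binary silhouettes over one gait cycle."""
    stack = np.stack([s.astype(np.float32) / 255.0 for s in silhouettes])
    return stack.mean(axis=0)

def gait_gaussian_image(silhouettes, sigma=None):
    """Weighted mean in which frames near the middle of the cycle are
    weighted more heavily (an assumed weighting, for illustration only)."""
    n = len(silhouettes)
    sigma = sigma or n / 4.0
    t = np.arange(n)
    weights = np.exp(-((t - (n - 1) / 2.0) ** 2) / (2.0 * sigma ** 2))
    weights /= weights.sum()
    stack = np.stack([s.astype(np.float32) / 255.0 for s in silhouettes])
    return np.tensordot(weights, stack, axes=1)
```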

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/08, G06T7/254
CPC: G06N3/084, G06T7/254, G06V40/103, G06F18/214
Inventor: 李占利, 胡阿敏
Owner: XIAN UNIV OF SCI & TECH