Cross-view gait recognition method combining LSTM and CNN

A gait recognition technology combining LSTM and CNN, applied in the field of cross-view gait recognition, which can solve problems such as application-scene restrictions.

Pending Publication Date: 2021-01-15
XI'AN UNIVERSITY OF ARCHITECTURE AND TECHNOLOGY
AI Technical Summary

Problems solved by technology

However, human joint point tracking is a difficulty in traditional 3D gait models, and reconstructing 3D gait models with 3D equipment is restricted by the application scene.




Embodiment Construction

[0042] The present invention is described in further detail below in conjunction with the accompanying drawings:

[0043] Referring to figure 1, the cross-view gait recognition method combining LSTM and CNN of the present invention comprises the following steps:

[0044] 1) From the CASIA-B gait video data set, use OpenPose3D to extract the 3D pose data of pedestrians in the video;

[0045] From the CASIA-B gait video dataset released by the Institute of Automation, Chinese Academy of Sciences, OpenPose3D is used to extract the 3D pose data of the 124 pedestrians in the videos, where each sample records the three-dimensional coordinates of the i-th joint point in the m-th frame of a pedestrian's walk. The dataset covers 10 walking postures and 11 viewing angles under 3 states (backpack, wearing a coat, normal walking), for a total of 124×10×11 = 13640 videos; refer to figure 2;
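As an illustration only, the following Python sketch shows one way the per-frame 3D joints exported by OpenPose could be gathered into a single array per video. The JSON field name ("pose_keypoints_3d"), the BODY_25 joint layout and the directory layout are assumptions made for the sketch, not details taken from the patent.

```python
# Hypothetical sketch: collect the per-frame 3D joints exported by OpenPose into one
# (M, N, 3) array per video (M frames, N joints, xyz coordinates).
# Assumes one JSON file per frame containing a "pose_keypoints_3d" list of
# (x, y, z, confidence) values per joint; the exact field name and joint count
# depend on the OpenPose export settings.
import json
from pathlib import Path

import numpy as np

N_JOINTS = 25  # BODY_25 keypoint layout (assumed)


def load_sequence(json_dir: str) -> np.ndarray:
    """Return an (M, N_JOINTS, 3) array of 3D joint coordinates for one video."""
    frames = []
    for path in sorted(Path(json_dir).glob("*.json")):
        with open(path) as f:
            people = json.load(f).get("people", [])
        if not people:
            continue  # skip frames where no pedestrian was detected
        kp = np.asarray(people[0]["pose_keypoints_3d"], dtype=np.float32)
        frames.append(kp.reshape(N_JOINTS, 4)[:, :3])  # drop the confidence column
    return np.stack(frames)


# Hypothetical directory layout: one folder of frame JSONs per CASIA-B sequence.
# seq = load_sequence("casia_b_pose/001/nm-01/090")
```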

[0046] 2) Extract the motion constraint data and joint action data of the video pedestrians' joint points from the 3D pose data obtained in step 1), construct a pedestrian gait constraint matrix, and establish a sample set; ...
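The excerpt does not spell out which quantities make up the motion constraint data and the joint action data, so the Python sketch below is only a plausible stand-in: bone lengths serve as constraint features and frame-to-frame joint displacements as action features, stacked into a per-frame gait feature matrix.

```python
# Hypothetical sketch of step 2: derive per-frame constraint and motion features from
# the (M, N, 3) joint array of step 1 and stack them into a gait feature matrix.
import numpy as np

# Placeholder bone list as (parent, child) joint indices; the real joint pairs depend
# on the keypoint layout used in step 1.
BONES = [(1, 8), (9, 10), (10, 11), (12, 13), (13, 14)]


def gait_matrix(joints: np.ndarray) -> np.ndarray:
    """joints: (M, N, 3) -> per-frame feature matrix of shape (M-1, n_bones + N)."""
    # "Motion constraint" features: bone lengths, nearly constant while walking.
    bone_len = np.stack(
        [np.linalg.norm(joints[:, c] - joints[:, p], axis=-1) for p, c in BONES],
        axis=-1,
    )  # (M, n_bones)
    # "Joint action" features: per-joint displacement between consecutive frames.
    velocity = np.linalg.norm(np.diff(joints, axis=0), axis=-1)  # (M-1, N)
    return np.concatenate([bone_len[1:], velocity], axis=-1)
```

A sample set would then pair each such matrix with its subject label and viewing angle.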



Abstract

The invention discloses a cross-view gait recognition method combining LSTM and CNN, which comprises the following steps: 1) extracting 3D (three-dimensional) pose data of pedestrians in a video with OpenPose3D from the CASIA-B gait video data set; 2) extracting motion constraint data and joint action data of the video pedestrians' joint points from the 3D pose data obtained in step 1), constructing a pedestrian gait constraint matrix, and establishing a sample set; 3) constructing a 3D gait recognition network LC-POSEGAIT; 4) training the 3D gait recognition network LC-POSEGAIT with the training-set samples obtained in step 2); and 5) using the trained 3D gait recognition network LC-POSEGAIT to extract gait feature vectors of the video pedestrians, completing cross-view gait recognition combining LSTM and CNN. The method can realize cross-view gait recognition, avoids the preprocessing work of pedestrian detection and tracking, and can alleviate the influence of factors such as shooting angle, carried articles and illumination on gait recognition.
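The abstract names the LC-POSEGAIT network, but this excerpt does not disclose its topology. The PyTorch sketch below only illustrates the general idea of fusing an LSTM branch (temporal) with a CNN branch (spatial) over the gait feature matrix; the layer sizes and fusion scheme are arbitrary choices, not the patented architecture.

```python
# Illustrative LSTM + CNN gait feature extractor; not the actual LC-POSEGAIT design.
import torch
import torch.nn as nn


class LstmCnnGaitNet(nn.Module):
    def __init__(self, n_features: int, n_subjects: int, feat_dim: int = 128):
        super().__init__()
        # Temporal branch: LSTM over the per-frame gait feature matrix.
        self.lstm = nn.LSTM(n_features, 128, num_layers=2, batch_first=True)
        # Spatial branch: 1D CNN over the same sequence (channels = features).
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.fuse = nn.Linear(128 + 128, feat_dim)          # gait feature vector
        self.classifier = nn.Linear(feat_dim, n_subjects)   # training head only

    def forward(self, x):                     # x: (batch, frames, n_features)
        _, (h, _) = self.lstm(x)              # h[-1]: (batch, 128)
        c = self.cnn(x.transpose(1, 2)).squeeze(-1)          # (batch, 128)
        feat = self.fuse(torch.cat([h[-1], c], dim=1))
        return feat, self.classifier(feat)
```

Training would use a classification loss over subject identities; at test time the extracted gait feature vectors of probe and gallery sequences would be compared (for example by cosine distance) to perform recognition across views.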

Description

Technical Field

[0001] The invention belongs to the field of machine vision and relates to a cross-view gait recognition method combining LSTM and CNN.

Background Technique

[0002] Biometric recognition is widely applied in security monitoring, pedestrian tracking, identity verification and other fields. At present, a large number of surveillance cameras are installed in public places, but the collected monitoring data is mostly used for evidence collection after an event; identity verification of specific target pedestrians under non-perceptive, long-distance and uncontrolled conditions, which could support security warnings and alarms, is seldom carried out. Gait is the expression of a person's walking posture while walking. It is non-contact and non-invasive, difficult to hide or forge, and is the only biometric feature that can be captured at long range in an uncontrolled state, so it can support long-distance identi...


Application Information

IPC(8): G06K9/00, G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/25, G06V10/44, G06N3/044, G06N3/045, G06F18/253
Inventor: 戚艳军, 孔月萍, 雷振轩, 李静, 朱旭东, 王佳婧
Owner: XI'AN UNIVERSITY OF ARCHITECTURE AND TECHNOLOGY