
Pedestrian re-recognition method based on multi-layer neural network for cross-scene pedestrian recognition

A multi-layer neural network and pedestrian re-identification technology, applied in character and pattern recognition, instruments, computer components, and similar fields, which addresses the problem of low recognition accuracy in cross-scene pedestrian re-identification models.

Inactive Publication Date: 2019-01-01
CHANGZHOU UNIV

AI Technical Summary

Problems solved by technology

[0004] The main purpose of the present invention is to exploit the nonlinear, high-precision processing capability of neural networks to provide an easy-to-operate and highly reliable cross-scene pedestrian re-identification method based on a multi-layer neural network, focusing on remedying the defect that the cross-scene pedestrian re-identification models established by existing methods have low recognition accuracy.



Embodiment Construction

[0064] In order to make the object, technical solution, and advantages of the present invention clearer, the present invention is further described in detail below in combination with specific embodiments and with reference to the accompanying drawings.

[0065] The overall implementation flow of the present invention is shown in figure 1; the specific implementation is as follows:

[0066] Step 1. Collect pedestrian video from the camera in the current scene and intercept video frames. This embodiment uses images from the i-LIDS dataset as the video frame images for the current scene;
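The frame interception in Step 1 can be sketched as periodic sampling of the camera stream. The sketch below is illustrative only: the patent does not specify a sampling scheme, so the `step` value and the representation of frames as numpy arrays are assumptions.

```python
import numpy as np

def intercept_frames(video_frames, step=25):
    """Sample every `step`-th frame from a sequence of frames.

    `video_frames` is any sequence of H x W x 3 arrays; with a 25 fps
    camera, step=25 keeps roughly one frame per second. The step value
    is an assumption -- the patent does not specify a sampling rate.
    """
    return [frame for i, frame in enumerate(video_frames) if i % step == 0]

# Simulate a 100-frame video of 128x48 RGB images.
video = [np.zeros((128, 48, 3), dtype=np.uint8) for _ in range(100)]
sampled = intercept_frames(video, step=25)
print(len(sampled))  # keeps frames 0, 25, 50, 75
```

In a real deployment the frame source would be the camera feed itself (e.g. a video-capture API) rather than an in-memory list; only the sampling logic carries over.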

[0067] Step 2. Perform feature extraction and dimensionality reduction on the pedestrian images in the i-LIDS video frames. The i-LIDS dataset contains 476 images of 119 pedestrians in total. For normalization, each image is resized to 128×48 pixels. In this embodiment, area blocks with a pixel size of 16×16 are taken, and e...
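The block-based feature extraction and dimensionality reduction of Step 2 might look as follows. This is a hypothetical reading, since the embodiment text is truncated above: per-block gray-level histograms stand in for whatever block feature the patent actually uses, and PCA via SVD stands in for its unspecified reduction method.

```python
import numpy as np

def block_histogram_features(img, block=16, bins=16):
    """Split a 128x48 grayscale image into 16x16 blocks and concatenate
    a normalized gray-level histogram per block.
    128/16 x 48/16 = 8 x 3 = 24 blocks x 16 bins = 384-dim feature."""
    h, w = img.shape
    feats = []
    for r in range(0, h, block):
        for c in range(0, w, block):
            patch = img[r:r + block, c:c + block]
            hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
            feats.append(hist / patch.size)  # normalize per block
    return np.concatenate(feats)

def pca_reduce(X, k):
    """Reduce the row vectors of X to k dimensions via PCA
    (SVD of the mean-centered data matrix)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
imgs = rng.integers(0, 256, size=(10, 128, 48))  # 10 normalized images
X = np.stack([block_histogram_features(im) for im in imgs])
X_red = pca_reduce(X, k=5)
print(X.shape, X_red.shape)  # 384-dim features reduced to 5 dims
```

The block size (16×16) and image size (128×48) are the patent's own; the histogram bin count and the target dimensionality `k` are illustrative choices.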



Abstract

The invention discloses a method for re-identifying pedestrians across scenes based on a multi-layer neural network, comprising the following steps: (1) collecting pedestrian video from a camera in the current scene and intercepting video frames; (2) performing feature extraction and dimensionality reduction on the pedestrian images in the video frames intercepted in step (1), constructing the target-domain training set Xt from sample pairs, and obtaining a test set Xo; (3) processing the already-identified data from related scenes and constructing the source-domain training set Xs from sample pairs; (4) establishing the training set X, X = [Xs, Xt]; (5) using X to train the multi-layer neural network model; (6) identifying the samples to be identified in the test set Xo according to the model obtained in step (5). The invention selects a multi-layer neural network with complex non-linear mapping as the learning model and, using the idea of transfer learning, adds the identification data of related scenes to the model learning for the new scene, so that learning in the new scene is more accurate and effective.
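The six steps above can be sketched end-to-end with synthetic features. In the sketch below, sample pairs from the source domain (Xs, related scenes) and the target domain (Xt, the new scene) are merged into X = [Xs, Xt] and used to train a small two-layer network that classifies a pair as same/different pedestrian. The pair representation (absolute feature difference), the network size, and the training details are all assumptions for illustration, not the patent's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pairs(n, dim=20):
    """Synthetic pair features |f1 - f2|: same-pedestrian pairs get
    small per-dimension differences, different-pedestrian pairs large."""
    y = rng.integers(0, 2, n)  # 1 = same pedestrian
    diffs = np.where(y[:, None] == 1,
                     rng.normal(0.0, 0.2, (n, dim)),
                     rng.normal(1.5, 0.5, (n, dim)))
    return np.abs(diffs), y.astype(float)

# Steps (2)-(4): target-domain pairs Xt, source-domain pairs Xs, X = [Xs, Xt]
Xs, ys = make_pairs(200)   # already-identified data from related scenes
Xt, yt = make_pairs(50)    # pairs from the current (new) scene
X = np.vstack([Xs, Xt]); y = np.concatenate([ys, yt])

# Step (5): train a tiny two-layer network (layer sizes are illustrative)
W1 = rng.normal(0, 0.1, (20, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1));  b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    return h, 1 / (1 + np.exp(-(h @ W2 + b2))).ravel()  # same-pair prob

lr = 0.5
for _ in range(300):
    h, p = forward(X)
    g = (p - y)[:, None] / len(y)       # dLoss/dlogit for cross-entropy
    gh = (g @ W2.T) * (1 - h ** 2)      # backprop through tanh
    W2 -= lr * h.T @ g;  b2 -= lr * g.sum(0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)

# Step (6): evaluate on held-out target-scene pairs (the "test set Xo")
Xo, yo = make_pairs(100)
_, p = forward(Xo)
acc = ((p > 0.5) == yo).mean()
print(f"test accuracy: {acc:.2f}")
```

The transfer-learning idea is visible in the data layout only: the network never distinguishes Xs from Xt, so identified data from related scenes directly augments the scarce new-scene pairs.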

Description

technical field

[0001] The present invention relates to the field of computer vision and pattern recognition, and in particular to a method for cross-scene pedestrian re-identification based on a multi-layer neural network.

Background technique

[0002] With the popularization of cameras in public places, applications based on pedestrian pictures and video data have gradually received more and more attention. One important application is pedestrian re-identification: the technology of finding the same pedestrian across multi-time-segment pedestrian video data collected by non-overlapping cameras. Today, as public safety is increasingly valued, pedestrian re-identification has attracted growing attention. With the application of pedestrian re-identification in various new fields, a very important technical issue under actual conditions is how to deploy a pedestrian re-identification system in a new scen...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V40/103; G06V10/50; G06V10/467; G06V10/56; G06F18/2135; G06F18/214
Inventor: 顾晓清; 倪彤光; 王洪元
Owner: CHANGZHOU UNIV