
Deep Hash pedestrian re-identification method

A technology combining pedestrian re-identification and deep hashing, applied in the field of pedestrian re-identification. It addresses the problem that a Hamming distance loss does not converge easily, and achieves the effects of easier loss calculation and model convergence, improved accuracy, and reduced matching time.

Active Publication Date: 2019-07-23
CHONGQING UNIV
Cites: 8 · Cited by: 7


Problems solved by technology

[0015] The idea of the present invention is to learn a simple and effective feature representation that enables efficient computation and storage. To this end, the present invention combines the pedestrian re-identification method with a hashing method to construct an end-to-end, easy-to-store, and discriminative hash feature network (end-to-end network: the input is the raw data and the output is the final result, with feature extraction, feature matching, and hash learning integrated in one network). To address the problem that a Hamming distance loss does not converge easily, the present invention supervises hash-code learning with a triplet loss based on probability distance. The present invention also uses encoding-decoding reconstruction (encoding-decoding: the original 2048-dimensional deep feature is encoded to obtain a 128-bit hash code, the hash code is decoded back into a 2048-dimensional code, and the decoded code is supervised to be similar to the original) to screen the discriminative parts of the global feature and then form the hash code, so that the obtained hash code both represents the global feature and maintains the discriminativeness of the hash code.
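The encode-decode reconstruction idea can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation: the random weights, helper names, and batch size are assumptions; only the 2048 → 128 → 2048 dimensions and the tanh relaxation come from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights; dimensions follow the text
# (2048-d deep feature -> 128-bit relaxed code -> 2048-d reconstruction).
W_enc = rng.standard_normal((2048, 128)) * 0.01
W_dec = rng.standard_normal((128, 2048)) * 0.01

def encode(feat):
    """Relaxed hash code in (-1, 1) via tanh, as in the hash learning module."""
    return np.tanh(feat @ W_enc)

def decode(code):
    """Reconstruct a 2048-d code from the 128-bit hash code."""
    return code @ W_dec

feat = rng.standard_normal((4, 2048))      # a batch of 4 deep features
code = encode(feat)
recon = decode(code)
# Supervising recon to stay close to feat keeps the code faithful
# to the global feature, per the reconstruction idea above.
recon_loss = np.mean((recon - feat) ** 2)
```

In a real network the encoder/decoder weights would be trained jointly with the other losses; here they only fix the shapes of the computation.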



Examples


Embodiment

[0104] 1. Dataset

[0105] The Market1501 dataset, collected by Zheng et al. in a campus scene and released in 2015, is used. The dataset contains 1501 pedestrian IDs captured by 6 cameras, with 32217 images in total.

[0106] 2. Experimental settings

[0107] The dataset has 1501 pedestrian IDs. For training and testing, the pictures of 751 IDs are selected as the training set, and the pictures of the remaining 750 IDs are used as the test set. In the experiments, λ_th = 1, λ_qt = 0.001, λ_cons = 0.01, β = 1, and the learning rate is 3×10⁻⁴, decaying exponentially after epoch 150.
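The weighting and schedule above can be sketched in a few lines. Which λ pairs with which loss term, the role of β, and the decay factor `gamma` are assumptions; the weight values and the base learning rate come from the text.

```python
# Loss weights and base learning rate from the experimental settings.
lam_th, lam_qt, lam_cons, beta = 1.0, 0.001, 0.01, 1.0  # beta's exact role is not stated

def total_loss(l_triplet, l_quant, l_recon):
    """Weighted sum of (assumed) triplet, quantization, and reconstruction terms."""
    return lam_th * l_triplet + lam_qt * l_quant + lam_cons * l_recon

def lr_at(epoch, base_lr=3e-4, decay_start=150, gamma=0.95):
    """3e-4 held constant, then exponential decay after epoch 150.
    The text only says the decay is exponential; gamma=0.95 is assumed."""
    if epoch < decay_start:
        return base_lr
    return base_lr * gamma ** (epoch - decay_start)
```

With all three raw losses equal to 1, the weighted sum is 1.011, which shows how strongly the triplet term dominates under these weights.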

[0108] 3. Training and testing methods

[0109] Training phase: pictures are sent to the network in batches for training, with the batch size set to 128; under loss supervision, gradients are back-propagated and parameters updated with SGD. After 300 epochs of iteration, the final network model is obtained.
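The batching in the training loop can be sketched as below. The model, losses, and SGD update are omitted; only the batch size of 128 comes from the text, and the 1000-image set is a hypothetical stand-in.

```python
import numpy as np

def iterate_minibatches(n_samples, batch_size=128, seed=0):
    """Yield one epoch of shuffled index batches of size batch_size
    (the last batch may be smaller)."""
    idx = np.random.default_rng(seed).permutation(n_samples)
    for start in range(0, n_samples, batch_size):
        yield idx[start:start + batch_size]

# One epoch over a hypothetical 1000-image training set:
# 7 full batches of 128 plus one remainder batch of 104.
batches = list(iterate_minibatches(1000))
```

In the actual method each batch would be forwarded through the network, the supervised losses computed, and the SGD step applied, repeated for 300 epochs.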



Abstract

The invention discloses a deep hash pedestrian re-identification method comprising the following steps. Step 1: construct a deep neural network comprising a feature learning module and a hash learning module, wherein the feature learning module adopts a ResNet network and the hash learning module comprises a fully connected layer and a tanh function layer. Step 2: train the deep neural network: 1) prepare the pedestrian pictures; 2) send the training pictures into the deep neural network for training, including feature learning, hash learning, and loss-function learning; 3) carry out network optimization and parameter updating. Step 3: test the deep neural network: obtain a relaxed hash code through the feature learning module and the hash learning module, convert the relaxed hash code into strict -1 and 1 codes through a sign function, and calculate the Euclidean distance between the hash codes of the pedestrian pictures in the query and gallery sets to carry out feature matching. The method improves pedestrian re-identification accuracy and shortens pedestrian re-identification time.
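The test-phase matching described in step 3 can be sketched as follows. This is a NumPy illustration with tiny 3-bit codes for readability; the function names are illustrative, and only the sign-function binarization and Euclidean-distance ranking come from the abstract.

```python
import numpy as np

def binarize(relaxed_code):
    """Sign function: turn relaxed tanh outputs into strict -1/+1 codes."""
    return np.where(relaxed_code >= 0, 1.0, -1.0)

def rank_gallery(query_codes, gallery_codes):
    """Rank gallery images by Euclidean distance between hash codes.
    For +/-1 codes, squared Euclidean distance equals 4x the Hamming
    distance, so this ordering matches a Hamming-distance ranking."""
    d = np.linalg.norm(query_codes[:, None, :] - gallery_codes[None, :, :], axis=-1)
    return np.argsort(d, axis=1)

q = binarize(np.array([[0.9, -0.2, 0.4]]))    # one query code
g = binarize(np.array([[0.8, -0.1, 0.3],      # same signs as q -> distance 0
                       [-0.7, 0.5, -0.2]]))   # opposite signs -> maximal distance
order = rank_gallery(q, g)                    # gallery item 0 ranks first
```

The equivalence between Euclidean and Hamming ranking on ±1 codes is why the method can use the easier-to-optimize Euclidean form without changing the retrieval order.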

Description

Technical Field

[0001] The invention belongs to the technical field of pedestrian re-identification.

Background

[0002] Pedestrian re-identification is used in pedestrian tracking and criminal-investigation search. In a multi-camera surveillance system, a basic task is to associate pedestrians across cameras at different times and locations; this is pedestrian re-identification technology. Specifically, re-identification is the process of visually matching one or more pedestrians across different scenes, based on a series of data obtained by cameras distributed in those scenes at different times. The main purpose of pedestrian re-identification is to determine whether a pedestrian seen in one camera has appeared in other cameras; that is, the features of one pedestrian must be compared with those of other pedestrians to determine whether they belong to the same person. [0003] The main challenges of pedestrian re-identification are: the influence of ped...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/103, G06V10/751, G06N3/045, G06N3/044, G06F18/214, Y02T10/40
Inventors: Zhang Lei, Liu Fangyi
Owner: CHONGQING UNIV