Pedestrian re-identification method based on spatial reverse attention network

A pedestrian re-identification and spatial attention technology, applied to neural learning methods, character and pattern recognition, biological neural network models, etc., to achieve the effect of improving the effectiveness and reliability of re-identification

Active Publication Date: 2021-05-25
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

In recent years, the task of pedestrian re-identification has made great progress. However, in an open outdoor environment, pedestrian images will have large differences due to the presence of interference such as pose, occlusion, clothing, background clutter, and camera perspective.


Image

Figures 1-3: pedestrian re-identification method based on spatial reverse attention network.

Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0039] The task of person re-identification aims to find the same pedestrian under different cameras. Although the development of deep learning has brought great improvements to person re-identification, it is still a challenging task. In recent years, attention mechanisms have been widely verified to have excellent effects on person re-identification tasks, but the combined effects of different types of attention mechanisms (such as spatial attention, self-attention, etc.) still remain to be discovered.

[0040] Referring to Figures 1-3, an embodiment of the present invention provides a pedestrian re-identification method based on a spatial reverse attention network, comprising:

[0041] S1: Collect the captured pictures and divide them into a training set and a test set;

[0042] S2: Construct a spatial reverse attention network model based on ResNet-50, train the convolutional neural network on the training set, and add the CBAM-Pro module (an illustrative sketch follows these steps);

[0043] It should be noted that th...
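As a minimal illustration of step S2, assuming a CBAM-style spatial attention map M applied to ResNet-50 features, with a forward branch weighted by M and a reverse branch weighted by 1 - M whose pooled features are concatenated along the channel dimension as described in the abstract, one possible PyTorch sketch is given below. The class names, the 7x7 attention convolution, and the classifier head are illustrative assumptions, not the disclosed CBAM-Pro design.

    # Illustrative sketch only: SpatialReverseAttention, TwoBranchReIDModel and the
    # 7x7 attention convolution are assumptions; the patent's CBAM-Pro details are
    # not disclosed in this excerpt.
    import torch
    import torch.nn as nn
    from torchvision.models import resnet50


    class SpatialReverseAttention(nn.Module):
        """Computes a spatial attention map M and returns both the forward-attended
        features (x * M) and the reverse-attended features (x * (1 - M))."""

        def __init__(self, kernel_size: int = 7):
            super().__init__()
            # 2 input channels: channel-wise average and channel-wise max of the feature map.
            self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

        def forward(self, x: torch.Tensor):
            avg_map = x.mean(dim=1, keepdim=True)              # (B, 1, H, W)
            max_map, _ = x.max(dim=1, keepdim=True)            # (B, 1, H, W)
            attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
            return x * attn, x * (1.0 - attn)                  # forward / reverse branch inputs


    class TwoBranchReIDModel(nn.Module):
        """ResNet-50 backbone followed by the two attention branches; the pooled
        branch features are concatenated along the channel dimension."""

        def __init__(self, num_identities: int = 751):
            super().__init__()
            backbone = resnet50(weights=None)
            self.backbone = nn.Sequential(*list(backbone.children())[:-2])  # drop avgpool + fc
            self.attention = SpatialReverseAttention()
            self.pool = nn.AdaptiveAvgPool2d(1)
            self.classifier = nn.Linear(2048 * 2, num_identities)           # 2048 channels per branch

        def forward(self, images: torch.Tensor):
            feats = self.backbone(images)                      # (B, 2048, H, W)
            forward_feats, reverse_feats = self.attention(feats)
            f = self.pool(forward_feats).flatten(1)            # (B, 2048)
            r = self.pool(reverse_feats).flatten(1)            # (B, 2048)
            embedding = torch.cat([f, r], dim=1)               # pedestrian identification feature
            return self.classifier(embedding), embedding

In such a sketch the classifier would be trained with an identity classification loss on the training set, while the concatenated embedding is what would be compared between images at test time.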

Embodiment 2

[0083] In order to verify and explain the technical effect of this method, this embodiment conducts a comparative test between a traditional technical scheme and the method of the present invention, and compares the test results to verify the real effect of this method.

[0084] In this embodiment, experiments are carried out on the three most commonly used datasets for pedestrian re-identification: Market-1501, DukeMTMC-reID, and CUHK03, using rank-1 accuracy (the probability that the first returned match is correct) and mean average precision (mAP) to evaluate the experimental results.
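For reference, the two metrics can be computed from query and gallery features as in the minimal NumPy sketch below. It assumes L2-normalised feature arrays and integer identity labels, and it omits the same-camera junk filtering of the standard Market-1501 evaluation protocol, so it is an approximation rather than the exact benchmark code.

    # Minimal sketch: rank-1 and mAP from L2-normalised features.
    # Assumes query_feats (Q, D), gallery_feats (G, D) and integer identity labels
    # as NumPy arrays; same-camera filtering is omitted for brevity.
    import numpy as np


    def evaluate(query_feats, query_ids, gallery_feats, gallery_ids):
        dist = 1.0 - query_feats @ gallery_feats.T             # cosine distance matrix (Q, G)
        rank1_hits, average_precisions = [], []
        for i in range(len(query_ids)):
            order = np.argsort(dist[i])                        # closest gallery images first
            matches = (gallery_ids[order] == query_ids[i]).astype(np.float32)
            if matches.sum() == 0:                             # query identity absent from gallery
                continue
            rank1_hits.append(matches[0])
            precision_at_k = np.cumsum(matches) / (np.arange(len(matches)) + 1.0)
            average_precisions.append((precision_at_k * matches).sum() / matches.sum())
        return float(np.mean(rank1_hits)), float(np.mean(average_precisions))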

[0085] Among them, Market-1501 includes 1,501 pedestrians with different identities captured by 6 cameras. The dataset contains 32,668 pictures of individual pedestrians generated by the DPM detector, divided into non-overlapping training/testing sets. The training set contains 12,936 images of 751 pedestrians with different identities...
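The split described above is made at the identity level rather than the image level, so no pedestrian appears in both sets. A small illustrative Python sketch of such a split is given below; the function name and the 50% ratio are arbitrary assumptions, not the fixed benchmark protocol split.

    # Illustrative identity-level split: pedestrians (not images) are partitioned,
    # so no identity appears in both the training and the test set.
    import random
    from collections import defaultdict


    def split_by_identity(samples, train_ratio=0.5, seed=0):
        """samples: list of (image_path, person_id) pairs; returns (train, test) lists."""
        by_id = defaultdict(list)
        for path, pid in samples:
            by_id[pid].append((path, pid))
        ids = sorted(by_id)
        random.Random(seed).shuffle(ids)
        cut = int(len(ids) * train_ratio)
        train = [s for pid in ids[:cut] for s in by_id[pid]]
        test = [s for pid in ids[cut:] for s in by_id[pid]]
        return train, test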



Abstract

The invention discloses a pedestrian re-identification method based on a spatial reverse attention network. The method comprises the steps of: collecting captured pictures and dividing them into a training set and a test set; constructing a spatial reverse attention network model based on ResNet-50, training the convolutional neural network on the training set, and adding CBAM-Pro; dividing the network into two branches according to the added CBAM-Pro, executing forward learning and reverse attention at the same time, and extracting forward and reverse global features and local features; and connecting the extracted features along the channel dimension to obtain pedestrian identification features containing multiple types of features, and performing re-identification verification on the pedestrian identification features by using the test set to complete pedestrian re-identification. By extracting multi-type pedestrian identification features with the spatial reverse attention network, the method improves the effectiveness and reliability of re-identification.

Description

Technical field

[0001] The invention relates to the technical field of intelligent security, and in particular to a pedestrian re-identification method based on a spatial reverse attention network.

Background technique

[0002] Pedestrian re-identification is in great demand in the field of intelligent security. It aims to associate the same pedestrian appearing in different places at different times. The general approach is: given a picture of a pedestrian to be retrieved, extract the features of the query picture and of the pictures in the gallery through a trained model, then sort the gallery pictures according to feature similarity so as to retrieve the pedestrian's images. In recent years, the task of pedestrian re-identification has made great progress. However, in an open outdoor environment, pedestrian images will have large differences due to the presence of interference such as pose, occlusion, clothing, background clutter, and camera perspective. Th...
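To make the retrieval procedure described in the background concrete, the sketch below ranks gallery pictures by cosine similarity to the query picture, assuming an embedding function that maps an image batch to feature vectors; the function names are illustrative assumptions, not part of the disclosure.

    # Illustrative retrieval step: rank gallery images by cosine similarity to the query.
    # `embed` is assumed to map an image batch (B, 3, H, W) to embeddings (B, D).
    import torch
    import torch.nn.functional as F


    @torch.no_grad()
    def rank_gallery(embed, query_image, gallery_images):
        q = embed(query_image.unsqueeze(0))                    # (1, D) query embedding
        g = embed(gallery_images)                              # (N, D) gallery embeddings
        sims = F.cosine_similarity(q, g)                       # similarity to each gallery image
        return torch.argsort(sims, descending=True)            # indices, most similar first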


Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06V40/103; G06V10/44; G06V10/462; G06N3/047; G06N3/044; G06F18/241; G06F18/214
Inventor: 宋晓宁, 王鹏, 冯振华
Owner: JIANGNAN UNIV