
Pedestrian re-identification method based on multi-region feature extraction and fusion

A pedestrian re-identification and feature-extraction technology, applied in the field of computer-vision pedestrian re-identification, which addresses the low overall matching accuracy of existing pedestrian re-identification methods and achieves a reasonable design and improved matching accuracy.

Active Publication Date: 2018-12-07
ACADEMY OF BROADCASTING SCI STATE ADMINISTATION OF PRESS PUBLICATION RADIO FILM & TELEVISION

AI Technical Summary

Problems solved by technology

[0004] However, most existing convolutional neural networks extract features from the whole picture and do not make full use of its local features. These local features are more robust to changes in pedestrian pose under different viewing angles and are more effective for distinguishing different pedestrians; therefore, the overall matching accuracy of existing person re-identification methods is not high.




Embodiment Construction

[0039] Embodiments of the present invention will be described in further detail below in conjunction with the accompanying drawings.

[0040] A pedestrian re-identification method based on multi-region feature extraction and fusion, comprising the following steps:

[0041] Step 1. Use a residual network to extract global features, and add a pedestrian identity classification module in the training phase for extracting and optimizing the global features.
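The text does not give the backbone's exact layer configuration, so the following is only a minimal sketch of the global-feature step: a residual backbone produces a convolutional feature map, and global average pooling collapses it into a single global descriptor (the backbone itself is omitted; array shapes are illustrative assumptions).

```python
import numpy as np

# Sketch: a residual backbone yields a (C, H, W) conv feature map; global
# average pooling reduces it to a C-dimensional global feature vector.
def global_avg_pool(feature_map: np.ndarray) -> np.ndarray:
    """(C, H, W) convolutional features -> (C,) global descriptor."""
    return feature_map.mean(axis=(1, 2))

fmap = np.arange(24, dtype=float).reshape(2, 3, 4)  # toy map: C=2, H=3, W=4
vec = global_avg_pool(fmap)                         # shape (2,)
```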

[0042] A traditional convolutional neural network uses a fully connected layer to map the convolutional features to a feature vector whose dimension equals the number of pedestrian categories. Since every node in the fully connected layer is connected to every node in the previous layer, the number of parameters is large. As shown in Figure 1, the classification module (Classification Structure) in the present invention uses a 1×1 convolution to implement the feature mapping, and all neurons share weight parameter...
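As a hedged illustration (layer shapes here are assumptions, not taken from the patent), a 1×1 convolution applies one shared weight matrix at every spatial position, so on a pooled 1×1 feature map it coincides with a fully connected mapping while keeping the parameter count independent of spatial size:

```python
import numpy as np

# Illustrative 1x1 convolution: one shared (C_out, C_in) weight matrix is
# applied at every spatial position of a (C_in, H, W) feature map.
def conv1x1(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """x: (C_in, H, W) features, w: (C_out, C_in) -> (C_out, H, W)."""
    return np.einsum('oc,chw->ohw', w, x)

x = np.array([[[1.0]], [[2.0]], [[3.0]]])   # (C_in=3, H=1, W=1), e.g. pooled
w = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0]])             # (C_out=2, C_in=3)
scores = conv1x1(x, w).reshape(-1)          # same result as the FC mapping w @ x
```

On a 1×1 spatial input the result equals `w @ x.flatten()`, which is why a convolutional classification head can stand in for the parameter-heavy fully connected classifier while sharing its weights across positions.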



Abstract

The invention relates to a pedestrian re-identification method based on multi-region feature extraction and fusion, comprising the following steps: using a residual network to extract global features, and adding a pedestrian identity classification module for global feature extraction and optimization in the training stage; constructing a multi-region feature-extraction sub-network for local feature extraction, and carrying out weighted fusion of the local features; setting a loss function that includes the loss of the classification module and the loss of a feature fusion module; training the network to obtain a model, and extracting the feature vectors of a query set and a test set; and, in the measuring stage, re-measuring the feature distance with a cross-nearest-neighbor method. The method is reasonable in design: global and local features are effectively combined, the distance-measurement method is optimized, a good pedestrian re-identification result is obtained, and the overall matching accuracy of the system is greatly improved.
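The final step above, re-measuring feature distance with a cross-nearest-neighbor method, can be sketched as a mutual k-nearest-neighbor test: two samples count as cross nearest neighbors only if each appears in the other's k-NN list. The penalty scheme and helper names below are illustrative assumptions, not the patent's exact formula:

```python
import numpy as np

def k_reciprocal(dist: np.ndarray, k: int) -> np.ndarray:
    """dist: (N, N) pairwise distances. Returns a boolean (N, N) matrix that
    is True where i and j appear in each other's k nearest neighbors."""
    ranks = np.argsort(dist, axis=1)          # ranks[i]: indices sorted by distance
    knn = np.zeros_like(dist, dtype=bool)
    for i in range(dist.shape[0]):
        knn[i, ranks[i, :k + 1]] = True       # +1 accounts for self at rank 0
    return knn & knn.T                        # mutual (cross) nearest neighbors

def rerank(dist: np.ndarray, k: int, penalty: float = 2.0) -> np.ndarray:
    """Illustrative re-measurement: pairs that are NOT mutual neighbors get
    their distance inflated; mutual neighbors keep the original distance."""
    mutual = k_reciprocal(dist, k)
    return np.where(mutual, dist, dist * penalty)

dist = np.array([[0.0, 1.0, 10.0],
                 [1.0, 0.0, 9.0],
                 [10.0, 9.0, 0.0]])           # toy pairwise distances, 3 samples
refined = rerank(dist, k=1)
```

In the toy matrix, samples 0 and 1 are each other's nearest neighbor, so their distance is preserved, while the one-sided pair (1, 2) is penalized; this mirrors the intuition that a true match should rank highly in both directions.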

Description

technical field [0001] The invention belongs to the technical field of computer-vision pedestrian re-identification, and in particular relates to a pedestrian re-identification method based on multi-region feature extraction and fusion. Background technique [0002] With the continuous improvement of video-capture technology and the rapid development of large-scale data storage, it has become possible to deploy large numbers of surveillance camera systems in public places, and the identification and processing of pedestrians in massive surveillance videos is developing rapidly. Relying only on the human eye to identify pedestrians in surveillance images is obviously very inefficient. The task of pedestrian re-identification technology is to use computer vision to solve the problem of matching pedestrian identities across non-overlapping surveillance fields of view. [0003] There are two main categories of traditional solutions: one is feature extraction, which i...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04
CPC: G06V40/10, G06N3/045, G06F18/24, G06F18/214
Inventor: 胡潇, 周芸, 王琳, 姜竹青, 门爱东
Owner ACADEMY OF BROADCASTING SCI STATE ADMINISTATION OF PRESS PUBLICATION RADIO FILM & TELEVISION