Pedestrian re-recognition method and system based on multi-channel consistency features

A pedestrian re-identification technology based on consistency features, applied in the field of deep learning, which can solve problems such as the inability of existing methods to re-identify pedestrians with high precision

Active Publication Date: 2018-05-29
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

[0003] The semantic attribute structure information of pedestrians and the color-texture distribution information of pedestrian appearance are the basic information contained in an image. For the task of pedestrian re-identification, because of the large number of scenes and the huge scale of the pedestrian population, there are often scenes in which different pedestrians have similar color-texture distributions ...

Examples


Embodiment 1

[0075] A pedestrian re-identification method based on multi-channel consistency features, comprising the following steps:

[0076] Step 1: Input N image pairs to be matched, comprising training data and test data, together with their corresponding labels ln, where n = 1, ..., N.
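A minimal sketch of how such labeled image pairs might be held in memory is given below; the dataset class, the tensor shapes (3 x 128 x 64), and the random placeholder data are illustrative assumptions, not taken from the patent.

```python
import torch
from torch.utils.data import Dataset

class ImagePairDataset(Dataset):
    """Hypothetical container for N image pairs with labels l_n.

    l_n = 1 if both images show the same pedestrian, 0 otherwise.
    The image shape (3, 128, 64) is an assumption for illustration.
    """

    def __init__(self, pairs, labels):
        # pairs: list of (tensor, tensor); labels: list of 0/1
        assert len(pairs) == len(labels)
        self.pairs = pairs
        self.labels = labels

    def __len__(self):
        return len(self.pairs)

    def __getitem__(self, n):
        img_a, img_b = self.pairs[n]
        return img_a, img_b, torch.tensor(self.labels[n], dtype=torch.float32)

# Random tensors stand in for real training/test pairs.
N = 8
pairs = [(torch.rand(3, 128, 64), torch.rand(3, 128, 64)) for _ in range(N)]
labels = [n % 2 for n in range(N)]
dataset = ImagePairDataset(pairs, labels)
```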

[0077] Step 2: Extract the semantic feature representation and the color-texture spatial distribution feature representation of the image data input in Step 1, which specifically includes the following steps:

[0078] 1) Extract the semantic feature representation of the image data:

[0079] F = f_CNN(I; W)

[0080] where F is the semantic feature representation of the input image pair I, f_CNN denotes the convolution operation, and W is the parameter to be learned;

[0081] 2) Extract the spatial distribution features of the image data in each channel of RGB, HSV (color information) and SILTP (texture information), and perform feature extraction through a convolutional neural network composed of three convolutional ...
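A hedged sketch of these two feature streams is given below. It is a minimal illustration only: the layer widths, the HSV conversion routine, and the simplified local-contrast map standing in for SILTP are all assumptions, not the patent's specification.

```python
import numpy as np
import torch
import torch.nn as nn
from matplotlib.colors import rgb_to_hsv

class SemanticFeatureExtractor(nn.Module):
    """Hypothetical f_CNN of sub-step 1): a convolution stack with learnable
    parameters that maps an input image to a semantic feature map."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True), nn.MaxPool2d(2),
            nn.Conv2d(128, 256, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.features(x)

def build_multichannel_input(rgb):
    """Stack RGB, HSV and a texture channel for sub-step 2).

    rgb: float array of shape (H, W, 3) in [0, 1]. The texture channel is a
    crude local-contrast map that stands in for SILTP here (an assumption).
    """
    hsv = rgb_to_hsv(rgb)                                    # (H, W, 3)
    gray = rgb.mean(axis=2)
    texture = np.abs(gray - np.roll(gray, 1, axis=0))        # local contrast proxy
    stacked = np.concatenate([rgb, hsv, texture[..., None]], axis=2)
    return torch.from_numpy(stacked).permute(2, 0, 1).float()  # (7, H, W)

class ThreeConvExtractor(nn.Module):
    """Three convolutional layers, as the embodiment describes; widths assumed."""

    def __init__(self, in_channels=7):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)

# Toy usage on a random image standing in for one image of a pair.
rgb = np.random.rand(128, 64, 3)
img = torch.from_numpy(rgb).permute(2, 0, 1).float().unsqueeze(0)
semantic = SemanticFeatureExtractor()(img)                       # (1, 256, 32, 16)
distribution = ThreeConvExtractor()(build_multichannel_input(rgb).unsqueeze(0))
```

Both extractors would be applied to each image of a pair, and their outputs feed the multi-scale matching described in the embodiments that follow.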

Embodiment 2

[0105] A pedestrian re-identification system based on multi-channel consistency features, including the following modules:

[0106] The image data input module is used to input N image pairs to be matched, comprising training data and test data, together with their corresponding labels ln, where n = 1, ..., N;

[0107] The feature representation extraction module is used to extract the semantic feature representation and the color-texture spatial distribution feature representation of the image data input by the image data input module;

[0108] The consistency feature representation module is used to obtain a consistency feature representation from the semantic feature representation and the color-texture spatial distribution feature representation through multi-scale feature matching;

[0109] The probability representation output module is used to construct a binary classifier on the consistency feature representation obtained by the consistency feature representation module, and to output a probabilistic representation for the same target.
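A minimal sketch of how these four modules might be wired together end to end is given below; the cosine-similarity matching, the pooling scales (1, 2, 4), and the classifier widths are assumptions chosen for illustration rather than the patent's specification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConsistencyMatcher(nn.Module):
    """Multi-scale feature matching: compare two feature maps at several scales
    and concatenate the per-scale similarity maps. The scales are an assumption."""

    def __init__(self, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales

    def forward(self, feat_a, feat_b):
        matched = []
        for s in self.scales:
            a = F.adaptive_avg_pool2d(feat_a, s)
            b = F.adaptive_avg_pool2d(feat_b, s)
            # cosine similarity per spatial cell, flattened per sample
            matched.append(F.cosine_similarity(a, b, dim=1).flatten(1))
        return torch.cat(matched, dim=1)          # consistency feature vector

class ReIDSystem(nn.Module):
    """Hypothetical pipeline: feature extractor -> matcher -> binary classifier."""

    def __init__(self, extractor, num_match_dims=1 + 4 + 16):
        super().__init__()
        self.extractor = extractor
        self.matcher = ConsistencyMatcher()
        self.classifier = nn.Sequential(
            nn.Linear(num_match_dims, 32), nn.ReLU(inplace=True),
            nn.Linear(32, 1),
        )

    def forward(self, img_a, img_b):
        feat_a, feat_b = self.extractor(img_a), self.extractor(img_b)
        consistency = self.matcher(feat_a, feat_b)
        return torch.sigmoid(self.classifier(consistency))   # P(same pedestrian)

# Toy extractor standing in for the feature representation extraction module.
extractor = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(inplace=True))
system = ReIDSystem(extractor)
prob = system(torch.rand(2, 3, 128, 64), torch.rand(2, 3, 128, 64))   # shape (2, 1)
```

In this sketch the matcher compares the two feature maps at several spatial resolutions, and the classifier turns the concatenated similarities into a probability that the pair shows the same pedestrian.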



Abstract

The invention belongs to the technical field of image processing and relates to a pedestrian re-recognition method based on multi-channel consistency features. The method includes the following steps: N to-be-matched image pairs of training data and test data, together with their corresponding labels ln, where n = 1, ..., N, are input; semantic feature representations and color-texture spatial distribution feature representations of the input image data are extracted; consistency feature representations of the semantic feature representations and the color-texture spatial distribution feature representations are obtained through multi-scale feature matching; a binary classifier is built on the obtained consistency feature representations, and a probabilistic representation for the same target is output. By combining pedestrian image semantic attributes with color distribution features for recognition, the method achieves high precision and stable performance and is suitable for pedestrian re-recognition in complex scenes.
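As a rough, self-contained illustration of the final step described above (building a binary classifier and outputting a probabilistic representation for the same target), a training iteration over labeled pairs might look like the following; the toy stand-in model, the loss choice, and the optimizer settings are assumptions, not the patent's configuration.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in model: any network mapping an image pair to a probability in (0, 1).
# The real system would be the multi-channel consistency pipeline; this toy
# classifier over flattened image differences is purely illustrative.
class ToyPairClassifier(nn.Module):
    def __init__(self, in_dim=3 * 128 * 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, img_a, img_b):
        diff = (img_a - img_b).flatten(1)
        return torch.sigmoid(self.net(diff))

# Random tensors stand in for real training pairs and their labels l_n.
img_a = torch.rand(16, 3, 128, 64)
img_b = torch.rand(16, 3, 128, 64)
labels = torch.randint(0, 2, (16,)).float()
loader = DataLoader(TensorDataset(img_a, img_b, labels), batch_size=4, shuffle=True)

model = ToyPairClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.BCELoss()   # binary: same pedestrian vs. different pedestrian

for a, b, l in loader:
    prob = model(a, b).squeeze(1)        # probabilistic output for "same target"
    loss = criterion(prob, l)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```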

Description

technical field

[0001] The invention belongs to the technical field of image processing and relates to a pedestrian re-identification method based on multi-channel consistency features, in particular to a deep learning method for pedestrian re-identification that combines image semantic consistency features and color-texture distribution consistency features.

Background technique

[0002] The task of pedestrian re-identification is to handle cross-camera pedestrian matching. In pedestrian monitoring networks this technology is applied to pedestrian tracking, human body retrieval and the like, and it has extremely broad application scenarios in the field of public security. Pedestrian semantic attribute information and pedestrian color-texture distribution information are complementary to a certain extent; they are two aspects of describing pedestrians. Combining the two features for pedestrian re-identification can make up for the defect of...


Application Information

IPC (8): G06K9/00, G06K9/46, G06K9/62, G06N3/04
CPC: G06V40/10, G06V10/56, G06N3/045, G06F18/22, G06F18/241
Inventor: 毛超杰, 李英明, 张仲非
Owner: ZHEJIANG UNIV