Pedestrian re-identification method based on multi-scale feature cutting and fusion

A pedestrian re-identification technology based on multi-scale features, applied in the field of pedestrian re-identification via multi-scale feature cutting and fusion; it addresses problems such as degraded re-identification performance, noisy image features, and loss of salient information.

Publication Date: 2019-05-21 (Inactive)
SOUTH CHINA UNIV OF TECH +2

Problems solved by technology

[0003] Most current methods ignore the influence of irrelevant background information when extracting image features, so the extracted features contain considerable noise. Secondly, to cope with differing shooting angles and changes in pedestrian posture, most methods adopt an a priori part-matching strategy; only after the parts are aligned are the features …



Examples


Embodiment

[0051] As shown in Figure 1, the implementation of a pedestrian re-identification method based on multi-scale deep feature cutting and fusion comprises three phases: a re-identification network training phase, a retrieval-set and candidate-set descriptor extraction phase, and a similarity matrix calculation phase.
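As a rough illustration of the last of these phases, the sketch below computes a similarity matrix between retrieval-set (query) descriptors and candidate-set (gallery) descriptors. PyTorch is assumed as the framework, and cosine similarity over L2-normalized descriptors is an assumption, since the excerpt does not specify the metric.

```python
# Minimal sketch of the similarity-matrix calculation phase (assumptions:
# PyTorch tensors, L2-normalized descriptors, cosine similarity).
import torch
import torch.nn.functional as F

def similarity_matrix(query_desc: torch.Tensor, gallery_desc: torch.Tensor) -> torch.Tensor:
    """query_desc: (Nq, D) retrieval-set descriptors; gallery_desc: (Ng, D) candidate-set descriptors."""
    q = F.normalize(query_desc, dim=1)   # unit-length query descriptors
    g = F.normalize(gallery_desc, dim=1) # unit-length gallery descriptors
    return q @ g.t()                     # (Nq, Ng); sorting each row yields the re-identification ranking
```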

[0052] (1) Re-identification network training phase:

[0053] Training data preprocessing and data augmentation: each training image is normalized per RGB channel with mean [0.485, 0.456, 0.406] and standard deviation [0.229, 0.224, 0.225], and is randomly flipped horizontally;
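A minimal sketch of this preprocessing step, assuming a PyTorch/torchvision pipeline (the patent does not name a framework); the 256×128 input size is taken from paragraph [0055]:

```python
# Training-data preprocessing and augmentation as described in [0053]
# (torchvision is an assumption; the patent only specifies the statistics).
from torchvision import transforms

train_transform = transforms.Compose([
    transforms.Resize((256, 128)),             # pedestrian crops resized to 256x128 (from [0055])
    transforms.RandomHorizontalFlip(p=0.5),    # random horizontal flip for augmentation
    transforms.ToTensor(),                     # HWC uint8 -> CHW float in [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # per-channel mean from the patent
                         std=[0.229, 0.224, 0.225]),  # per-channel std from the patent
])
```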

[0054] As shown in Figure 2, the global descriptor is extracted: information is taken from feature maps at different scales of the deep network and then fused to obtain the global descriptor:

[0055] The global branch adopts the ResNet50 structure; the input is a 256×128×3 image, and the f...
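Since the paragraph is cut off here, the following is only a hedged sketch of such a multi-scale global branch: feature maps from the four ResNet50 stages are average-pooled and fused, with concatenation assumed as the fusion operation, following the abstract's description of "average pooling and feature fusion on feature maps of different layers".

```python
# Hedged sketch of the multi-scale global descriptor branch (concatenation
# of per-stage pooled features is an assumption; the exact fusion rule is
# truncated in the excerpt).
import torch
import torch.nn as nn
from torchvision.models import resnet50

class MultiScaleGlobalBranch(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = resnet50(weights=None)  # ImageNet-pretrained weights would typically be loaded here
        # Stem and the four residual stages of ResNet50
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1,
                                  backbone.relu, backbone.maxpool)
        self.stages = nn.ModuleList([backbone.layer1, backbone.layer2,
                                     backbone.layer3, backbone.layer4])
        self.pool = nn.AdaptiveAvgPool2d(1)  # global average pooling per stage

    def forward(self, x):                    # x: (B, 3, 256, 128)
        x = self.stem(x)
        feats = []
        for stage in self.stages:
            x = stage(x)
            feats.append(self.pool(x).flatten(1))  # (B, C_stage) pooled stage features
        return torch.cat(feats, dim=1)             # fused global descriptor

# Example: four 256x128 RGB crops -> descriptor of size 256+512+1024+2048 = 3840
desc = MultiScaleGlobalBranch()(torch.randn(4, 3, 256, 128))
```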



Abstract

The invention provides a pedestrian re-identification method based on multi-scale feature cutting and fusion. Specifically, it provides re-identification network training based on multi-scale deep feature cutting and fusion, and a pedestrian re-identification method based on that network, which performs re-identification through multi-scale global descriptor extraction and local descriptor extraction. The global descriptor is extracted by applying average pooling and feature fusion to feature maps from different layers of the deep network; the local descriptors are extracted by horizontally dividing the deepest feature map of the deep network into a plurality of blocks and extracting the local descriptor corresponding to each block. During training, a smoothed cross-entropy cost function and a hard-sample-mining triplet cost function are minimized as the objectives for training the network parameters. By adopting the technical scheme of the invention, the feature-mismatch problem caused by factors such as pedestrian posture changes and camera color cast can be solved, and the influence of the background can be eliminated, thereby improving the robustness and precision of pedestrian re-identification.
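As a rough illustration of the local branch described above, the sketch below horizontally splits the deepest (ResNet50 layer4) feature map into strips and pools each strip into its own local descriptor. The number of strips (6) and the 1×1 dimension-reduction embedding are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch of local descriptor extraction by horizontal feature-map cutting.
import torch
import torch.nn as nn

class HorizontalStripDescriptors(nn.Module):
    def __init__(self, in_channels=2048, out_channels=256, num_strips=6):
        super().__init__()
        self.num_strips = num_strips
        self.pool = nn.AdaptiveAvgPool2d((num_strips, 1))  # one pooled cell per horizontal strip
        # Per-strip embedding layers (assumed; reduce pooled features to out_channels dims)
        self.embed = nn.ModuleList(
            nn.Sequential(nn.Conv2d(in_channels, out_channels, kernel_size=1),
                          nn.BatchNorm2d(out_channels), nn.ReLU())
            for _ in range(num_strips))

    def forward(self, fmap):                 # fmap: (B, 2048, H, W) from the deepest layer
        strips = self.pool(fmap)             # (B, 2048, num_strips, 1)
        descriptors = []
        for i in range(self.num_strips):
            part = strips[:, :, i:i + 1, :]  # i-th horizontal block
            descriptors.append(self.embed[i](part).flatten(1))  # (B, out_channels)
        return descriptors                   # list of per-part local descriptors

# Example: layer4 output of a 256x128 input is roughly (B, 2048, 16, 8)
parts = HorizontalStripDescriptors()(torch.randn(4, 2048, 16, 8))
```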

Description

Technical field

[0001] The invention relates to the technical fields of computer vision and image processing, and in particular to a pedestrian re-identification method based on multi-scale feature cutting and fusion.

Background technique

[0002] Pedestrian re-identification is the task of matching whether two objects seen from different, non-overlapping camera views are the same target. It has received extensive attention and application, especially in security and criminal investigation. However, pedestrian re-identification still faces major challenges: because it is easily affected by factors such as illumination, viewing angle, and background in practice, the intra-class difference (between images of the same pedestrian) can be even greater than the inter-class difference (between different pedestrians), causing the re-identification task to fail. Existing pedestrian re-identification research is mainly div...


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04
Inventors: 张昱晟, 黄昌正, 周智恒, 许冰媛, 陈曦, 肖芸榕
Owner: SOUTH CHINA UNIV OF TECH