
Cross-modal pedestrian re-identification method based on self-imitation mutual distillation

A cross-modal pedestrian re-identification technology, applied in the field of image processing, which addresses problems such as increased model complexity and the inability of existing one-stage methods to effectively remove barriers to performance improvement.

Active Publication Date: 2021-06-08
XIAMEN UNIV

AI Technical Summary

Problems solved by technology

In addition to increasing model complexity, this type of method ignores the impact of redundant intra-modal information on the accuracy of cross-modal pedestrian retrieval, and performs only direct one-stage feature registration, which cannot effectively alleviate the barrier that inter-modal differences pose to performance improvement.



Examples


Embodiment Construction

[0041] The following embodiments will further illustrate the present invention in conjunction with the accompanying drawings.

[0042] Embodiments of the present invention include the following steps:

[0043] (1) The cross-modal data set contains a visible-light image set and an infrared image set, where p denotes the pedestrian identity label (ID), and N_p and M_p denote the total numbers of visible-light and infrared image samples for identity p, respectively. The data set is sampled so that each batch contains eight pedestrians with different IDs for each modality, and for each ID four visible-light images and four infrared images are selected as the network input of the current batch;
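The following is a minimal sketch of how this identity-balanced cross-modal sampling could be implemented in Python; the function name, the visible_by_id / infrared_by_id index dictionaries, and the use of plain random sampling are illustrative assumptions rather than the patent's implementation.

```python
import random

def sample_cross_modal_batch(visible_by_id, infrared_by_id,
                             num_ids=8, per_modality=4):
    """Hypothetical sampler for step [0043]: choose `num_ids` pedestrian IDs
    present in both modalities, then take `per_modality` visible-light and
    `per_modality` infrared image indices for each chosen ID."""
    # IDs with at least `per_modality` samples in each modality (assumption)
    shared_ids = [pid for pid in set(visible_by_id) & set(infrared_by_id)
                  if len(visible_by_id[pid]) >= per_modality
                  and len(infrared_by_id[pid]) >= per_modality]
    chosen_ids = random.sample(shared_ids, num_ids)

    batch_visible, batch_infrared = [], []
    for pid in chosen_ids:
        batch_visible += random.sample(visible_by_id[pid], per_modality)
        batch_infrared += random.sample(infrared_by_id[pid], per_modality)
    # 8 IDs x 4 images per modality -> 32 visible + 32 infrared indices
    return batch_visible, batch_infrared
```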

[0044] (2) Normalize the input images, randomly crop them to the specified size (288×144), and apply random flipping for data augmentation;
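Step [0044] could be expressed with torchvision transforms roughly as below; the resize-and-pad strategy before cropping and the ImageNet normalization statistics are assumptions, since the text only specifies the 288×144 crop size, normalization, and random flipping.

```python
from torchvision import transforms

# Preprocessing per step [0044]: normalize, crop to 288x144, random horizontal flip.
train_transform = transforms.Compose([
    transforms.Resize((288, 144)),
    transforms.Pad(10),                      # padding before crop is an assumption
    transforms.RandomCrop((288, 144)),
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet stats (assumption)
                         std=[0.229, 0.224, 0.225]),
])
```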

[0045] (3) Input the visible light image to a convolution module (Head1) whose parameters are not shared, and the obtained feature ...
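A sketch of the two-stream design implied by step [0045], where each modality passes through its own shallow convolution module with non-shared parameters before deeper processing; the infrared branch (Head2), the shared ResNet-50 trunk, and the exact split point are assumptions inferred from common cross-modal re-ID practice, not stated in the truncated text.

```python
import torch.nn as nn
from torchvision.models import resnet50

class TwoStreamBackbone(nn.Module):
    """Sketch: modality-specific shallow heads with non-shared parameters
    (Head1 for visible, Head2 for infrared) followed by a shared trunk.
    The shared-backbone choice and using the first ResNet-50 stage as the
    head are assumptions, not patent text."""

    def __init__(self):
        super().__init__()
        def make_head():
            r = resnet50(weights=None)
            return nn.Sequential(r.conv1, r.bn1, r.relu, r.maxpool, r.layer1)

        self.head_visible = make_head()    # Head1: parameters not shared
        self.head_infrared = make_head()   # Head2: parameters not shared
        trunk = resnet50(weights=None)
        self.shared = nn.Sequential(trunk.layer2, trunk.layer3, trunk.layer4)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x, modality):
        head = self.head_visible if modality == "visible" else self.head_infrared
        feat = self.shared(head(x))
        return self.pool(feat).flatten(1)  # (B, 2048) feature vectors
```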



Abstract

The invention discloses a cross-modal pedestrian re-identification method based on self-imitation mutual distillation, and relates to the field of image processing. Aiming at the defect that existing one-stage feature registration methods ignore feature distribution differences within and between modalities, a two-stage feature registration method is provided, which improves the performance of cross-modal pedestrian re-identification. The two stages of feature registration are: 1) intra-modal feature registration: prototype features of each pedestrian category are obtained through self-imitation learning, and intra-modal registration is achieved by increasing the similarity between all samples of a category and its prototype features; 2) inter-modal feature registration: a mutual-distillation learning method is adopted to reduce the distribution difference between same-category samples from different modalities, improving the discriminability of the features; all samples of the same ID from the two different modalities learn each other's feature distribution, so that the feature difference between the modalities is reduced. The method can be used for intelligent video monitoring, pedestrian tracking and behavior analysis, intelligent security, and the like.
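Read as losses, the two registration stages described above might look roughly like the following sketch; the mean-feature prototypes, cosine similarity, and symmetric KL mutual distillation are plausible concrete choices for illustration, not the patent's exact formulation.

```python
import torch
import torch.nn.functional as F

def intra_modal_prototype_loss(features, labels):
    """Stage 1 (intra-modal registration), sketched: form a prototype per
    identity as the mean feature of its samples within one modality, then pull
    every sample toward its own prototype by maximizing cosine similarity.
    Mean-as-prototype and cosine similarity are assumptions."""
    feats = F.normalize(features, dim=1)
    loss, count = 0.0, 0
    for pid in labels.unique():
        mask = labels == pid
        proto = F.normalize(feats[mask].mean(dim=0), dim=0)
        loss = loss + (1.0 - feats[mask] @ proto).mean()
        count += 1
    return loss / max(count, 1)

def mutual_distillation_loss(logits_visible, logits_infrared, temperature=4.0):
    """Stage 2 (inter-modal registration), sketched: same-ID samples from the
    two modalities imitate each other's class-probability distributions via a
    symmetric KL divergence. Row-wise same-ID pairing is assumed."""
    p_v = F.log_softmax(logits_visible / temperature, dim=1)
    p_i = F.log_softmax(logits_infrared / temperature, dim=1)
    kl_vi = F.kl_div(p_v, p_i.exp(), reduction="batchmean")
    kl_iv = F.kl_div(p_i, p_v.exp(), reduction="batchmean")
    return 0.5 * (kl_vi + kl_iv) * temperature ** 2
```

In training, terms of this kind would typically be added to a standard identification or triplet loss computed over the identity-balanced batches described in step [0043].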

Description

Technical field

[0001] The invention relates to the field of image processing, in particular to a cross-modal pedestrian re-identification method based on self-imitation mutual distillation, which can be used for intelligent video monitoring, pedestrian tracking and behavior analysis, intelligent security, and the like.

Background technique

[0002] Cross-modal person re-identification has received extensive attention in recent years because of its application prospects and practical value, and many excellent algorithms have emerged. These algorithms can be broadly classified into three categories: feature-registration-based, image-generation-based, and metric-learning-based cross-modal person re-identification algorithms. Compared with the other two types of algorithms, the cross-modal person re-identification algorithm based on feature registration has received more attention. It achieves the goal ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00; G06K9/42; G06K9/62; G06T7/33
CPC: G06T7/33; G06T2207/10048; G06T2207/30196; G06V40/103; G06V10/32; G06F18/22; G06F18/214
Inventor: 曲延云, 张德茂, 洪铭
Owner: XIAMEN UNIV