
Pedestrian re-identification implementation method based on active comparative learning

A pedestrian re-identification technology and implementation method, applied in the field of pedestrian re-identification based on active contrastive learning, which addresses the problem of the high demand for fine-grained pedestrian features and achieves the effect of alleviating the false-label problem.

Pending Publication Date: 2021-11-16
SOUTHWESTERN UNIV OF FINANCE & ECONOMICS +1

AI Technical Summary

Problems solved by technology

However, due to the particular nature of pedestrian re-identification datasets, fine-grained pedestrian features are essential; relying on active learning alone to reach high retrieval accuracy still requires a large number of labeled samples for model training.



Examples


Embodiment 1

[0042] As shown in Figure 1, the present embodiment provides a pedestrian re-identification implementation method based on active contrastive learning; the general process is as follows (a minimal illustrative sketch of this loop is given after step S4):

[0043] S1: First, high-value samples are selected by the active learning module based on predicted loss; the number of samples selected in each round of active learning is set to Bt;

[0044] S2: The selected pedestrian samples are then manually annotated to assign each pedestrian a label ID;

[0045] S3: The labeled samples are then fed into the contrastive learning module, where samples with the same label ID are defined as positive samples and samples with different label IDs are defined as negative samples; the contrastive loss is then optimized so that the feature distributions of pedestrian samples sharing the same ID become closer (smaller distance), while those of samples with different IDs are distributed farther apart (larger distance);

[0046] S4: The test set of the pedestrian re-identification task is fed into the trained contrastive learning module to obtain the test results.
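The four steps above form an iterative select-annotate-train loop. The following Python (PyTorch) code is a minimal sketch of that loop under stated assumptions: the Backbone class, the contrastive_loss function, the entropy-based ranking standing in for the loss prediction module, and all sizes and hyper-parameters are illustrative placeholders, not details taken from the patent.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Backbone(nn.Module):
    # Stand-in feature extractor / classifier playing the role of the target prediction module.
    def __init__(self, in_dim=128, feat_dim=64, num_ids=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_ids)
    def forward(self, x):
        feat = self.encoder(x)
        return feat, self.classifier(feat)

def contrastive_loss(features, ids, margin=1.0):
    # S3: same-ID pairs are pulled together, different-ID pairs pushed beyond the margin.
    dist = torch.cdist(features, features)
    same = (ids.unsqueeze(0) == ids.unsqueeze(1)).float()
    pos = same * dist.pow(2)
    neg = (1.0 - same) * F.relu(margin - dist).pow(2)
    return (pos + neg).mean()

pool = torch.randn(100, 128)               # unlabeled pedestrian pool (toy features)
hidden_ids = torch.randint(0, 10, (100,))  # ground-truth IDs, revealed only when annotated
model = Backbone()
opt = torch.optim.SGD(model.parameters(), lr=0.05)
B_t = 16                                   # number of samples labeled per active-learning round

for round_idx in range(3):
    with torch.no_grad():
        _, logits = model(pool)
        probs = F.softmax(logits, dim=1)
        # Proxy for the predicted loss (the patent uses a loss prediction module): entropy ranking.
        value = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)
    picked = torch.topk(value, k=B_t).indices   # S1: select the B_t highest-value samples
    labels = hidden_ids[picked]                 # S2: simulated manual annotation
    for _ in range(20):                         # S3: optimize the contrastive loss
        feats, _ = model(pool[picked])
        loss = contrastive_loss(feats, labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"round {round_idx}: contrastive loss {loss.item():.4f}")  # S4 would evaluate on a held-out test set

In practice the labeled pool would grow cumulatively across rounds and the features would come from a convolutional backbone over images; this sketch only illustrates the control flow of the four steps.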

Embodiment 2

[0048] As shown in Figure 2, this embodiment, as a further refinement of Embodiment 1, provides a specific pedestrian re-identification method based on active contrastive learning, comprising the following steps:

[0049] In S1, high-value samples are selected by the active learning module based on predicted loss, specifically (a rough sketch of the module decomposition described below is given after these sub-steps):

[0050] S1.1: The active learning module is decomposed into a target prediction module and a loss prediction module. The target prediction module consists of several intermediate feature layers, an output layer and a Softmax layer, and predicts label IDs for unlabeled input pedestrian samples; the loss prediction module consists of several functional layers and FC layers, where the functional layers process the intermediate features produced by the target prediction module and are used to generate the predicted loss.

[0051] S1.2: In the active learning module, by means of the predicted loss ...
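A rough, hypothetical sketch of the decomposition in S1.1 follows, in the spirit of loss-prediction-based active learning. The class names, layer sizes and the use of toy linear features are illustrative assumptions and not details specified in the patent.

import torch
import torch.nn as nn

class TargetPredictionModule(nn.Module):
    # Several intermediate feature layers, an output layer and a Softmax layer,
    # predicting pedestrian label IDs for unlabeled input samples.
    def __init__(self, in_dim=128, hidden=(256, 128), num_ids=100):
        super().__init__()
        dims = (in_dim,) + hidden
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())
            for d_in, d_out in zip(dims[:-1], dims[1:])
        )
        self.output = nn.Linear(hidden[-1], num_ids)
        self.softmax = nn.Softmax(dim=1)
    def forward(self, x):
        intermediates = []
        for block in self.blocks:
            x = block(x)
            intermediates.append(x)   # intermediate features handed to the loss predictor
        return self.softmax(self.output(x)), intermediates

class LossPredictionModule(nn.Module):
    # Functional layers plus FC layers mapping the target module's intermediate
    # features to a scalar predicted loss per sample.
    def __init__(self, feature_dims=(256, 128), embed=32):
        super().__init__()
        self.functional = nn.ModuleList(
            nn.Sequential(nn.Linear(d, embed), nn.ReLU()) for d in feature_dims
        )
        self.fc = nn.Linear(embed * len(feature_dims), 1)
    def forward(self, intermediates):
        parts = [f(h) for f, h in zip(self.functional, intermediates)]
        return self.fc(torch.cat(parts, dim=1)).squeeze(1)

x = torch.randn(8, 128)                          # a small unlabeled batch (toy features)
target_net, loss_net = TargetPredictionModule(), LossPredictionModule()
probs, feats = target_net(x)
predicted_loss = loss_net(feats)
print(predicted_loss.argsort(descending=True))   # indices ranked from highest predicted loss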



Abstract

The invention discloses a pedestrian re-identification implementation method based on active contrastive learning, and relates to the field of computer vision in artificial intelligence. The system comprises an active learning module and a contrastive learning module. First, high-value samples are selected by the active learning module based on loss prediction, with the number of samples selected in each round of active learning set to Bt; these samples are then manually annotated to obtain pedestrian label IDs. The labeled samples are then sent to the contrastive learning module, where samples with the same label ID are defined as positive samples and samples with different label IDs are defined as negative samples, and the contrastive loss is optimized so that the feature distributions of pedestrian samples with the same ID become closer (smaller distance), while the features of samples with different IDs are distributed farther apart (larger distance).
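The distance behaviour described above matches a standard pairwise contrastive loss. One common formulation, given here as an illustrative assumption rather than the patent's exact expression, is

$$\mathcal{L}_{\mathrm{con}} = \sum_{i,j}\Big[\, y_{ij}\, d_{ij}^{2} + (1 - y_{ij})\,\max\big(0,\ m - d_{ij}\big)^{2} \Big],$$

where $d_{ij}$ is the distance between the features of samples $i$ and $j$, $y_{ij}=1$ if the two samples share the same pedestrian ID (positive pair) and $0$ otherwise (negative pair), and $m$ is a margin beyond which negative pairs are pushed.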

Description

Technical field
[0001] The present invention belongs to the field of computer vision in artificial intelligence and relates to a pedestrian re-identification implementation method based on active contrastive learning.
Background technique
[0002] In the field of computer vision, pedestrian re-identification is one of the most important tasks: given a monitoring image of a pedestrian, the task is to retrieve images of that pedestrian from a large number of pedestrian images captured across different monitoring devices. In addition, pedestrian re-identification is often combined with pedestrian tracking techniques and has a wide range of applications in intelligent video surveillance and intelligent security. However, due to differences between monitoring devices and the dynamic nature of pedestrians, pedestrian re-identification is difficult and its performance is often unsatisfactory. At present, research on pedestrian re-identification is mainly divided into three kinds: fully supervised learning, semi-supervised learning, and unsupervised ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N20/00
CPC: G06N20/00, G06F18/214
Inventor: 刘贵松, 解修蕊, 郑余, 黄鹂, 杨新, 蒋太翔
Owner: SOUTHWESTERN UNIV OF FINANCE & ECONOMICS