
Pedestrian re-identification method based on multi-branch fusion model

A pedestrian re-identification and fusion-model technology, applied in character and pattern recognition, neural learning methods, biometric recognition and similar areas, which can address problems such as camera viewpoint changes, pedestrian posture changes, and the low resolution of captured images.

Active Publication Date: 2020-10-23
TONGJI UNIV

AI Technical Summary

Problems solved by technology

[0003] Because the collection of real-world pedestrian data cannot be fully controlled, the captured images often exhibit the following problems: (1) the resolution of the captured images is low and the lighting conditions may vary greatly; (2) pedestrian postures and the camera viewpoint change; (3) occlusion may occur. However, existing person re-identification techniques ignore the features of body parts in the horizontal direction and therefore cannot obtain effective recognition results.

Method used



Examples


Embodiment

[0031] First define some variables that need to be used:

[0032] x represents the features of labeled pedestrian images in the batch data;

[0033] y represents the label of the input pedestrian image;

[0034] Q represents the size of the circular queue;

[0035] L represents the number of rows in the lookup table;

[0036] p_j represents the probability that the feature vector x is recognized as the j-th labeled pedestrian identity;

[0037] represents the transpose of the k-th column of the circular queue;

[0038] represents the transpose of the j-th column of the lookup table;

[0039] τ represents the flatness of the probability distribution;

[0040] R_j represents the probability that the feature vector x is recognized as the j-th unlabeled pedestrian;

[0041] L_oim represents the OIM loss;

[0042] L_T-batch represents the hard sampling loss;

[0043] f(x) represents the image features extracted by the deep network;

[0044] D(x,y) represents the distance between x and y;

[004...
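To make these definitions concrete, the sketch below computes p_j and R_j and the resulting L_oim under the standard OIM formulation. The array names lut (the lookup table whose column transposes appear in [0038]) and queue (the circular queue whose column transposes appear in [0037]), the plain softmax, and the default value of τ are assumptions made for illustration, not the patent's exact implementation.

```python
import numpy as np

def oim_probabilities(x, lut, queue, tau=0.1):
    """Compute p_j and R_j for one feature vector x (a minimal OIM-style sketch).

    x:     (d,)   feature of a labeled pedestrian image (assumed L2-normalized)
    lut:   (L, d) lookup table; row j plays the role of the j-th column in [0038]
    queue: (Q, d) circular queue of unlabeled features; row k plays the role of [0037]
    tau:   controls the flatness of the probability distribution (larger tau -> flatter)
    """
    # Scaled similarities between x and every stored feature.
    sim_lut = lut @ x / tau      # one score per labeled identity
    sim_queue = queue @ x / tau  # one score per unlabeled pedestrian

    # Softmax over labeled identities and unlabeled entries jointly.
    logits = np.concatenate([sim_lut, sim_queue])
    logits -= logits.max()       # numerical stability
    exp = np.exp(logits)
    denom = exp.sum()

    p = exp[: lut.shape[0]] / denom   # p_j: x recognized as the j-th labeled identity
    r = exp[lut.shape[0]:] / denom    # R_j: x recognized as the j-th unlabeled pedestrian
    return p, r

def oim_loss(x, y, lut, queue, tau=0.1):
    """L_oim for one labeled sample: negative log-likelihood of its label y."""
    p, _ = oim_probabilities(x, lut, queue, tau)
    return -np.log(p[y] + 1e-12)
```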


Abstract

The invention relates to a pedestrian re-identification method based on a multi-branch fusion model. The method uses deep learning: training-set images are preprocessed by flipping, cropping and random erasing, features are extracted by a base network model, and the network is jointly trained by fusing several branch loss functions. In the first and second branches, a capsule network extracts the spatial relations of slices at different positions in the horizontal and vertical directions; the third branch uses a capsule network to learn the correlations between different channels of the resulting feature map; the fourth branch learns global features; and the fifth branch performs the corresponding similarity measurement. Through the fusion of the multiple branch models, the relationships between different partitioned regions are taken into account and the body-part features in the horizontal direction can be obtained effectively, making the features extracted by the network more effective.
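As a rough illustration of the branch layout described above, the following sketch builds a shared backbone feature map and four feature branches (horizontal slices, vertical slices, channel statistics, and a global feature), with the fifth branch's similarity measurement applied to the fused embedding. The ResNet-50 backbone, the slice count, and the simple pooling-plus-linear heads that stand in for the patent's capsule networks are all assumptions for illustration, not the invention's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision

class MultiBranchSketch(nn.Module):
    """Illustrative skeleton of a multi-branch layout: horizontal-slice,
    vertical-slice, channel, and global branches over a shared backbone."""

    def __init__(self, num_slices=6, feat_dim=256):
        super().__init__()
        resnet = torchvision.models.resnet50(weights=None)
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])  # (B, 2048, H, W)
        self.num_slices = num_slices
        # Small linear heads stand in for the patent's capsule networks (assumption).
        self.h_embed = nn.Linear(2048 * num_slices, feat_dim)  # branch 1: horizontal slices
        self.v_embed = nn.Linear(2048 * num_slices, feat_dim)  # branch 2: vertical slices
        self.c_embed = nn.Linear(2048, feat_dim)                # branch 3: channel statistics
        self.g_embed = nn.Linear(2048, feat_dim)                # branch 4: global feature

    def forward(self, images):
        fmap = self.backbone(images)  # shared feature map

        # Branch 1: pool the map into horizontal stripes and embed their concatenation.
        h_parts = F.adaptive_avg_pool2d(fmap, (self.num_slices, 1))
        h_feat = self.h_embed(h_parts.flatten(1))

        # Branch 2: pool the map into vertical stripes and embed their concatenation.
        v_parts = F.adaptive_avg_pool2d(fmap, (1, self.num_slices))
        v_feat = self.v_embed(v_parts.flatten(1))

        # Branch 3: per-channel averages as a simple stand-in for channel correlations.
        c_feat = self.c_embed(fmap.mean(dim=(2, 3)))

        # Branch 4: global average-pooled feature.
        g_feat = self.g_embed(fmap.mean(dim=(2, 3)))

        # The fused embedding is what branch 5 would compare with a similarity measure.
        return torch.cat([h_feat, v_feat, c_feat, g_feat], dim=1)
```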

Description

Technical field [0001] The invention relates to the field of computer vision, and in particular to a pedestrian re-identification method based on a multi-branch fusion model. Background technique [0002] Person re-identification, also known as pedestrian re-identification, is a technology that uses computer vision techniques to determine whether a specified pedestrian appears in cameras with non-overlapping views. It is a key component of video surveillance and was originally studied as a sub-problem of cross-camera tracking. Specifically, given a target pedestrian (probe), pedestrian re-identification automatically finds and ranks, in the image library (gallery set) obtained from the other cameras of the surveillance network, the pedestrians that are as similar to the target as possible. The feature vector of each image in the gallery and of the query image is computed first, and the gallery is then ranked by the distance between these feature vectors.
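The retrieval step just described (compute a feature vector per image, then rank the gallery by distance to the probe) can be sketched as below; the Euclidean distance and the helper name rank_gallery are assumptions for illustration.

```python
import numpy as np

def rank_gallery(probe_feat, gallery_feats):
    """Rank gallery images by Euclidean distance D(x, y) to the probe feature.

    probe_feat:    (d,)   feature vector f(x) of the query pedestrian (probe)
    gallery_feats: (N, d) feature vectors of the gallery images
    Returns gallery indices ordered from most to least similar.
    """
    dists = np.linalg.norm(gallery_feats - probe_feat, axis=1)
    return np.argsort(dists)

# Usage: ranking = rank_gallery(f_probe, f_gallery); the top-ranked indices point to
# the gallery images most likely to show the same pedestrian as the probe.
```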


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06K9/00; G06N3/08; G06T5/00; G06T7/246
CPC: G06N3/084; G06T7/248; G06V40/10; G06F18/22; G06F18/25; G06F18/24; G06F18/214; G06T5/70
Inventor: 黄德双, 李安东
Owner: TONGJI UNIV