Pedestrian re-identification method based on transfer learning and deep feature fusion

A pedestrian re-identification and deep-feature technology, applied to neural learning methods, character and pattern recognition, instruments, etc. It addresses problems such as ignoring differences in data distribution, failing to fully exploit the deep fusion of local pedestrian features, and poor network transfer performance, to achieve strongly discriminative features and improved re-identification accuracy.

Active Publication Date: 2019-08-23
CETC BIGDATA RES INST CO LTD

AI Technical Summary

Problems solved by technology

However, research finds that most of the above pedestrian re-identification methods extract a global feature vector from the whole pedestrian image during training. Although some methods also extract local features, they do not make full use of the local features o...

Method used


Examples


Embodiment 1

[0080] As mentioned above, a pedestrian re-identification method based on transfer learning and deep feature fusion includes the following steps:

[0081] ① Pre-training: fine-tune a model pre-trained on ImageNet on the pedestrian re-identification data to obtain a pedestrian re-identification pre-trained network model; this is divided into the following steps:

[0082] (1.1) Obtain the deep convolutional network model pre-trained on the ImageNet data set, and train it on the pedestrian re-identification data;

[0083] (1.2) When pre-training the deep convolutional neural network model on the pedestrian re-identification data, use only the sample annotation information to fine-tune the deep convolutional network model;

[0084] (1.2.1) Remove the top fully connected layer from the ResNet50 network model pre-trained on the ImageNet dataset, and add two fully connected layers and a softmax layer after the maximum pooling layer;

[0085] (1.2.2) Use the label information of the pe...
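The network modification of step (1.2.1) can be sketched in PyTorch. This is a minimal illustration, not the patent's implementation: a small stand-in module replaces the actual ImageNet-pretrained ResNet50 backbone (loading real pretrained weights is omitted), and the 751-way output assumes a Market-1501-style identity count as stated later in Embodiment 2.

```python
import torch
import torch.nn as nn

# Stand-in for the ImageNet-pretrained ResNet50 with its top fully
# connected layer removed; in practice this would be the real backbone,
# ending at the global max-pooling layer with a 2048-dim output.
class PooledBackbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 2048, kernel_size=3, stride=2)
        self.pool = nn.AdaptiveMaxPool2d(1)

    def forward(self, x):
        return self.pool(self.conv(x)).flatten(1)  # shape (N, 2048)

# Two added fully connected layers followed by a softmax layer,
# as described in step (1.2.1).
model = nn.Sequential(
    PooledBackbone(),
    nn.Linear(2048, 2048),
    nn.ReLU(),
    nn.Linear(2048, 751),
    nn.Softmax(dim=1),
)

probs = model(torch.randn(2, 3, 32, 32))
print(probs.shape)  # torch.Size([2, 751])
```

Each row of `probs` is a probability distribution over the identity classes, which is what the label-driven fine-tuning in step (1.2.2) trains against.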

Embodiment 2

[0132] As mentioned above, a pedestrian re-identification method based on transfer learning and deep feature fusion includes the following steps:

[0133] Step S1: fine-tune a model pre-trained on ImageNet on the pedestrian re-identification data to obtain a pedestrian re-identification pre-trained network model;

[0134] Step S11: Obtain a deep convolutional network model pre-trained on the ImageNet data set, and train it on the pedestrian re-identification data;

[0135] Step S12: When the deep convolutional neural network model is pre-trained on the pedestrian re-identification data, use only the sample annotation information to fine-tune the network model;

[0136] Step S121: Remove the fully connected layer at the top of the ResNet50 network model pre-trained on the ImageNet dataset, and add two fully connected layers and a softmax layer after the maximum pooling layer;

[0137] Furthermore, the parameters of the two added fully connected layers are 1×1×2048 and 1×1×751...
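Step S12's fine-tuning, driven only by identity annotations, can be sketched as one supervised update step. The batch size, learning rate, and the use of pooled 2048-dim features in place of a full backbone are illustrative assumptions; `CrossEntropyLoss` fuses the softmax layer with the log-likelihood loss, which is the standard PyTorch idiom for this kind of classification head.

```python
import torch
import torch.nn as nn

# Hypothetical classifier standing in for the modified network of step S121
# (backbone omitted; inputs are already-pooled 2048-dim features).
net = nn.Sequential(nn.Linear(2048, 2048), nn.ReLU(), nn.Linear(2048, 751))

# Fine-tuning as in step S12: only the identity labels drive the update.
criterion = nn.CrossEntropyLoss()  # applies softmax internally
optimizer = torch.optim.SGD(net.parameters(), lr=0.01, momentum=0.9)

features = torch.randn(8, 2048)        # a mini-batch of pooled features
labels = torch.randint(0, 751, (8,))   # sample annotation information

optimizer.zero_grad()
loss = criterion(net(features), labels)
loss.backward()
optimizer.step()
print(loss.item() > 0)  # True
```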



Abstract

The invention provides a pedestrian re-identification method based on transfer learning and deep feature fusion. The method comprises the following steps: pre-training, human body posture correction and segmentation, feature vector extraction, deep feature fusion, model training, model testing, and result identification. A deep convolutional neural network extracts global and local features of pedestrians; the two kinds of features are deeply fused to obtain the final pedestrian feature representation. Then, during training of the deep convolutional neural network, transfer learning is adopted to obtain a better-performing pedestrian re-identification network model, so that the features extracted by the model are more discriminative, thereby improving pedestrian re-identification accuracy.
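The deep feature fusion described in the abstract can be illustrated with a short NumPy sketch. Concatenating L2-normalized global and local vectors is one common fusion scheme used here purely as an illustration; the number of body parts and the feature dimensions are assumptions, not values taken from the patent.

```python
import numpy as np

def l2_normalize(v, eps=1e-12):
    """Scale a feature vector to unit L2 norm."""
    return v / (np.linalg.norm(v) + eps)

rng = np.random.default_rng(0)
global_feat = rng.random(2048)                     # whole-body feature
local_feats = [rng.random(512) for _ in range(3)]  # e.g. head/torso/legs

# Fuse by concatenating the normalized global and local vectors into the
# final pedestrian feature representation.
fused = np.concatenate([l2_normalize(global_feat)]
                       + [l2_normalize(f) for f in local_feats])
print(fused.shape)  # (3584,)
```

Normalizing each part before concatenation keeps any single feature from dominating the fused representation when distances are later compared.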

Description

Technical field

[0001] The invention relates to a pedestrian re-identification method based on transfer learning and deep feature fusion, and belongs to the technical fields of deep learning and transfer learning.

Background technique

[0002] Pedestrian re-identification, the pedestrian matching task in a multi-camera network with non-overlapping views, mainly aims to find a target pedestrian captured by cameras at different positions and times.

[0003] With the development of artificial intelligence technology, pedestrian re-identification has attracted wide research attention in public security, image retrieval, and other application scenarios. However, compared with traditional biometric technologies such as face recognition and gesture recognition, pedestrian re-identification faces factors such as low image resolution, viewing-angle changes, posture changes, illumination changes, and occlusion, owing to the complex and uncont...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/103, G06N3/044, G06N3/045, G06F18/2163, G06F18/253, G06F18/214, Y02T10/40
Inventor 丁剑飞, 王进, 阚丹会, 闫盈盈, 曹扬
Owner CETC BIGDATA RES INST CO LTD