A pedestrian comparison method based on multi-scale feature fusion

A multi-scale feature fusion and pedestrian comparison technology, applied to instruments, character and pattern recognition, computer components, etc. It addresses the problems that histogram-based features struggle to reflect local differences, that existing methods have high space complexity, and that their training and calibration processes are complicated; it achieves distinctive and stable comparison performance, reduces the computational complexity of the system, and enhances the effect of spatial constraints.

Active Publication Date: 2018-05-29
SHANGHAI JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Pedestrian comparison methods based on statistical features (see: Kviatkovsky, I.; Adam, A.; Rivlin, E., "Color Invariants for Person Reidentification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 7, pp. 1622-1634, July 2013) are usually simple, stable, and low in complexity, and can achieve clear results in simple scenarios. However, because they rely on histograms for feature statistics, they have difficulty reflecting local differences.
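A minimal sketch of this kind of histogram comparison is given below, assuming OpenCV, an HSV color space, and histogram intersection as the similarity score; these choices are illustrative stand-ins, not the cited method itself.

```python
# Illustrative histogram-based (statistical-feature) comparison of two
# pedestrian crops. Color space, bin counts, and the intersection score are
# assumptions, not the cited color-invariant method.
import cv2
import numpy as np

def color_histogram(image_bgr, bins=(8, 8, 8)):
    """Global HSV color histogram, L1-normalized."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1, 2], None, list(bins), [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist, norm_type=cv2.NORM_L1).flatten()

def histogram_similarity(crop_a, crop_b):
    """Higher value = more similar; spatial layout is lost by the global pooling."""
    h_a = color_histogram(crop_a).astype(np.float32)
    h_b = color_histogram(crop_b).astype(np.float32)
    return cv2.compareHist(h_a, h_b, cv2.HISTCMP_INTERSECT)
```

Because the histogram pools an entire crop into one global statistic, two pedestrians with similar overall colors but different local patterns can score as near-identical, which is exactly the local-difference limitation noted above.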
Pedestrian comparison methods based on local feature points (see: C. Varytimidis, K. Rapantzikos, Y. Avrithis, "WαSH: Weighted α-Shapes for Local Feature Detection," in Proceedings of the European Conference on Computer Vision (ECCV 2012), Florence, Italy, October 2012) extract local feature points of pedestrians and compare them through a feature-point matching algorithm. Such methods must match and compute over all feature points or feature regions of two pedestrians to obtain their similarity, so their complexity is usually high and they cannot meet real-time requirements.
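An illustrative sketch of feature-point matching follows, with ORB used as a generic stand-in for the cited WαSH detector; the descriptor choice and ratio-test threshold are assumptions.

```python
# Illustrative keypoint-matching comparison. ORB stands in for the WαSH
# detector cited above; the 0.75 ratio test is an assumed threshold. The
# matching cost grows with the product of the two keypoint counts, which is
# why this family of methods tends to be slow.
import cv2

def keypoint_similarity(crop_a, crop_b, ratio=0.75):
    """Number of ratio-test matches between two pedestrian crops."""
    orb = cv2.ORB_create()
    _, desc_a = orb.detectAndCompute(crop_a, None)
    _, desc_b = orb.detectAndCompute(crop_b, None)
    if desc_a is None or desc_b is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(desc_a, desc_b, k=2)
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good)
```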
Comparison methods based on distance learning (see: Wei-Shi Zheng, Shaogang Gong, Tao Xiang, "Reidentification by Relative Distance Comparison," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(3): 653-668) greatly improve comparison performance, but their generality is limited: new scenarios require retraining, the training and calibration process is relatively complicated, and the space complexity is high, so they remain difficult to apply in practical systems.
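For context, learned distances of this family typically take a Mahalanobis-like form; the sketch below shows only the comparison side and assumes the matrix M has been produced by an offline training stage (it is not the cited RDC training procedure).

```python
# Comparison side of a Mahalanobis-style learned distance. M is assumed to be
# a positive semi-definite matrix obtained from an offline metric-learning step.
import numpy as np

def learned_distance(x, y, M):
    """d(x, y) = sqrt((x - y)^T M (x - y))."""
    diff = np.asarray(x) - np.asarray(y)
    return float(np.sqrt(diff @ M @ diff))
```

Storing M costs O(d^2) for d-dimensional features, which is one source of the high space complexity mentioned above.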

Method used



Examples


Embodiment Construction

[0026] The present invention will be described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit the present invention in any form. It should be noted that those skilled in the art can make several modifications and improvements without departing from the concept of the present invention. These all belong to the protection scope of the present invention.

[0027] As shown in Figure 1, an embodiment framework of the multi-scale fusion comparison method: at the low scale, the extracted color and contour features are cascaded to obtain a fusion feature; semi-supervised SVM learning is performed on the fusion feature, and a first round of pedestrian screening produces the candidate pedestrian set; at the high scale, a comparison algorithm based on local feature points is used to compute the similarity of each pedestrian in the filtered pedestrian set...
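The sketch below illustrates this two-stage flow under stated assumptions; the helper names (extract_fusion_feature, svm_decision, keypoint_similarity) and the top-k cutoff are hypothetical placeholders, not the patent's reference implementation.

```python
# Two-stage, multi-scale comparison: coarse screening on fused low-scale
# features, then fine matching by local feature points on the survivors only.
import numpy as np

def multiscale_compare(query, gallery, extract_fusion_feature, svm_decision,
                       keypoint_similarity, top_k=20):
    """Return (gallery index, fine-scale similarity) of the best match."""
    q_feat = extract_fusion_feature(query)           # cascaded color + contour vector
    coarse = [svm_decision(q_feat, extract_fusion_feature(g)) for g in gallery]
    candidates = np.argsort(coarse)[::-1][:top_k]    # first screening: candidate pedestrian set
    fine = [(int(i), keypoint_similarity(query, gallery[i])) for i in candidates]
    return max(fine, key=lambda pair: pair[1])       # high-scale feature-point comparison
```

Only the candidates that survive the coarse screening reach the expensive feature-point stage, which is where the reduction in overall complexity comes from.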



Abstract

The invention discloses a pedestrian comparison method based on multi-scale feature fusion in the field of computer video processing technology. The invention combines multiple pedestrian features to enhance the stability and uniqueness of the comparison features in a multi-camera environment. At the same time, according to how these features perform at different image scales, different pedestrian features are compared at different scales: comparison and filtering are first performed at a small scale, and the filtered pedestrians are then matched at a larger scale, which reduces the complexity of the method while preserving the comparison performance of each feature. The existing texture features are improved, and a new comparison method based on salient feature points is adopted. The method further introduces a semi-supervised distance learning method for distance function learning, which reduces the complexity of the training and calibration process of traditional distance learning algorithms and improves matching accuracy.
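As one way to picture the semi-supervised step, the sketch below wraps a linear SVM in scikit-learn's self-training classifier over labeled and unlabeled pedestrian-pair features; the data shapes and the self-training choice are assumptions, not the patent's procedure.

```python
# Semi-supervised classification of pedestrian pairs (same person vs. different
# person). Unlabeled pairs carry the label -1; a small labeled subset seeds the
# self-training loop. All shapes and values here are illustrative.
import numpy as np
from sklearn.svm import SVC
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 64))              # fused feature-difference vectors of pairs
y = np.full(200, -1)                   # -1 marks unlabeled pairs
y[:20] = rng.integers(0, 2, 20)        # labeled subset: 1 = same person, 0 = different

model = SelfTrainingClassifier(SVC(kernel="linear", probability=True))
model.fit(X, y)
same_person_score = model.predict_proba(rng.random((1, 64)))[:, 1]
```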

Description

Technical field
[0001] The invention belongs to the technical field of computer video processing, and specifically relates to a comparison method that combines multiple pedestrian features, first performing comparison and filtering at a small scale and then matching target pedestrians at a larger scale.
Background technique
[0002] At present, pedestrian comparison technology plays an increasingly important role in video surveillance, especially in the field of urban public security. Because different pedestrians appear similar under a camera and multi-camera network environments are highly varied (viewing-angle changes, illumination changes, mutual occlusion, etc.), pedestrian comparison across multiple cameras faces severe challenges. Multi-camera pedestrian analysis has become a research hotspot in the field of computer vision. Research approaches generally include comparison algorithms based on statistical features, local feature points, and distance learning...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06K9/66; G06K9/46
CPC: G06V40/103; G06F18/285; G06F18/2155; G06F18/22
Inventor: 杨华, 吴佳俊, 董莉莉
Owner: SHANGHAI JIAOTONG UNIV