
Pedestrian comparison method based on multi-scale feature fusion

A multi-scale feature fusion technology for pedestrian comparison, applied to instruments, character and pattern recognition, computer components, etc. It addresses the problems that local differences are difficult to reflect, that the space complexity of existing methods is high, and that training and calibration are complicated, achieving distinctive and stable comparison features, reducing the computational complexity of the system, and lowering its space requirements.

Active Publication Date: 2015-02-25
SHANGHAI JIAO TONG UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Pedestrian comparison methods based on statistical features (see: Kviatkovsky, I.; Adam, A.; Rivlin, E., "Color Invariants for Person Reidentification," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 7, pp. 1622-1634, July 2013) usually use relatively simple and stable features, have low complexity, and can achieve clear results in simple scenarios; however, because features are accumulated in histograms, local differences are difficult to reflect.
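To make the histogram criticism concrete, here is a minimal sketch (not from the patent) of such a statistical-feature comparison using OpenCV: two pedestrian crops are reduced to global HSV color histograms and compared, so spatially local differences are pooled away.

```python
import cv2
import numpy as np

def hsv_histogram(image_bgr, bins=(8, 8, 8)):
    """Global HSV color histogram of a pedestrian crop; spatial layout is lost."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1, 2], None, list(bins),
                        [0, 180, 0, 256, 0, 256])
    cv2.normalize(hist, hist)
    return hist.flatten()

def histogram_similarity(crop_a, crop_b):
    """Correlation between the two global histograms, in [-1, 1]."""
    return cv2.compareHist(hsv_histogram(crop_a), hsv_histogram(crop_b),
                           cv2.HISTCMP_CORREL)
```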
Pedestrian comparison methods based on local feature points (see: C. Varytimidis, K. Rapantzikos, Y. Avrithis, "WαSH: Weighted α-Shapes for Local Feature Detection," in Proceedings of the European Conference on Computer Vision (ECCV 2012), Florence, Italy, October 2012) extract local feature points of pedestrians and compare them through a feature point matching algorithm. Because all feature points or feature regions must be matched and computed to obtain the similarity between pedestrians, the complexity of such methods is usually high and cannot meet real-time requirements.
Comparison methods based on distance learning (see: Wei-Shi Zheng, Shaogang Gong, Tao Xiang, "Reidentification by Relative Distance Comparison," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(3): 653-668) greatly improve the comparison effect, but their generality is limited: retraining is required for new scenarios, the training and calibration process is relatively complicated, and the space complexity of the method is high, so it remains difficult to apply in practical systems.

Method used



Examples


Embodiment Construction

[0026] The present invention will be described in detail below in conjunction with specific embodiments. The following embodiments will help those skilled in the art to further understand the present invention, but do not limit it in any way. It should be noted that those skilled in the art can make several modifications and improvements without departing from the concept of the present invention; these all fall within the protection scope of the present invention.

[0027] As shown in Figure 1, the framework of one embodiment of the multi-scale fusion comparison method is as follows: at the low scale, the extracted color and contour features are cascaded to obtain fused features; semi-supervised SVM learning is performed on the fused features to carry out the first pedestrian screening and obtain the candidate pedestrian set; at the high scale, a comparison algorithm based on local feature points is used to calculate the similarity of each pedestrian in the screened pedestrian set ...
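The two-stage flow of this embodiment can be outlined as in the sketch below. This is a hedged outline under assumed interfaces: the helper names, the pair scorer, and the keypoint similarity function are illustrative placeholders, not the patent's own implementations. Fused low-scale features screen the gallery first, and only the surviving candidates are re-ranked with local feature points at the high scale.

```python
import numpy as np

def fuse_low_scale(color_desc, contour_desc):
    """Low-scale fusion: cascade (concatenate) the color and contour descriptors."""
    return np.concatenate([color_desc, contour_desc])

def screen_candidates(query_feat, gallery_feats, pair_scorer, keep=10):
    """First screening: score the query against every gallery pedestrian with a
    (semi-supervised) SVM-style pair scorer and keep the top candidates."""
    scores = np.array([pair_scorer(query_feat, g) for g in gallery_feats])
    return np.argsort(scores)[::-1][:keep]

def rerank_by_keypoints(query_img, gallery_imgs, candidate_ids, keypoint_similarity):
    """High-scale step: local-feature-point comparison on the screened set only."""
    sims = {i: keypoint_similarity(query_img, gallery_imgs[i]) for i in candidate_ids}
    return sorted(sims, key=sims.get, reverse=True)
```

Restricting the expensive keypoint comparison to the screened candidate set is what keeps the overall complexity low while preserving the comparison performance.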



Abstract

The invention discloses a pedestrian comparison method based on multi-scale feature fusion, belonging to the technical field of computer video processing. Multiple pedestrian features are fused, enhancing the stability and uniqueness of the comparison features in a multi-camera environment. According to how the features are expressed at different image scales, different pedestrian features are compared at different scales: the pedestrian features are first compared and filtered at the small scale, and the screened pedestrians are then matched at the large scale, lowering the complexity of the method while the comparison performance of the features is guaranteed. Existing texture features are improved, and a novel comparison method based on marked feature points is adopted. Distance function learning is conducted by introducing a semi-supervised distance learning method, which lowers the complexity of the training and calibration processes of traditional distance learning algorithms and improves matching accuracy.
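As a rough illustration of the semi-supervised learning step mentioned above, the sketch below trains a self-training SVM on pairwise difference vectors of fused descriptors, with unlabeled pairs marked by -1. This is a generic scikit-learn stand-in, not the patent's own semi-supervised distance-learning procedure, whose details are not given on this page.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.semi_supervised import SelfTrainingClassifier

def pair_feature(fused_a, fused_b):
    """Represent a pedestrian pair by the absolute difference of fused descriptors."""
    return np.abs(fused_a - fused_b)

def fit_pair_scorer(pair_feats, pair_labels):
    """pair_labels: 1 = same identity, 0 = different identity, -1 = unlabeled."""
    model = SelfTrainingClassifier(SVC(probability=True))
    model.fit(np.asarray(pair_feats), np.asarray(pair_labels))
    return model

# Usage: probability that a query/gallery pair shows the same pedestrian.
# score = model.predict_proba([pair_feature(query_desc, gallery_desc)])[0, 1]
```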

Description

Technical field

[0001] The invention belongs to the technical field of computer video processing, and specifically relates to a comparison method that combines multiple pedestrian features, first performing comparison and filtering at a small scale, and then matching target pedestrians at a larger scale.

Background technique

[0002] At present, pedestrian comparison technology plays an increasingly important role in video surveillance, especially in the field of urban public security. Because different pedestrians appear similar under a camera and multi-camera network environments are diverse (viewing angle changes, illumination changes, mutual occlusion, etc.), pedestrian comparison across multiple cameras faces severe challenges. Multi-camera pedestrian analysis has become a research hotspot in the field of computer vision. Research directions generally include comparison algorithms based on statistical features, local feature points and distance learning ...
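For the local-feature-point direction mentioned above, a minimal high-scale matching sketch might look like the following, using ORB keypoints and a ratio test as a stand-in for the patent's marked feature points (the inputs are assumed to be 8-bit grayscale pedestrian crops):

```python
import cv2

ORB = cv2.ORB_create(nfeatures=500)
MATCHER = cv2.BFMatcher(cv2.NORM_HAMMING)

def keypoint_similarity(gray_a, gray_b, ratio=0.75):
    """Fraction of keypoints in crop A that find a confident match in crop B."""
    kp_a, des_a = ORB.detectAndCompute(gray_a, None)
    kp_b, des_b = ORB.detectAndCompute(gray_b, None)
    if des_a is None or des_b is None or len(kp_a) == 0:
        return 0.0
    matches = MATCHER.knnMatch(des_a, des_b, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
    return len(good) / len(kp_a)
```

Matching every keypoint of every gallery pedestrian in this way is exactly the costly step that the multi-scale scheme defers until after the low-scale screening.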

Claims


Application Information

IPC(8): G06K 9/66; G06K 9/46
CPC: G06V 40/103; G06F 18/285; G06F 18/2155; G06F 18/22
Inventors: 杨华, 吴佳俊, 董莉莉
Owner: SHANGHAI JIAO TONG UNIV