
Pedestrian recognition method of camera network based on multi-level depth feature fusion

This camera-network and deep-feature technology, applied in the field of computer vision surveillance, addresses the difficulty of extracting robust deep features from pedestrian images, the inability to fully exploit deep features at different levels, and the insufficient training of deep networks, thereby improving recognition accuracy.

Active Publication Date: 2016-12-07
Assignee: ZHEJIANG GONGSHANG UNIVERSITY (and one other)
Cites: 4 | Cited by: 36

AI Technical Summary

Problems solved by technology

However, there is currently no effective method to fully utilize multi-level deep features for pedestrian recognition tasks.
On the one hand, in real surveillance scenes the number of labeled pedestrian samples is often too small to fully train a deep network, so it is difficult to extract robust deep features from pedestrian images; on the other hand, the last layer of a convolutional neural network is a Softmax classifier, whose limitation is that it can only classify using the output of the preceding layer as its input feature, and therefore cannot make full use of deep features from different levels.
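To make these two issues concrete, the sketch below loads a pre-trained backbone and taps intermediate layers instead of relying only on the final pre-Softmax output. This is a minimal illustration, not the patent's implementation: the text names no framework or architecture, so PyTorch, a torchvision ResNet-18, and the chosen tap points (layer2/3/4) are all assumptions.

    import torch
    import torchvision.models as models

    # ImageNet-pretrained backbone as a stand-in for the patent's pre-trained
    # network (the actual architecture is not specified in this text).
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.eval()

    # Capture activations from several depths via forward hooks, rather than
    # relying only on the final (pre-Softmax) layer.
    features = {}

    def save_to(name):
        def hook(module, inputs, output):
            # Global-average-pool each feature map into a flat per-image vector.
            features[name] = output.mean(dim=(2, 3)).detach()
        return hook

    backbone.layer2.register_forward_hook(save_to("mid"))
    backbone.layer3.register_forward_hook(save_to("high"))
    backbone.layer4.register_forward_hook(save_to("top"))

    with torch.no_grad():
        x = torch.randn(1, 3, 224, 224)  # placeholder for a pedestrian image batch
        backbone(x)

    for name, feat in features.items():
        print(name, tuple(feat.shape))  # mid: (1, 128), high: (1, 256), top: (1, 512)

Starting from pre-trained parameters and fine-tuning on the small labeled pedestrian set, rather than training from scratch, is the transfer step that the paragraph above motivates.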




Embodiment Construction

[0020] In order to describe the present invention more specifically, the technical solutions of the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0021] The method of the invention comprises two parts: construction of a deep network model on a pedestrian database, and extraction and fusion of multi-level deep features. We transfer pre-trained network parameters to the pedestrian database to aid the learning of the target network, use the target network to extract deep features of pedestrian samples at multiple levels, construct several groups of binary-classification SVM classifiers from the features at the different levels, and linearly weight the decision values of these binary classifiers to obtain the final classification result. A hedged code sketch of the fusion stage is given immediately below; the method is then further described in conjunction with the accompanying drawings.
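The following is a minimal sketch of the fusion stage only, assuming scikit-learn's LinearSVC and randomly generated stand-in features; the SVM implementation, the feature dimensions, and the level weights are all illustrative, since this text does not specify them.

    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)

    # Stand-in deep features from three network levels for N labeled samples,
    # with binary labels (dimensions are hypothetical).
    N = 200
    levels = {
        "mid":  rng.normal(size=(N, 128)),
        "high": rng.normal(size=(N, 256)),
        "top":  rng.normal(size=(N, 512)),
    }
    y = rng.integers(0, 2, size=N)

    # One binary SVM per feature level, as in the per-level classifier groups.
    svms = {name: LinearSVC(C=1.0, max_iter=5000).fit(X, y)
            for name, X in levels.items()}

    # Illustrative level weights; the text does not state how they are chosen.
    weights = {"mid": 0.2, "high": 0.3, "top": 0.5}

    def fused_score(samples):
        """samples: dict mapping level name -> (M, d_level) feature matrix."""
        return sum(weights[name] * clf.decision_function(samples[name])
                   for name, clf in svms.items())

    # Fuse at the decision layer and threshold for the final binary result.
    test = {name: rng.normal(size=(5, X.shape[1])) for name, X in levels.items()}
    print((fused_score(test) > 0).astype(int))

Fusing signed decision values rather than concatenating raw features keeps each level's classifier independent, which matches the decision-layer fusion described in the abstract below.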

[0022] figure ...



Abstract

The invention discloses a camera-network pedestrian recognition method based on multi-level deep feature fusion. A new network model is learned on a pedestrian database by transferring the parameters of a pre-trained network to that database; multi-level deep features are extracted using the new model; and the Softmax classifier in the last layer of the convolutional neural network is replaced by SVM classifiers so that the multi-level deep features can be fully exploited. The multi-level deep features are used to construct several groups of binary-classification SVM classifiers, and the decision values of these binary classifiers are linearly weighted to obtain the final classification result. By fusing multi-level features at the decision layer of the SVM classifiers, the invention effectively improves the accuracy of recognizing pedestrian targets.
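In symbols, the decision-level fusion described above can be written as follows, where $d_l(\mathbf{x})$ is the signed decision value of the level-$l$ binary SVM and the weights $w_l$ are the linear coefficients (their values are not given in this text):

$$f(\mathbf{x}) = \sum_{l=1}^{L} w_l \, d_l(\mathbf{x}), \qquad \hat{y} = \operatorname{sign}\big(f(\mathbf{x})\big).$$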

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision surveillance, and in particular relates to a camera-network pedestrian recognition method based on multi-level deep feature fusion.

Background Art

[0002] In recent years, camera networks have been increasingly used for video surveillance in public places such as airports, subway stations, squares, and banks. The problem of matching pedestrian targets across multiple cameras with non-overlapping fields of view is called the pedestrian recognition problem; its purpose is to find one or several pedestrian targets of interest within the entire camera network.

[0003] Pedestrian recognition has long been a research hotspot in computer vision. With this technology, pedestrians in surveillance video can be analyzed automatically, replacing the traditional practice of manually monitoring large volumes of video data, greatly saving labor costs and improving video processing ...


Application Information

IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06V40/10, G06N3/045, G06F18/2411
Inventors: Wang Xun (王勋), Wang Huiyan (王慧燕), Yan Guoli (严国丽)
Owner: ZHEJIANG GONGSHANG UNIVERSITY