
Spatio-temporal information and deep network-based monitoring video object detection method

A surveillance video object detection technology based on a deep network, applied in image data processing, instruments, and character and pattern recognition, which addresses the problem that the detection time and performance of traditional methods can no longer meet growing demands.

Active Publication Date: 2018-07-20
GUANGDONG XIAN JIAOTONG UNIV ACADEMY +1

AI Technical Summary

Problems solved by technology

In addition, due to the influence of illumination, distance, occlusion, complex backgrounds and other factors, traditional methods can no longer meet the growing demand in terms of detection time and performance.

Method used




Embodiment Construction

[0067] The specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings and examples. The following embodiments serve to illustrate the present invention, not to limit its scope.

[0068] As shown in Figure 1, the surveillance video object detection method based on spatio-temporal information and a deep network in this embodiment includes three parts: deep feature extraction, generation of moving-object candidate boxes and predicted candidate boxes, and RoI classification and position adjustment. The present invention can use different deep neural networks to extract multi-scale deep features; in this example, the VGG16 network and PVANET are used to extract features. VGG16 has 13 convolutional layers and 5 max-pooling layers, and the outputs of these 13 convolutional layers are used as the input of the moving-object candidate region generation part. Similarly...
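
To make the deep feature extraction step concrete, the following is a minimal sketch, not code from the patent, of collecting intermediate convolutional feature maps from a VGG16 backbone with PyTorch/torchvision; the selected layer indices and the 512x512 input size are illustrative assumptions.

# Hypothetical sketch: multi-scale deep feature extraction from a VGG16 backbone.
# Layer choices and input resolution are assumptions for illustration only.
import torch
import torchvision.models as models

# Convolutional part of VGG16 (13 conv layers, 5 max-pooling layers).
# weights=None here; in practice the network would first be trained on the labeled data set.
backbone = models.vgg16(weights=None).features.eval()

# ReLU outputs after the last conv layer of each of the 5 blocks, one scale per block.
SCALE_LAYERS = {3: "conv1_2", 8: "conv2_2", 15: "conv3_3", 22: "conv4_3", 29: "conv5_3"}

def extract_multiscale_features(frame: torch.Tensor) -> dict:
    """Run one video frame through the backbone and collect feature maps at several scales."""
    feats = {}
    x = frame
    with torch.no_grad():
        for idx, layer in enumerate(backbone):
            x = layer(x)
            if idx in SCALE_LAYERS:
                feats[SCALE_LAYERS[idx]] = x
    return feats

# Example: a single RGB frame of size 512x512 (batch of 1).
frame = torch.randn(1, 3, 512, 512)
for name, fmap in extract_multiscale_features(frame).items():
    print(name, tuple(fmap.shape))

In such a layout the collected feature maps would then serve as the input to the moving-object candidate region generation part described above.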



Abstract

The invention discloses a surveillance video object detection method based on spatio-temporal information and a deep network. The method comprises the steps of: collecting and labeling a data set and training a deep convolutional neural network; extracting robust multi-scale deep features with the trained network; extracting moving-object candidate regions based on the multi-scale deep features; predicting the position of a target in the next frame from the detection results in the preceding and following frames of the video; performing RoI normalization on the moving-object candidate regions and the predicted candidate regions, classifying and regressing the feature vectors, and obtaining a preliminary detection result; and, based on the motion and prediction information, fine-tuning the preliminary result to obtain a more accurate detection. The invention comprehensively exploits the rich spatio-temporal information contained in the video, so that redundant candidate boxes are greatly reduced by means of motion and prediction. At the same time, the instability of detection results obtained from single frames is alleviated. Compared with other region-based object detection methods, both detection speed and detection accuracy are improved.
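
As a rough illustration of the prediction step mentioned in the abstract, predicting a target's position in the next frame from detections in neighboring frames, here is a minimal sketch that assumes a simple constant-velocity extrapolation of bounding boxes; the box format, the IoU matcher, and the example numbers are assumptions, not the patent's actual procedure.

# Hypothetical sketch: constant-velocity prediction of a target's bounding box
# in the next frame from its detections in the two preceding frames.
# The (x1, y1, x2, y2) box format and all numbers below are illustrative assumptions.
from typing import Tuple

Box = Tuple[float, float, float, float]

def predict_next_box(box_prev: Box, box_curr: Box) -> Box:
    """Extrapolate the box one frame forward assuming constant velocity."""
    return tuple(2 * c - p for p, c in zip(box_prev, box_curr))

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union, usable to match a predicted box against detections."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

# Example: a target moving right by about 5 pixels per frame.
prev_box = (100.0, 50.0, 140.0, 130.0)
curr_box = (105.0, 50.0, 145.0, 130.0)
pred_box = predict_next_box(prev_box, curr_box)
print("predicted next-frame box:", pred_box)                      # (110, 50, 150, 130)
print("overlap with current detection:", round(iou(pred_box, curr_box), 3))

Predicted boxes of this kind, together with the moving-object candidate regions, would then be passed to the RoI classification and position-adjustment stage.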

Description

Technical field
[0001] The invention belongs to the technical field of computer digital image processing and pattern recognition, and in particular relates to a surveillance video object detection method based on spatio-temporal information and a deep network.
Background technique
[0002] Today, the large number of cameras deployed in cities can capture uninterrupted visual surveillance information of important areas, which plays an important role in urban informatization, urban intelligent transportation and urban security. According to IMS Research statistics, in 2016 shipments of urban surveillance cameras reached 138 million units, and the amount of surveillance video generated every day reached thousands of petabytes. The surveillance video business has entered an era of gushing data. For example, Shenzhen currently has more than 1.34 million cameras, and the use of cameras in Shenzhen has accounted for 50% of the total number of criminal cases so...


Application Information

IPC(8): G06K9/00; G06K9/46; G06T5/00; G06T5/10; G06T5/30; G06T7/246; G06T7/254
CPC: G06T5/10; G06T5/30; G06T7/246; G06T7/254; G06T2207/20081; G06T2207/20084; G06T2207/20032; G06T2207/10016; G06T2207/30232; G06T2207/20104; G06V20/40; G06V20/52; G06V10/44; G06T5/70
Inventor: 钱学明, 汪寒雨, 侯兴松, 邹屹洋
Owner: GUANGDONG XIAN JIAOTONG UNIV ACADEMY