Method for adaptively quantizing optical flow features on complex video monitoring scenes

An adaptive quantization technology for video monitoring, applied in TV, color TV, closed-circuit TV systems, etc., which addresses the problems that raising the quantization accuracy increases the data volume, lowering it loses spatial position and direction resolution, and a fixed accuracy ignores the motion distribution characteristics of the video monitoring scene, so as to achieve a more discriminative representation.

Publication Date: 2014-06-18 (Status: Inactive)
SHANGHAI JIAO TONG UNIV


Problems solved by technology

[0004] This fixed quantization method faces three problems: 1) reducing the quantization accuracy causes a loss of spatial position and direction resolution; 2) increasing the quantization accuracy reduces this loss but increases the data volume; 3) a uniform quantization accuracy is applied without considering the actual motion distribution characteristics of the video surveillance scene.



Examples


Embodiment 1

[0036] The video sequence used in this embodiment comes from the QMUL traffic database (The Queen Mary University of London), with a frame rate of 25 fps and a resolution of 360×288. Figure 2 shows the video surveillance scene. The QMUL database, produced by Queen Mary, University of London, is dedicated to the analysis of complex video surveillance scenes.

[0037] As shown in Figure 1, this embodiment includes the following specific steps:

[0038] Step 1: Probabilistic denoising of optical flow features. The specific steps are:

[0039] 1.1) Count the number of optical flow features generated at each spatial point (x, y) in the video scene and normalize the counts:

P(x, y) = A(x, y) / Σ_i A(x_i, y_i)

where P(x, y) is the occurrence probability of an optical flow feature at the spatial point (x, y), and A(x_i, y_i) is the number of optical flow features counted at the point (x_i, y_i). Com(a, b) = …
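To make step 1.1 concrete, here is a minimal Python sketch. Only the per-point counting and the normalization into P(x, y) come from the description above; the function name, the `prob_threshold` parameter, and the final thresholding that discards low-probability features are assumptions about how the "probabilistic denoising" might be carried out.

```python
import numpy as np

def probabilistic_denoise(flow_points, height, width, prob_threshold=1e-4):
    """Sketch of step 1.1 (assumed behaviour): count optical flow features
    per spatial point, normalize the counts into an occurrence-probability
    map P(x, y), and discard features at points whose probability is
    negligible, treating them as noise.

    flow_points: iterable of (x, y) coordinates where optical flow
                 features were detected across the video sequence.
    """
    # A(x, y): number of optical flow features observed at each point
    counts = np.zeros((height, width), dtype=np.float64)
    for x, y in flow_points:
        counts[y, x] += 1

    # P(x, y) = A(x, y) / sum_i A(x_i, y_i)
    total = counts.sum()
    prob_map = counts / total if total > 0 else counts

    # Keep only features at points with non-negligible occurrence probability
    kept = [(x, y) for x, y in flow_points if prob_map[y, x] >= prob_threshold]
    return prob_map, kept
```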


Abstract

The invention belongs to the technical field of digital image processing and relates to a method for adaptively quantizing optical flow features in complex video monitoring scenes. In the method, probabilistic denoising is first applied to the optical flow features over the video space and local statistical features are computed; the spatial positions of the video are then adaptively quantized, dividing the video space into a number of micro-block regions; finally, each micro-block region is filtered with a motion complexity threshold, the number of quantization levels is determined, and a visual dictionary is generated, achieving adaptive quantization. The method uses local statistics of the optical flow to describe the effectiveness and diversity of motion in video monitoring scenes. The effective pixel ratio and the motion complexity feature are fused to describe the activity of local motion, enabling adaptive quantization of the optical flow feature positions. The motion complexity feature describes the diversity of local motion, enabling adaptive quantization of the optical flow feature directions. Adaptive quantization of the optical flow features thus provides better discriminability in subsequent scene analysis based on a bag-of-words model.
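To show how the ideas in the abstract fit together, the following Python sketch combines an effective pixel ratio with a motion complexity measure to decide, per region, whether it is quantized at all and how many direction bins it receives. This is a simplified reading under stated assumptions, not the patented method: the block grid is fixed rather than adaptively positioned, the entropy-based complexity measure and all thresholds and bin counts are hypothetical placeholders, and the invention's actual formulas are not reproduced here.

```python
import numpy as np

def adaptive_quantize(prob_map, flow_angles_per_point, block_size=8,
                      activity_threshold=0.5, complexity_threshold=1.0,
                      coarse_bins=4, fine_bins=8):
    """Illustrative sketch of the adaptive quantization pipeline.

    prob_map             : denoised occurrence-probability map P(x, y) (2-D array)
    flow_angles_per_point: dict mapping (x, y) -> list of flow directions (radians)

    The fixed block grid, the entropy-based complexity measure and every
    threshold below are assumptions made for illustration only.
    """
    h, w = prob_map.shape
    dictionary = []  # visual words: (block top-left corner, direction bin count)

    for by in range(0, h, block_size):
        for bx in range(0, w, block_size):
            block = prob_map[by:by + block_size, bx:bx + block_size]

            # Effective pixel ratio: fraction of the block that ever moves.
            effective_ratio = np.count_nonzero(block) / block.size
            if effective_ratio < activity_threshold:
                continue  # inactive region: not assigned any visual word

            # Gather the flow directions observed inside this block.
            angles = [a
                      for y in range(by, min(by + block_size, h))
                      for x in range(bx, min(bx + block_size, w))
                      for a in flow_angles_per_point.get((x, y), [])]
            if not angles:
                continue

            # Motion complexity: entropy of the direction histogram.
            hist, _ = np.histogram(angles, bins=fine_bins, range=(-np.pi, np.pi))
            if hist.sum() == 0:
                continue
            p = hist / hist.sum()
            complexity = -np.sum(p[p > 0] * np.log(p[p > 0]))

            # Diverse local motion gets a finer direction quantization.
            n_bins = fine_bins if complexity > complexity_threshold else coarse_bins
            dictionary.append(((bx, by), n_bins))

    return dictionary
```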

Description

Technical field

[0001] The invention relates to a method in the technical field of digital image processing, in particular to an adaptive quantization method for optical flow features in complex video monitoring scenes.

Background technique

[0002] Computer vision technology is becoming increasingly important in intelligent video surveillance, for applications such as traffic flow monitoring, event detection and congestion detection. Behavior analysis is a fundamental task in these computer vision applications. However, given the complexity of environmental conditions, such as lighting and weather changes or crowding, behavior analysis still faces challenges. Current research on behavior analysis falls mainly into two categories. The first is based on object tracking features; however, in complex scenes there is still a lack of reliable multi-target tracking algorithms, and it is difficult for tracking algorithms to adapt to sudden motion changes in complex scenes…


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20, H04N5/21, H04N7/18
Inventors: 樊亚文, 郑世宝, 吴双
Owner: SHANGHAI JIAO TONG UNIV