Space-time classification-based dynamic background differential detection method, system and device

A dynamic-background moving-target detection technology, applied in image analysis, image data processing, instruments, etc. It addresses the problems of reduced detection accuracy and impaired detection effect, and achieves a large sampling range, an enhanced ability to describe the dynamic background, and strong sample representativeness.

Active Publication Date: 2018-01-12
SUN YAT SEN UNIV

Problems solved by technology

Methods based on local texture features segment the foreground object and the background according to the texture smoothness of different components in the video scene. The limitation of this type of method is that it requires manually designed features with good discriminative power.
Therefore, the background difference method combined with spatial neighborhood information (i.e., the second type of method) is better suited to dynamic backgrounds.



Examples


Embodiment 1

[0096] The present invention proposes a new dynamic background difference detection method based on spatio-temporal classification. When establishing the background model, the method adopts group sampling, whereas the prior art initializes the background model directly from consecutive video frames; the method of the present invention can therefore obtain more representative pixel samples and better represent the dynamic background. In the spatial classification step, the present invention distinguishes the categories of neighboring pixels and uses only pixels of the same category to further determine whether the central pixel is a real foreground pixel, whereas the prior art uses all neighboring pixels to describe the background pixel; if some of those neighboring pixels are in fact foreground pixels, the background pixel will be wrongly described and the detection effect impaired. The method adopted in the present invention can therefore effectively improve the accuracy of moving target detection.
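The two ideas described above, group sampling for model initialization and same-category neighborhood refinement, can be sketched roughly as follows. This is a minimal illustration in Python/NumPy, not the patent's exact algorithm: the function names, the grouping scheme, and the thresholds `sim` and `min_same_bg` are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_model_group_sampling(frames, n_samples):
    """Initialize the per-pixel background model by group sampling:
    split the frame history into n_samples groups and draw one frame
    per group, instead of taking the first n_samples consecutive
    frames. This widens the sampling range over time."""
    frames = np.asarray(frames)                          # (T, H, W)
    groups = np.array_split(np.arange(len(frames)), n_samples)
    picks = [rng.choice(g) for g in groups]              # one random frame per group
    return np.stack([frames[i] for i in picks], axis=-1) # (H, W, n_samples)

def refine_center_pixel(center_val, center_is_fg, neigh_vals, neigh_is_bg,
                        sim=20, min_same_bg=2):
    """Spatial refinement: a coarse foreground pixel stays foreground
    only if fewer than min_same_bg of its *same-category*
    (similar-intensity) neighbors are background pixels."""
    if not center_is_fg:
        return False                                     # background stays background
    same_cat = np.abs(neigh_vals.astype(np.int32) - int(center_val)) <= sim
    n_same_bg = int(np.sum(same_cat & neigh_is_bg))
    return n_same_bg < min_same_bg                       # True = remains foreground

# Usage: a 10-frame history reduced to a 5-sample model per pixel
frames = [np.full((2, 2), t, dtype=np.uint8) for t in range(10)]
model = init_model_group_sampling(frames, n_samples=5)

# A coarse foreground pixel whose similar neighbors are background
# is corrected back to background (returns False):
stays_fg = refine_center_pixel(100, True,
                               np.array([101, 103, 200]),
                               np.array([True, True, True]))
```

Note the dissimilar neighbor (value 200) is excluded from the vote, which is the point of the same-category restriction: foreground neighbors cannot corrupt the background description of the center pixel.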


Abstract

The invention discloses a space-time classification-based dynamic background differential detection method, a system and a device. The method comprises: establishing, by grouped sampling along the time sequence, a background model for each pixel in the image; classifying the pixels in the background model against the pixel to be detected to obtain a rough foreground mask image; then, taking each foreground pixel in the rough foreground mask image as a center, classifying the pixels within a preset neighborhood of that central pixel, and, according to the number of neighborhood pixels that are of the same category as the central pixel and belong to the background, either correcting the central pixel to a background pixel or keeping it as a foreground pixel. By adopting grouped sampling, the ability to describe the dynamic background is enhanced; since only pixels of the same category as the central pixel are used to judge whether a foreground pixel is a real foreground pixel, detection accuracy is improved. The method can be widely applied in the field of moving target detection.
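The coarse-mask step of the abstract, classifying the pixel to be detected against its per-pixel background samples, might look roughly like this. This is a hedged sketch: the matching rule with `radius` and `min_matches` is a common convention for sample-based per-pixel models, not necessarily the patent's exact classification rule.

```python
import numpy as np

def coarse_foreground_mask(frame, model, radius=20, min_matches=2):
    """Classify each pixel against its per-pixel sample model.
    model has shape (H, W, N): N intensity samples per pixel.
    A pixel is background if it lies within `radius` of at least
    `min_matches` of its samples; otherwise it is foreground (1)."""
    diff = np.abs(model.astype(np.int32) - frame[..., None].astype(np.int32))
    matches = (diff <= radius).sum(axis=-1)
    return (matches < min_matches).astype(np.uint8)  # 1 = foreground, 0 = background

# Usage: 5 background samples of intensity 100 per pixel; one pixel jumps to 200
model = np.full((2, 2, 5), 100, dtype=np.uint8)
frame = np.full((2, 2), 100, dtype=np.uint8)
frame[0, 0] = 200
mask = coarse_foreground_mask(frame, model)
```

Only the pixel that deviates from all of its samples is marked foreground in the rough mask; the spatial classification step described in the abstract would then refine this mask.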

Description

technical field

[0001] The invention relates to the field of moving target detection, and in particular to a dynamic background difference detection method, system and device based on spatio-temporal classification.

Background technique

[0002] Moving target detection is the basis of target recognition, tracking and subsequent understanding of object behavior, and is a research hotspot in the field of computer vision. The background difference method (also known as background subtraction) is the most commonly used method in moving target detection. Its basic principle is to detect moving targets by differencing the current frame image against a background image. The background difference method is fast, accurate and easy to implement; its key is the acquisition of the background image. In practical applications, it is affected by factors such as sudden changes in illumination and the fluctuation of some objects in the act...
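The basic principle stated in paragraph [0002], differencing the current frame against a background image and thresholding, can be illustrated with a minimal sketch; the threshold value here is an arbitrary assumption, not taken from the patent.

```python
import numpy as np

def background_difference(frame, background, threshold=30):
    """Basic background difference: a pixel is foreground (1) if its
    absolute difference from the background image exceeds the threshold."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return (diff > threshold).astype(np.uint8)

# Usage: a static background of intensity 100 and a frame with a bright object
background = np.full((4, 4), 100, dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200          # a 2x2 "moving object"
mask = background_difference(frame, background)
```

This simple scheme fails exactly in the situations the patent targets (dynamic backgrounds, illumination changes), which motivates the per-pixel sample models and spatio-temporal classification described later.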

Claims


Application Information

IPC IPC(8): G06T7/215; G06T7/246; G06K9/00
Inventor 李熙莹, 李国鸣
Owner SUN YAT SEN UNIV