
Feedback background extraction-based city traffic scene foreground target detection method

A technology for urban traffic and background extraction, applied to image data processing, instruments, computation, etc., to achieve the effect of realizing salient foreground detection

Pending Publication Date: 2018-01-09
XIAN UNVERSITY OF ARTS & SCI

AI Technical Summary

Problems solved by technology

[0004] The object of the present invention is to provide an urban traffic scene foreground target detection method based on feedback background extraction. The FViBe method is better than other methods at handling targets that move slowly or are temporarily stationary, and it is effective and can realize salient foreground detection in real time in complex urban traffic scenes, thereby solving the problems raised in the background technology above.



Examples


Embodiment 1

[0032] The present invention provides a technical solution: a method for detecting a foreground object in an urban traffic scene based on feedback background extraction, comprising the following steps:

[0033] Step 1: First establish a background model using a group of samples at each pixel position, and initialize the model using interval frames. In step S1, FViBe describes the background model M(x, y) of any pixel (x, y) by the set of the N most recently observed image values v_M(x, y), M ∈ [1, N]; the model is initialized with the most recent observations taken at a specified time interval at each position (x, y), as follows:

[0034] M(x, y) = {v_1(x, y), v_2(x, y), ..., v_N(x, y)} = {I_1(x, y), I_{1+K}(x, y), ..., I_{1+(M-1)×K}(x, y), ..., I_{1+(N-1)×K}(x, y)}   (1)

[0035] In the formula, N represents the number of samples in the model, K represents the time interval in the actual scene, I_1 refers to frame 1, and I_{1+(N-1)×K} refers to frame 1+(N-1)×K. ...
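As an illustration of Eq. (1) only (not the patent's implementation), the following Python/NumPy sketch shows how the interval-frame initialization of the background model could be written; the function name, the frame array, and the values of N and K are assumptions chosen for the example.

```python
import numpy as np

def init_background_model(frames, N=20, K=5):
    """Initialize a per-pixel background model M(x, y) from interval frames.

    Sketch of Eq. (1): the model at each pixel is the set of N image values
    sampled every K frames, i.e. frames 1, 1+K, ..., 1+(N-1)*K.
    `frames` is an array of shape (num_frames, H, W); N and K are
    illustrative defaults, not values prescribed by the patent.
    """
    needed = 1 + (N - 1) * K
    if len(frames) < needed:
        raise ValueError(f"need at least {needed} frames, got {len(frames)}")
    # 0-based indices of frames 1, 1+K, ..., 1+(N-1)*K
    indices = [m * K for m in range(N)]
    # model[:, y, x] = {v_1(x, y), ..., v_N(x, y)}
    return np.stack([frames[i] for i in indices], axis=0)
```

For a grayscale sequence `gray_frames` of shape (num_frames, H, W), `init_background_model(gray_frames)` returns an array of shape (N, H, W) whose N slices are the sampled frames I_1, I_{1+K}, ..., I_{1+(N-1)×K}.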

Embodiment 2

[0059] The present invention provides a specific example for illustration: urban traffic data under changing weather and lighting conditions were provided by the local traffic police detachment of Jining City, Shandong Province, collected between 7:00 am and 10:00 am by charge-coupled device cameras installed at three different intersections. The five detection methods used for comparison are GMM, ALW, SDC, ViBe and PBAS. The performance of these methods is compared using the original authors' settings or default parameters, and the GMM, ViBe and PBAS tests are based on the open source computer vision library (OpenCV). In order to deal with slow-moving or temporarily stopped vehicles in the TLD, comparison results for four representative frames are given. The detection results of the compared methods and of the proposed method are shown in the third to eighth rows of the figure. The first ...
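As a hedged illustration of how an OpenCV-based baseline of the kind mentioned above could be run (this is not the experimental setup used in the patent; the video path and parameters are placeholders), OpenCV provides a GMM-style background subtractor as MOG2:

```python
import cv2

# Minimal sketch of a GMM-style OpenCV baseline (MOG2); the clip name and
# parameters are illustrative, not the settings used in the experiments.
cap = cv2.VideoCapture("intersection.avi")  # hypothetical input clip
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)        # 0 = background, 255 = foreground
    # Drop shadow pixels (marked as 127 by MOG2) before further analysis
    _, fg_mask = cv2.threshold(fg_mask, 200, 255, cv2.THRESH_BINARY)
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The document states that GMM, ViBe and PBAS were all tested on top of OpenCV; the snippet above shows only the GMM-style subtractor that OpenCV exposes directly.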



Abstract

The invention discloses a feedback background extraction-based city traffic scene foreground target detection method. The method comprises the steps of: S1, first building a background model using the position of each pixel in a group of samples, and initializing the model using interval frames; S2, setting up a few counters and performing adaptive updating to describe the current traffic state and pixel stability; and S3, detecting the salient foreground and updating the model with an adaptive feedback learning rate. According to the feedback background extraction-based city traffic scene foreground target detection method, the FViBe method is superior to other methods in handling targets that move slowly or are temporarily stationary; the FViBe method is effective and can realize salient foreground detection in real time in complex city traffic scenes.
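Since the abstract only names the three steps, the following heavily hedged Python skeleton sketches how they could fit together; it is not the patent's FViBe algorithm, and all thresholds, function names and the simple feedback rule are assumptions made for illustration.

```python
import numpy as np

def fvibe_skeleton(frames, N=20, K=5, match_thresh=20, min_matches=2):
    """Hedged skeleton of the three steps named in the abstract, NOT the
    patent's actual FViBe algorithm: S1 interval-frame model initialization,
    S2 per-pixel counters for traffic state / pixel stability, S3 salient
    foreground detection with an adaptive feedback learning rate."""
    frames = [f.astype(np.int16) for f in frames]
    # S1: background model from interval frames (cf. Eq. (1) in Embodiment 1)
    model = np.stack([frames[m * K] for m in range(N)])
    stability = np.zeros(frames[0].shape, dtype=np.int32)  # S2: pixel counters
    rng = np.random.default_rng(0)
    for frame in frames[1 + (N - 1) * K:]:
        # S3: a pixel is foreground if too few samples lie within match_thresh
        matches = (np.abs(model - frame) < match_thresh).sum(axis=0)
        foreground = matches < min_matches
        # S2: counters grow while a pixel keeps agreeing with the model
        stability = np.where(foreground, 0, stability + 1)
        # Feedback (assumed rule): stable background pixels update more often
        update_prob = np.clip(stability / 100.0, 0.05, 1.0)
        do_update = (~foreground) & (rng.random(frame.shape) < update_prob)
        slot = rng.integers(N)
        model[slot][do_update] = frame[do_update]
        yield (foreground * 255).astype(np.uint8)
```

The sample-consensus test mirrors a ViBe-style comparison only to make the three abstract steps concrete; the patent's actual counters, thresholds and feedback learning rate are defined in the full description.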

Description

Technical field

[0001] The invention relates to the technical field of urban traffic scenes, in particular to a method for detecting foreground objects in urban traffic scenes based on feedback background extraction.

Background technique

[0002] Extracting salient foregrounds from complex urban traffic scenes is a focus of building intelligent transportation systems and has a wide range of applications in urban public safety. Video surveillance researchers have extensively studied automatic salient foreground detection techniques and proposed various salient foreground detection methods such as the optical flow method, the inter-frame difference method, and the background subtraction model. The optical flow method can be applied well in both moving and static cases; however, it is susceptible to noise and has a high computational cost, so it is rarely used for real-time salient foreground detection. The inter-frame difference method performs difference operations on two adjacent frames ...
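To make the inter-frame difference method mentioned above concrete, here is a minimal hedged sketch using OpenCV; the function name and threshold value are illustrative assumptions, not parameters from the patent.

```python
import cv2

def frame_difference(prev_gray, curr_gray, thresh=25):
    """Minimal sketch of the inter-frame difference method: subtract two
    adjacent grayscale frames and threshold the absolute difference to get a
    rough moving-foreground mask. The threshold (25) is an assumed value."""
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return mask
```

Pairs of consecutive grayscale frames can be fed to `frame_difference` to obtain a quick motion mask.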


Application Information

IPC(8): G06T7/215; G06T7/254
Inventor 李浩 (Li Hao); 张运胜 (Zhang Yunsheng)
Owner XIAN UNVERSITY OF ARTS & SCI