
Video-based flame detection method

A flame detection and video technology, applied in the field of fire detection, which can solve the problems of overly simple flame criteria, poor interference resistance, and high false alarm and missed alarm rates.

Active Publication Date: 2018-04-20
ZDST COMM TECH CO LTD

AI Technical Summary

Problems solved by technology

These flame detection methods consider only local characteristics of the flame, which easily leads to misjudgment, and they are suitable only for flame detection in certain specific scenarios.
[0004] There are already many video-based flame detection methods in the prior art, but most of them rely on a single, overly simple flame criterion, have poor interference resistance, and suffer from high false alarm and missed alarm rates. There is therefore a need for a more accurate and more adaptable approach to video-based flame detection.




Embodiment Construction

[0062] The present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. In the video-based flame detection method of the present invention, all candidate flame points in the video image that satisfy the color conditions are first found according to the characteristics of flame pixels in the RGB and YCbCr color spaces. The video image is then divided into blocks, the covariance matrix of the color and brightness attributes of the flame pixel set in each video block is calculated, and the upper or lower triangular part of this covariance matrix is extracted as a feature vector. A large number of positive and negative samples are randomly selected as a training set, an adaptive boosting (AdaBoost) classifier is trained on them, and the corresponding classification model is established. Finally, the feature vector of each candidate-flame video block is calculated and input into the trained AdaBoost classification model, whose output indicates whether a fire has occurred.
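As a rough illustration of the block-level feature described in this embodiment, the sketch below computes the covariance matrix of per-pixel color and brightness attributes over the candidate flame pixels of one video block and keeps its upper triangle as the feature vector. The six-channel attribute set (the BGR and YCrCb channels as OpenCV orders them), the helper name block_covariance_feature, and the handling of near-empty blocks are assumptions for illustration; the excerpt does not specify these details.

```python
# Minimal sketch of the per-block color/brightness covariance feature.
# Assumes an 8-bit BGR block and a boolean mask of candidate flame pixels
# (from the RGB/YCbCr color-detection step); details are illustrative only.
import numpy as np
import cv2


def block_covariance_feature(bgr_block, candidate_mask):
    """Upper-triangular part of the covariance matrix of color and brightness
    attributes, computed over the candidate flame pixels of one video block."""
    mask = candidate_mask.astype(bool)
    if mask.sum() < 2:                                    # too few pixels for a covariance
        return None
    ycrcb = cv2.cvtColor(bgr_block, cv2.COLOR_BGR2YCrCb)  # brightness/chroma channels
    attrs = np.concatenate(
        [bgr_block.astype(np.float32), ycrcb.astype(np.float32)], axis=2
    )                                                     # H x W x 6 attribute stack
    samples = attrs[mask]                                 # (n_pixels, 6) attribute vectors
    cov = np.cov(samples, rowvar=False)                   # 6 x 6 covariance matrix
    return cov[np.triu_indices(cov.shape[0])]             # 21-dimensional feature vector
```

Each block containing candidate flame pixels would yield one such vector; the vectors from labeled positive and negative sample blocks then form the training set for the AdaBoost classifier.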



Abstract

The invention discloses a video-based flame detection method, which solves the technical problem of improving flame detection accuracy. The video-based flame detection method comprises the following steps: obtaining a video image sequence; performing image preprocessing; performing color detection; performing flame feature extraction; and performing AdaBoost prediction. Compared with the prior art, the method finds all candidate flame points meeting the conditions in a video image according to the characteristics of flame pixels in the RGB and YCbCr color spaces; the video image is then divided into blocks; the covariance matrices corresponding to the color and brightness attributes of the flame pixel sets in the video blocks are calculated; the upper or lower triangular parts of these covariance matrices are extracted as feature vectors; the resulting feature vector set serves as the input of an AdaBoost classifier; the feature vectors are input into the AdaBoost classification model, and the output is a judgment of whether a fire has occurred. A fire can thus be detected in real time, the false alarm rate is reduced, and relatively high accuracy and strong robustness are achieved.
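The AdaBoost stage of the abstract can be sketched as follows, using scikit-learn's AdaBoostClassifier as a stand-in since the abstract does not name a particular implementation; the feature arrays below are synthetic placeholders for the block feature vectors.

```python
# Minimal sketch of AdaBoost training and prediction on block feature vectors.
# scikit-learn's AdaBoostClassifier is a stand-in; the data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 21))          # placeholder 21-dim covariance features
y_train = rng.integers(0, 2, size=1000)        # 1 = flame block, 0 = non-flame block

model = AdaBoostClassifier(n_estimators=100)   # boosted decision stumps by default
model.fit(X_train, y_train)

# At detection time each candidate-flame video block yields one feature vector;
# the model's prediction is the fire / no-fire judgment.
X_blocks = rng.normal(size=(4, 21))            # placeholder features for one frame
print("fire" if model.predict(X_blocks).any() else "no fire")
```

In practice the training and detection features would come from the block covariance step rather than from random numbers.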

Description

Technical Field

[0001] The invention relates to a safety monitoring method, and in particular to a fire detection method.

Background Art

[0002] Existing automatic fire monitoring methods mainly use smoke sensors, temperature sensors, and infrared detectors for detection; a fire that these sensors fail to detect in time can develop into a great disaster.

[0003] Early research on flame detection focused mainly on color models of flame. Chen et al. worked out rules among the red, green, and blue (RGB) channels that can distinguish flame pixels and used them to extract the flame area. Celik et al. established a generic rule-based classification of flame pixels in the YCbCr color space (Y is the luminance component, Cb the blue-difference chroma component, Cr the red-difference chroma component). Marbach et al. used the YUV color model (Y denotes luminance, U and V denote the chrominance components) to ...
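For illustration, the sketch below applies commonly cited forms of such RGB and YCbCr flame-pixel rules to a single frame; the threshold and the inequalities are assumptions and may differ from the exact rules used by Chen et al., Celik et al., or the present invention.

```python
# Illustrative RGB / YCbCr candidate flame-pixel rules; the threshold and the
# rule forms are assumptions, not the exact rules of the cited works.
import numpy as np
import cv2


def candidate_flame_mask(bgr_frame, r_threshold=115):
    """Boolean mask of pixels whose color is flame-like under assumed rules."""
    b, g, r = cv2.split(bgr_frame.astype(np.float32))
    ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)  # expects an 8-bit BGR frame
    y, cr, cb = cv2.split(ycrcb.astype(np.float32))
    rule_rgb = (r > r_threshold) & (r >= g) & (g > b)      # red-dominant pixels
    rule_ycbcr = (y > cb) & (cr > cb)                      # luminance and red chroma above blue chroma
    return rule_rgb & rule_ycbcr
```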


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/40, G06K9/46, G06K9/62
CPC: G06V20/41, G06V10/30, G06V10/56, G06F18/2148, G06F18/2411
Inventors: 周美兰, 王元鹏
Owner: ZDST COMM TECH CO LTD