Foreground segmentation method based on saliency detection

A foreground segmentation and saliency detection technology, applied in the field of computer vision, which facilitates subsequent processing and analysis, suppresses interference, and offers good real-time performance.

Active Publication Date: 2013-04-03
CHERY AUTOMOBILE CO LTD

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to provide a foreground segmentation method based on saliency detection, which solves the problem of extracting the region of interest in target detection. A multi-scale, multi-feature saliency detection algorithm is applied to images collected by cameras to generate a full-resolution saliency map with clear outlines, and the k-means clustering algorithm is then used for foreground segmentation.


Examples


Embodiment 1

[0044] (1) Apply a 3×3 median filter to the original image to remove the influence of impulse noise on saliency detection.
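A minimal sketch of this pre-filtering step, assuming OpenCV and NumPy; the file name and variable names are hypothetical and only illustrate the operation described above.

```python
import cv2

# Load the camera image (hypothetical file name) and apply a 3x3 median
# filter to suppress impulse (salt-and-pepper) noise before saliency detection.
img = cv2.imread("frame.png")
filtered = cv2.medianBlur(img, 3)
```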

[0045] (2) Extract the color, brightness, and direction features of the original image. Assuming that the original image is in RGB format, the extraction method is as follows:

[0046] a. Let r, g, and b be the red, green, and blue components of the image; the brightness feature is then obtained by the formula I = (r + g + b) / 3;

[0047] b. Convert the RGB color space to CIELAB space, and extract three color components of l, a, and b as color features;

[0048] c. Use Gabor filters in 4 directions of 0°, 45°, 90°, and 135° to filter the brightness map I respectively to obtain four direction features.

[0049] In this way, 8 feature maps are formed, denoted by the feature map set {F_i}, 1 ≤ i ≤ 8; a code sketch of this extraction is given below.
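A sketch of the feature extraction described in steps a–c, assuming OpenCV and NumPy; the function name, Gabor kernel size, and wavelength are assumptions, not values taken from the patent.

```python
import cv2
import numpy as np

def extract_feature_maps(bgr):
    """Return the 8 feature maps {F_i}: brightness, L, a, b, and 4 Gabor orientations."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    I = (r + g + b) / 3.0                        # brightness feature, I = (r+g+b)/3

    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)   # CIELAB color features l, a, b
    l, a, b_chan = cv2.split(lab.astype(np.float32))

    # Orientation features: Gabor filtering of the brightness map at
    # 0, 45, 90 and 135 degrees (kernel size and wavelength are assumptions).
    orientations = []
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        kernel = cv2.getGaborKernel((9, 9), sigma=2.0, theta=theta,
                                    lambd=8.0, gamma=0.5)
        orientations.append(cv2.filter2D(I, -1, kernel))

    return [I, l, a, b_chan] + orientations      # the feature map set {F_i}, 1 <= i <= 8
```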

[0050] (3) The feature map set is down-sampled at two additional scales, 1/2 and 1/4 of the original image size, and...
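A short sketch of this multi-scale step, continuing the hypothetical variables from the previous sketches; the patent text does not name a specific resampling method, so Gaussian pyramid down-sampling is used here as an assumption.

```python
def build_scales(feature_map):
    # Down-sample each feature map to 1/2 and 1/4 of the original resolution,
    # giving three scales per feature.
    half = cv2.pyrDown(feature_map)
    quarter = cv2.pyrDown(half)
    return [feature_map, half, quarter]

pyramids = [build_scales(f) for f in extract_feature_maps(filtered)]
```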

Embodiment 2

[0059] Figure 1 shows the flow chart of the foreground segmentation method based on saliency detection according to the present invention; its main steps are as follows:

[0060] (1) Input a color image I(x,y) in RGB format, perform 3×3 median filtering on I(x,y), and the filtered image I'(x,y) is

[0061] I'(x,y) = median(I(x+i, y+j)), -1 ≤ i ≤ 1, -1 ≤ j ≤ 1

[0062] (2) Extract color, brightness, and direction features according to the following rules

[0063] a. Let r, g, and b be the three components of image red, green and blue respectively, then the brightness feature can be obtained by the following formula: I=(r+g+b) / 3;

[0064] b. Convert the RGB color space to CIELAB space, and extract three color components of l, a, and b as color features;

[0065] c. Use Gabor filters in 4 directions of 0°, 45°, 90°, and 135° to filter the brightness map I respectively to obtain four direction features.

[0066] In this way, 8 feature maps are formed, and the feature map set {F_i...


Abstract

The invention relates to a foreground segmentation method based on saliency detection, which comprises the following steps: (1) inputting a color image in RGB (Red, Green, Blue) format; (2) subjecting the RGB color image to median filtering; (3) extracting color, brightness, and direction features to obtain color, brightness, and direction feature maps, which together form a feature map set containing eight feature maps; (4) subjecting the 8 maps in the feature map set to multi-scale sampling; (5) subjecting each feature map to Gaussian filtering and calculating the mean value of each feature map; (6) calculating the Euclidean distance between the Gaussian-blurred image and the mean value of each feature map; (7) obtaining color, brightness, and direction saliency maps; (8) fusing the above saliency maps to obtain a comprehensive saliency map; and (9) subjecting the comprehensive saliency map to foreground segmentation by means of k-means clustering. The foreground segmentation method can effectively suppress the interference of noise and background on the target and has good real-time performance.
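A minimal end-to-end sketch of steps (5)–(9) of the abstract, assuming OpenCV and NumPy and reusing the hypothetical extract_feature_maps helper from the sketch under Embodiment 1. The blur kernel, K=2, equal-weight fusion, and the single-scale simplification are assumptions; the patent may weight and combine the maps differently.

```python
import cv2
import numpy as np

def saliency_map(feature_map):
    """Per-feature saliency: distance between the Gaussian-blurred map and the map's mean value."""
    blurred = cv2.GaussianBlur(feature_map, (5, 5), 0)
    return np.abs(blurred - feature_map.mean())

def segment_foreground(features):
    """features: list of 2-D float32 feature maps, e.g. the 8 maps {F_i} from Embodiment 1."""
    # Fuse the per-feature saliency maps (equal weights assumed here) into
    # a comprehensive saliency map.
    sal = np.mean([saliency_map(f) for f in features], axis=0).astype(np.float32)

    # k-means clustering (K=2) on the saliency values; the cluster with the
    # higher center is taken as foreground.
    samples = sal.reshape(-1, 1)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(samples, 2, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)
    fg_label = int(np.argmax(centers))
    return (labels.reshape(sal.shape) == fg_label).astype(np.uint8) * 255

# Example usage, continuing the earlier sketches:
# mask = segment_foreground(extract_feature_maps(filtered))
```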

Description

technical field

[0001] The invention relates to the field of computer vision, in particular to a foreground segmentation method based on multi-scale and multi-feature saliency detection.

Background technique

[0002] With the continuous development of digital products and the Internet, more and more digital images need to be transmitted, processed, and utilized. Since foreground segmentation of important areas in an image is more conducive to the effective processing of data, how to quickly and accurately find potential information related to the target has become a research hotspot in the field of computer vision, which involves the problem of image salient region detection.

[0003] Human vision has the ability to quickly search for objects of interest, and this ability of visual attention is called visual saliency. Visual saliency is a perceptual property that makes an object, person, or pixel stand out relative to its surroundings, thereby gaining people's attentio...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00
Inventor: 孙锐, 陈军, 刘博, 王继贞
Owner: CHERY AUTOMOBILE CO LTD