Image cooperative segmentation method based on salient image fusion

An image co-segmentation technology, applied in the field of image processing, which can solve problems such as missing foreground objects and the inability to accurately segment foreground objects at their boundaries

Active Publication Date: 2019-03-08
HEBEI UNIV OF TECH

Problems solved by technology

[0006] The technical problem to be solved by the present invention is to provide an image co-segmentation method based on saliency map fusion. The method computes multiple saliency maps for each image using cues such as color, texture, and position, and fuses them at the pixel level to obtain a fused saliency map. It makes full use of the similarity between the foreground objects of images in the same group: inter-image cues are used to optimize the fused saliency map, and the optimized saliency map is then combined with global cues to perform co-segmentation. This overcomes the problems of the prior art in which foreground objects are missing from the segmentation result and cannot be accurately segmented at their boundaries



Examples


Embodiment 1

[0152] In this embodiment, the specific steps of the image co-segmentation method based on saliency map fusion are as follows:

[0153] In the first step, the input image group is pre-segmented:

[0154] Input the image group I = {I_1, I_2, ..., I_sum}. For each image I_j, j = 1, 2, ..., sum, use the SLIC method for pre-segmentation to obtain the superpixels sp = {sp_i, i = 1, 2, ..., n}, where n is given by formula (1),

[0155]

[0156] In formula (1), row and col are the numbers of rows and columns of image I_j. This completes the input of the 026Airshows image group from the iCoseg database and the pre-segmentation of its images;
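The SLIC pre-segmentation in the first step can be sketched as follows. This is a minimal, illustrative SLIC-style clustering in joint color-position space, written purely as an assumption: it uses RGB rather than CIELAB, enforces no connectivity, and takes a fixed segment count instead of the n from formula (1) (which is not reproduced in this excerpt).

```python
import numpy as np

def slic_like(image, n_segments=50, compactness=10.0, n_iter=3):
    """Illustrative SLIC-style superpixel clustering (not the full SLIC
    algorithm: RGB instead of CIELAB, no connectivity enforcement)."""
    h, w, _ = image.shape
    step = max(1, int(np.sqrt(h * w / n_segments)))
    # Initialize cluster centers on a regular grid.
    gy, gx = np.mgrid[step // 2:h:step, step // 2:w:step]
    cy = gy.ravel().astype(float)
    cx = gx.ravel().astype(float)
    ccol = image[gy.ravel(), gx.ravel()].astype(float)       # (k, 3)

    yy, xx = np.mgrid[0:h, 0:w]
    pix = image.reshape(-1, 3).astype(float)                 # (N, 3)
    pos = np.stack([yy.ravel(), xx.ravel()], 1).astype(float)

    labels = np.zeros(h * w, dtype=int)
    for _ in range(n_iter):
        # Joint distance: color term + compactness-weighted spatial term.
        dc = ((pix[:, None, :] - ccol[None, :, :]) ** 2).sum(-1)
        ds = (pos[:, None, 0] - cy[None, :]) ** 2 \
           + (pos[:, None, 1] - cx[None, :]) ** 2
        labels = (dc + (compactness / step) ** 2 * ds).argmin(1)
        # Move each center to the mean of its assigned pixels.
        for k in range(len(cy)):
            m = labels == k
            if m.any():
                cy[k], cx[k] = pos[m, 0].mean(), pos[m, 1].mean()
                ccol[k] = pix[m].mean(0)
    return labels.reshape(h, w)
```

Each pixel ends up labeled with the index of its superpixel, giving the sp = {sp_i} partition that the later steps operate on.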

[0157] The second step is to calculate the object map obj of image I_j:

[0158] For the image I_j from the first step above, use the objectness measure proposed in "Measuring the objectness of image windows" to output a group of bounding boxes for image I_j together with the probability that each bounding box contains a foreground object, and then the proba...
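One plausible reading of this step, sketched below, is that each bounding box's foreground probability is accumulated over the pixels it covers to form the per-pixel object map obj. The box coordinate convention and the sum-then-normalize rule are assumptions, since the excerpt is truncated before the exact rule.

```python
import numpy as np

def object_map(shape, boxes, probs):
    """Accumulate per-box foreground probabilities into a per-pixel map.
    `boxes` are (y0, x0, y1, x1) with exclusive lower-right corners; both
    the coordinate convention and the normalization are assumptions."""
    obj = np.zeros(shape, dtype=float)
    for (y0, x0, y1, x1), p in zip(boxes, probs):
        obj[y0:y1, x0:x1] += p          # every covered pixel gains p
    if obj.max() > 0:
        obj /= obj.max()                # normalize obj into [0, 1]
    return obj
```

Pixels covered by many high-probability boxes thus receive values near 1, matching the role of obj as an objectness prior in the later saliency computations.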


Abstract

The invention relates to the technical field of image processing, and in particular to an image co-segmentation method based on saliency map fusion, comprising the steps of: inputting an image group and pre-segmenting its images; calculating the object map obj of image Ij; calculating the DRFI-based saliency map Sd of image Ij; calculating the saliency map Sr of image Ij based on background-prior optimization; calculating the saliency map Ssca of image Ij based on nearest-neighbor optimization; calculating the saliency map Sdo of image Ij based on DRFI and objectness; calculating the saliency map Sro of image Ij based on background-prior optimization and objectness; and calculating the fused saliency map S. By combining co-segmentation with a data term and a smoothness term, the invention overcomes the problems of the prior art in which the foreground object is missing from the segmentation result and cannot be accurately segmented at its boundary.
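The pixel-level fusion of the maps Sd, Sr, Ssca, Sdo, and Sro into S, named in the abstract, can be illustrated by a per-pixel weighted average; the invention's actual fusion and inter-image optimization rules are more elaborate, so this is only an assumed sketch.

```python
import numpy as np

def fuse_saliency(maps, weights=None):
    """Pixel-level fusion of saliency maps by weighted averaging
    (an illustrative rule; the invention's actual fusion differs)."""
    maps = np.stack(maps)                            # (m, h, w)
    if weights is None:
        weights = np.full(len(maps), 1.0 / len(maps))
    S = np.tensordot(np.asarray(weights), maps, axes=1)
    return np.clip(S, 0.0, 1.0)                      # keep S in [0, 1]
```

With uniform weights this reduces to a simple per-pixel mean of the input maps; non-uniform weights let more reliable maps dominate the fused result.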

Description

technical field

[0001] The technical solution of the present invention relates to the technical field of image processing, and in particular to an image co-segmentation method based on saliency map fusion.

Background technique

[0002] The purpose of saliency map fusion is to fuse the saliency maps obtained by multiple methods so as to highlight the foreground target region and suppress the background region. Based on saliency map fusion, the prior information provided by the saliency maps is used for co-segmentation. A typical saliency map fusion method was proposed by Cao et al. in the paper "Self-Adaptively Weighted Co-saliency Detection via Rank Constraint", published in 2014, which uses the relationships among multiple saliency cues to obtain adaptive weights and obtains the final saliency map by weighting. In the weighting stage this method applies the same adaptive weight to all regions of the image, which makes it unable to highlight the fo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/136; G06T7/44; G06T7/90
CPC: G06T2207/20221; G06T7/136; G06T7/44; G06T7/90
Inventors: 于明, 郑俊, 刘依, 朱叶, 郝小可, 师硕, 于洋, 郭迎春, 阎刚
Owner: HEBEI UNIV OF TECH