
Method for measuring significance of image target

A method for measuring the saliency of an image target, applied in the field of image target saliency measurement. It addresses the prior-art problems of saliency values that are inaccurate and inconsistent with visual saliency, and achieves comprehensive and accurate calculation and accurate parameter acquisition.

Active Publication Date: 2018-03-09
郑州布恩科技有限公司

AI Technical Summary

Problems solved by technology

[0003] The object of the present invention is to provide a method for measuring the saliency of an image target. It solves the prior-art problem that the saliency of the target in the image is not considered, resulting in saliency values that are inaccurate and do not conform to visual saliency.

Method used



Examples


Embodiment 1

[0032] A method for measuring the saliency of an image target, comprising the steps of:

[0033] Step 1: Perform image-background segmentation on any target in the natural image to obtain the target saliency map, and from it obtain the coordinate set of the boundary positions;
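The patent does not give code for extracting boundary coordinates from the saliency map. A minimal sketch, assuming the convention described later in Embodiment 2 (target pixels marked 255, background 0, boundary pixels taking intermediate values); the threshold band `lo`/`hi` is an illustrative assumption:

```python
import numpy as np

def boundary_coordinates(saliency, lo=1, hi=254):
    # Boundary pixels are those strictly between the background
    # value (0) and the target value (255).
    rows, cols = np.where((saliency >= lo) & (saliency <= hi))
    return list(zip(rows.tolist(), cols.tolist()))

# Toy 4x4 saliency map: a 2x2 target region plus one boundary pixel (128).
sal = np.zeros((4, 4), dtype=np.uint8)
sal[1:3, 1:3] = 255
sal[0, 1] = 128
coords = boundary_coordinates(sal)
```

In practice the boundary band would come from the manual segmentation itself (e.g. anti-aliased edges) rather than a hand-placed pixel as in this toy example.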

[0034] Step 2: Continuously sample the natural image to obtain image blocks of N×N size, and combine the target saliency map to determine whether each image block belongs to the target image or the background image, and then generate the target image library and the background image library;
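Step 2 can be sketched as a sliding-window pass that assigns each N×N block to one of the two libraries. The purity threshold `frac` and the "fully target / fully background" classification rule are illustrative assumptions, not specified by the patent:

```python
import numpy as np

def build_libraries(image, saliency, N=8, frac=1.0):
    # Slide an N x N window over the image; classify each block by how
    # much of its saliency-map patch is target (255) or background (0).
    # Blocks straddling the boundary fall into neither library.
    targets, backgrounds = [], []
    H, W = saliency.shape
    for r in range(H - N + 1):
        for c in range(W - N + 1):
            patch_sal = saliency[r:r+N, c:c+N]
            block = image[r:r+N, c:c+N]
            if np.mean(patch_sal == 255) >= frac:
                targets.append(((r, c), block))      # store start coordinate
            elif np.mean(patch_sal == 0) >= frac:
                backgrounds.append(((r, c), block))
    return targets, backgrounds

# Toy example: left half of an 8x8 image is target, right half background.
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
sal = np.zeros((8, 8), dtype=np.uint8)
sal[:, :4] = 255
targets, backgrounds = build_libraries(img, sal, N=4, frac=1.0)
```

Storing the starting coordinate with each block matches the bookkeeping described in Embodiment 2 and is what later enables pairing blocks by proximity to a boundary.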

[0035] Step 3: At each of the coordinate positions of the M boundaries, screen the target image library and the background image library for the image blocks closest to the boundary, then pair them to obtain M pairs of target and background image blocks;
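A sketch of the pairing in step 3, under the assumption (mine, not the patent's) that "closest to the boundary" means smallest squared distance between a block's stored starting coordinate and the boundary point:

```python
def pair_at_boundaries(boundary_pts, targets, backgrounds):
    # targets / backgrounds are lists of ((row, col), block) entries,
    # as produced by the library-building step.
    pairs = []
    for (br, bc) in boundary_pts:
        def nearest(lib):
            return min(lib, key=lambda e: (e[0][0] - br) ** 2 + (e[0][1] - bc) ** 2)
        pairs.append((nearest(targets), nearest(backgrounds)))
    return pairs

# Toy example with string stand-ins for the actual pixel blocks.
targets = [((0, 0), "t0"), ((9, 9), "t1")]
backgrounds = [((0, 5), "b0"), ((9, 0), "b1")]
pairs = pair_at_boundaries([(0, 1)], targets, backgrounds)
```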

[0036] Step 4: Sparsely represent the target image block and the background image block respectively, and obtain the orientation, ...
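The patent leaves the sparse-representation procedure unspecified (the text is truncated after "orientation"). As a minimal stand-in, a greedy matching-pursuit routine over a fixed dictionary illustrates the kind of decomposition step 4 describes; the dictionary `D` and sparsity level `k` are assumptions for illustration only:

```python
import numpy as np

def matching_pursuit(x, D, k=3):
    # Greedily represent x as a k-sparse combination of the unit-norm
    # columns of dictionary D, returning coefficients and residual.
    r = np.asarray(x, dtype=float).copy()
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        proj = D.T @ r                      # correlation with each atom
        j = int(np.argmax(np.abs(proj)))    # best-matching atom
        coef[j] += proj[j]
        r = r - proj[j] * D[:, j]
    return coef, r

# Trivial check: with an identity dictionary, two pursuit steps recover
# a 2-sparse signal exactly.
D = np.eye(3)
x = np.array([2.0, 0.0, 1.0])
coef, residual = matching_pursuit(x, D, k=2)
```

In the patent's setting, an oriented dictionary (e.g. Gabor-like atoms) would let the dominant atoms' orientations serve as the per-block orientation parameters the step alludes to.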

Embodiment 2

[0047] When N is 16,

[0048] (1) Given any natural image, perform manual image-background segmentation on any target to obtain a binarized saliency map, in which the target region is marked 255, the background region is 0, and the boundary part takes values between the two; from the target saliency map, obtain the coordinate set of the boundary locations;

[0049] (2) Continuously sample the original natural image to obtain 16×16 image blocks; combined with the obtained target saliency map, judge whether each image block belongs to the target image or the background image, save it into the target image library or the background image library accordingly, and record its starting coordinate position;

[0050] (3) For each boundary, M coordinate positions are set; the image blocks closest to the boundary are screened from the target image library and the background image library and paired, yielding M pairs of target and background image blocks;

[0051] (4...



Abstract

The invention discloses a method for measuring the saliency of an image target, relating to the field of image target saliency algorithms. The method comprises the following steps: (1) carrying out manual image-background segmentation of the image target to obtain a target saliency map and the coordinate sets of the boundary positions; (2) sampling the image to obtain N×N image blocks and classifying them using the target saliency map, so as to build a target and a background image block library; (3) at the coordinate positions of M boundaries, screening out the target and background image blocks closest to each boundary and pairing them, so as to obtain M pairs of target and background image blocks; (4) carrying out sparse representation of each image block to obtain its parameters; and (5) integrating the image block pairs obtained in step (3) with the parameters obtained in step (4) to obtain the saliency value of an arbitrary target. The method solves the prior-art problems that the saliency value is inaccurate and inconsistent with visual saliency when the saliency factors of the target in the image are not considered, so that the computed saliency value of the image target accords with visual saliency.

Description

Technical field

[0001] The invention relates to the field of image target saliency algorithms, and in particular to a method for measuring the saliency of an image target.

Background technique

[0002] With the development of computer vision and image processing technology, image target saliency detection algorithms have emerged in large numbers, and judging their detection quality has gradually become a challenge. Because the saliency of a target differs across backgrounds, the output of a detection algorithm is affected accordingly. In the prior art, the saliency of an image target is measured by computing the difference between the saliency map output by a saliency algorithm and a manually calibrated saliency map, using it as an index of the amount of non-salient information and thereby evaluating the quality of the saliency map. However, this method does not consider the saliency of the target in the image, resulting in inaccurate saliency results that d...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/194; G06T7/00; G06T7/11; G06T7/33
CPC: G06T7/0004; G06T7/11; G06T7/194; G06T7/33
Inventor: 牛晓可, 王治忠, 王松伟, 朱民杰, 朱中美
Owner: 郑州布恩科技有限公司