
A method for measuring the saliency of image objects

A measurement method for image target saliency, applied in the field of image target saliency measurement. It addresses the problems that prior-art saliency values are inaccurate and inconsistent with visual saliency, and achieves comprehensive calculation, accurate parameter acquisition, and the avoidance of inaccurate calculation.

Active Publication Date: 2021-06-11
郑州布恩科技有限公司

AI Technical Summary

Problems solved by technology

[0003] The object of the present invention is to provide a method for measuring the saliency of an image target, which solves the problem that the prior art does not consider the salient factors of the target in the image, resulting in saliency values that are inaccurate and do not conform to visual saliency.



Examples


Embodiment 1

[0032] A method for measuring the saliency of an image target comprises the following steps:

[0033] Step 1: Perform image-background segmentation on any target in the natural image to obtain the target saliency map, and from it obtain the coordinate set of the boundary positions;

[0034] Step 2: Continuously sample the natural image to obtain image blocks of size N×N, use the target saliency map to judge whether each block belongs to the target or to the background, and thereby build a target image library and a background image library;

[0035] Step 3: At the M coordinate positions along each boundary, select from the target image library and the background image library the image blocks closest to that boundary and pair them, obtaining M pairs of target and background image blocks;

[0036] Step 4: Sparsely represent the target image block and the background image block respectively, and obtain the orientation, ...
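Steps 1 to 3 can be sketched in Python with NumPy. The helper names, the majority-vote rule for classifying a block, and the even sampling of M boundary points are illustrative assumptions; the patent does not fully specify these details.

```python
import numpy as np

def boundary_coords(mask):
    """Step 1: coordinate set of boundary positions, taken as pixels
    where the binary saliency mask changes value between 4-neighbours."""
    edges = np.zeros_like(mask, dtype=bool)
    edges[:-1, :] |= mask[:-1, :] != mask[1:, :]
    edges[:, :-1] |= mask[:, :-1] != mask[:, 1:]
    return np.argwhere(edges)  # array of (row, col) pairs

def sample_blocks(image, mask, N):
    """Step 2: slide an N×N window over the image and label each block
    target or background by majority vote over the saliency mask."""
    targets, backgrounds = [], []
    H, W = mask.shape[:2]
    for r in range(0, H - N + 1, N):
        for c in range(0, W - N + 1, N):
            block = image[r:r+N, c:c+N]
            if mask[r:r+N, c:c+N].mean() > 0.5:
                targets.append(((r, c), block))    # keep start coordinate
            else:
                backgrounds.append(((r, c), block))
    return targets, backgrounds

def pair_blocks(coords, targets, backgrounds, M):
    """Step 3: at M boundary coordinates (evenly sampled here), pick the
    nearest target block and nearest background block and pair them."""
    idx = np.linspace(0, len(coords) - 1, M).astype(int)
    pairs = []
    for rc in coords[idx]:
        near = lambda lib: min(lib, key=lambda b: np.hypot(*(rc - b[0])))
        pairs.append((near(targets), near(backgrounds)))
    return pairs
```

With a 32×32 image whose left half is target and N = 16, this yields two target blocks, two background blocks, and M paired samples along the vertical boundary.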

Embodiment 2

[0047] When N is 16,

[0048] (1) Given any natural image, perform manual image-background segmentation on any target to obtain a binarized saliency map, in which the target region is marked 255, the background region is 0, and the boundary region takes values between the two; from this target saliency map, obtain the coordinate set of the boundary positions;

[0049] (2) Continuously sample the original natural image to obtain 16×16 image blocks; combined with the obtained target saliency map, judge whether each block belongs to the target or to the background, save it into the target image library or the background image library accordingly, and record the starting coordinate of each block;

[0050] (3) For each boundary, set M coordinate positions; from the target image library and the background image library, screen the blocks closest to the boundary and pair them, obtaining M pairs of target and background image blocks;

[0051] (4...
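Step (4) is cut off here, but "sparse representation" of an image block over a dictionary is commonly computed with a pursuit algorithm. Below is a minimal orthogonal matching pursuit in NumPy, offered as an illustrative sketch: the random dictionary, atom count, and sparsity level k are assumptions, not taken from the patent.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal matching pursuit: approximate signal x as a k-sparse
    combination of the columns (atoms) of dictionary D."""
    residual = x.astype(np.float64).copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        # Atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit coefficients on the selected atoms (least squares).
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ sol
    coef[support] = sol
    return coef

rng = np.random.default_rng(0)
D = rng.standard_normal((256, 100))   # hypothetical dictionary:
D /= np.linalg.norm(D, axis=0)        # 100 unit-norm atoms for 16*16 blocks
block = rng.standard_normal(256)      # a flattened 16x16 image block
code = omp(D, block, k=5)             # 5-sparse code for the block
```

The resulting sparse coefficient vector is the kind of per-block parameter that step (4) would feed into the final saliency computation.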



Abstract

The invention discloses a method for measuring the saliency of an image object, relating to the field of image object saliency algorithms. The method comprises the following steps: 1) perform manual image-background segmentation on the object in the image to obtain a saliency map of the object, and obtain the coordinate set of the boundary positions; 2) sample the image into N×N image blocks and classify them using the target saliency map, obtaining a target image library and a background image library; 3) at the coordinate positions of the M boundary points, screen the target and background image blocks nearest each boundary and pair them, obtaining M pairs of target and background image blocks; 4) perform sparse representation on each image block to obtain its parameters; 5) integrate the parameters obtained in steps 3 and 4 to obtain the saliency value of any target. The invention solves the problem that the prior art does not take into account the salient factors of the target in the image, resulting in inaccurate saliency values that do not conform to visual saliency, and achieves the effect that the measured saliency value of an image target conforms to visual saliency.

Description

Technical field

[0001] The invention relates to the field of image object saliency algorithms, and in particular to a method for measuring image object saliency.

Background technique

[0002] With the development of computer vision and image processing technology, image target saliency detection algorithms have emerged in large numbers, and judging their detection quality has gradually become a challenge, because the saliency of a target differs across backgrounds and this affects the output of a detection algorithm. In the prior art, the saliency of an image object is measured by calculating the difference between the saliency map output by the saliency algorithm and a manually calibrated saliency map, using this difference as an index of the amount of non-salient information and thereby evaluating the quality of the saliency map. However, this method does not consider the saliency of the target in the image, resulting in inaccurate saliency results that d...
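The prior-art evaluation described above compares the algorithm's saliency map against a manually calibrated one. A common instance of such a difference measure is pixel-wise mean absolute error; a minimal sketch, where the normalisation to [0, 1] is an assumption about how the maps are encoded:

```python
import numpy as np

def saliency_mae(pred, gt):
    """Mean absolute error between a predicted saliency map and a
    manually calibrated ground-truth map, both normalised to [0, 1]."""
    pred = pred.astype(np.float64) / max(float(pred.max()), 1.0)
    gt = gt.astype(np.float64) / max(float(gt.max()), 1.0)
    return float(np.abs(pred - gt).mean())
```

A lower value means the output is closer to the manual calibration; the patent's point is that this index alone ignores how salient the target itself is.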

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/194, G06T7/00, G06T7/11, G06T7/33
CPC: G06T7/0004, G06T7/11, G06T7/194, G06T7/33
Inventors: 牛晓可, 王治忠, 王松伟, 朱民杰, 朱中美
Owner: 郑州布恩科技有限公司