
Position prior principle-based texture-color characteristic overall saliency detection method

A technology relating to color features and texture histograms, applied in image data processing, instruments, computing, and the like, addressing the problem that existing saliency detection results often fail to agree with human subjective evaluation.

Active Publication Date: 2016-08-31
CENT SOUTH UNIV
Cites: 4 | Cited by: 7

AI Technical Summary

Problems solved by technology

[0024] In actual processing, for images with textured backgrounds, or in which the salient object and the background share similar features, different saliency detection methods often disagree with one another, whereas human subjective evaluation usually reaches a consensus on the saliency of such images. Conversely, for images in which the salient object consists of several parts and its features differ markedly from the background, the results of the various models are usually consistent with one another, yet they do not agree with human subjective evaluation.



Examples


Embodiment Construction

[0070] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0071] As shown in Figure 1 and Figure 2, a position-prior-based texture-color feature global saliency detection method includes the following steps:

[0072] Step 1: Obtain the image to be detected;

[0073] Step 2: Use the position prior method to process the image to be detected, and obtain the position saliency map A(x), color saliency map B(x) and texture saliency map C(x);

[0074] Wherein, the position prior distribution value p(x) of each pixel in the position saliency map is calculated according to the following formula:

p(x) = exp(-d(x, c) / δ²)

[0076] where x denotes a pixel of the image to be detected f(x), d(x, c) denotes the distance between pixel x and the center point of the image to be detected, and δ is the standard deviation, with a value in the range 0.3 to 0.6;
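
As a concrete illustration of paragraphs [0074]-[0076], here is a minimal sketch (not the patented implementation) that builds the position saliency map A(x) from the prior p(x). The Euclidean distance to the image center and the normalization of d(x, c) by the half-diagonal are assumptions made so that δ in the range 0.3-0.6 stays meaningful at any resolution; the excerpt shown here does not specify them.

```python
import numpy as np

def position_prior_map(height, width, delta=0.5):
    """Position saliency map A(x) built from p(x) = exp(-d(x, c) / delta**2).

    Assumptions (not stated in the excerpt): d(x, c) is the Euclidean
    distance from pixel x to the image center, normalized by the
    half-diagonal so that it lies in [0, 1].
    """
    ys, xs = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    d = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    d /= np.sqrt(cy ** 2 + cx ** 2)          # normalize distance to [0, 1]
    return np.exp(-d / delta ** 2)

# Example: pixels near the image center receive prior values close to 1.
A = position_prior_map(480, 640, delta=0.4)
```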

[0077] Step 3: Linearly fuse the position saliency...
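
Step 3 is truncated in this excerpt. According to the abstract, the three saliency maps are normalized and then linearly fused into a master saliency map; the sketch below follows that reading, with equal fusion weights as an assumption since the excerpt does not give the actual weights.

```python
import numpy as np

def min_max_normalize(s):
    """Rescale a saliency map to the range [0, 1]."""
    s = s.astype(np.float64)
    span = s.max() - s.min()
    return (s - s.min()) / span if span > 0 else np.zeros_like(s)

def fuse_saliency(A, B, C, w=(1 / 3, 1 / 3, 1 / 3)):
    """Linear fusion of the position (A), color (B) and texture (C) maps.

    The equal weights are an assumption; the excerpt only states that the
    three maps are normalized and linearly fused.
    """
    return (w[0] * min_max_normalize(A)
            + w[1] * min_max_normalize(B)
            + w[2] * min_max_normalize(C))
```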



Abstract

The invention provides a position prior principle-based texture-color characteristic overall saliency detection method. Taking pixels as the basic unit, the method extracts the color features and texture features of the image under a position prior principle, computes the saliency value of a region from its contrast against the whole image, performs saliency detection based on this global contrast, generates a corresponding color saliency map and texture saliency map, and then normalizes the three saliency maps and fuses them into a master saliency map. Salient objects in the saliency map generated by the method can be identified readily and in agreement with human observation; precision and recall are improved, and the clarity and distinguishability of the generated saliency map are increased.
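
The abstract describes computing a region's saliency from its contrast against the whole image. The sketch below illustrates that global-contrast idea for the color channel only, using a quantized color histogram; the quantization level, the Euclidean color distance, and the histogram weighting are illustrative assumptions and are not taken from the claims.

```python
import numpy as np

def global_contrast_color_saliency(img, bins=12):
    """Per-pixel color saliency as the frequency-weighted distance of the
    pixel's quantized color to every other color in the image.

    img: H x W x 3 uint8 array. Quantizing each channel into `bins` levels
    and using Euclidean color distance are assumptions for illustration.
    """
    h, w, _ = img.shape
    q = (img.astype(np.int64) * bins) // 256                    # quantize channels
    labels = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]  # one bin id per pixel
    flat = labels.ravel()
    n_colors = bins ** 3

    counts = np.bincount(flat, minlength=n_colors)
    freq = counts / flat.size                                   # color frequencies

    # Mean RGB value of each occupied color bin.
    centers = np.zeros((n_colors, 3))
    for ch in range(3):
        centers[:, ch] = np.bincount(
            flat, weights=img[..., ch].ravel().astype(np.float64),
            minlength=n_colors)
    nonzero = counts > 0
    centers[nonzero] /= counts[nonzero][:, None]

    # Saliency of color c = sum over colors c' of freq(c') * ||c - c'||.
    dist = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    color_saliency = dist @ freq

    return color_saliency[labels]                               # map back to pixels

# Example (hypothetical input): B = global_contrast_color_saliency(rgb_image)
```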

Description

Technical Field

[0001] The invention belongs to the technical field of image retrieval and image recognition, and in particular relates to a position-prior-based texture-color feature global saliency detection method.

Background

[0002] The human visual system exhibits strong dynamic selectivity when perceiving the external environment, and this is also reflected in the physiological structure and mechanism of the visual nervous system. Taking the physiological structure of the eye as an example, at about 3.5 mm on the temporal side of the optic disc there is a small yellow area called the macula (macula lutea), whose central depression is called the fovea; the optic nerve cells are distributed most densely there, and the visual information perceived there is the most accurate. Although the fovea occupies only about 0.01% of the entire retinal surface, 10% of the information in the optic nerve is transmitted to the brain by the axons connected there. When people observe a scen...

Claims


Application Information

IPC(8): G06T7/40, G06T7/00
CPC: G06T2207/20221
Inventors: 陈再良, 魏浩, 沈海澜, 寇宏波, 薛奇, 彭鹏, 廖胜辉
Owner: CENT SOUTH UNIV