Image retrieval method based on neighborhood rotation right angle mode

An image retrieval technology based on a neighborhood rotation right-angle mode, applicable to character and pattern recognition, special data processing applications, instruments, etc. It addresses problems such as the loss of part of the image information and low recall and precision.

Active Publication Date: 2017-05-10
HARBIN UNIV OF COMMERCE

AI Technical Summary

Problems solved by technology

[0006] The present invention solves the problem that the spherically symmetric three-dimensional local ternary pattern (SS-3D-LTP) method used in the prior art is limited in its threshold selection and loses part of the image information in its gray-scale selection, resulting in low recall and precision.

Examples

Specific Embodiment 1

[0035] Specific Embodiment 1: An image retrieval method based on the neighborhood rotation right-angle mode comprises the following steps:

[0036] Step 1: Separate the three color channels R, G, and B of the color image, select the Haar wavelet basis, perform a two-dimensional discrete wavelet transform on each of the three channels, and take the LL low-frequency sub-bands of the three channels, namely cA_R, cA_G, and cA_B, as the planes selected for the experiment; here R is the red component, G is the green component, B is the blue component, cA_R is the low-frequency sub-band of the red component, cA_G is the low-frequency sub-band of the green component, and cA_B is the low-frequency sub-band of the blue component (a sketch of this step is given after the step list below);

[0037] Step 2: Based on the VLBP (Volume Local Binary Patterns) mode, extract the local pattern from the planes selected in Step 1;

[0038] Step 3: Using the neighborhood right-angle rotation m...
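
A minimal sketch of Step 1 above, assuming NumPy and PyWavelets are installed and the input is an H×W×3 RGB array; the helper name select_planes is illustrative, not taken from the patent.

```python
# Sketch of Step 1: per-channel one-level 2-D Haar DWT, keeping only the LL sub-bands.
import numpy as np
import pywt

def select_planes(rgb_image: np.ndarray):
    """Split the image into R, G, B channels, apply a one-level 2-D Haar DWT to
    each channel, and keep only the LL (low-frequency) sub-bands cA_R, cA_G, cA_B."""
    planes = []
    for channel in range(3):                              # 0 = R, 1 = G, 2 = B
        component = rgb_image[:, :, channel].astype(float)
        cA, (cH, cV, cD) = pywt.dwt2(component, 'haar')   # LL, (LH, HL, HH)
        planes.append(cA)                                 # only the LL sub-band is kept
    return planes                                         # [cA_R, cA_G, cA_B]
```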

Specific Embodiment 2

[0040] Specific Embodiment 2: This embodiment differs from Specific Embodiment 1 in that the specific process in Step 2 of extracting the local pattern, based on the VLBP mode, from the planes selected in Step 1 is as follows:

[0041] Arrange the selected planes vertically in the following order: the first plane is the low-frequency sub-band of the red component, the second plane is the low-frequency sub-band of the blue component, and the third plane is the low-frequency sub-band of the green component. On each plane, take the 3×3 pixel matrix centered on each pixel, that is, the matrix formed by that pixel and its eight surrounding neighbors; pixels at the border of a plane are expanded outward, with the expanded values equal to the border pixel values themselves;

[0042] The three planes together are equivalent to a cube; that is, the 3×3 pixel matrices on the three planes likewise form a cube. The local pattern of each 3×3 pixel-matrix cube is extracted in turn, and the extracted local m...
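
To make the geometry concrete, the following sketch stacks the LL sub-bands from Step 1 in the stated order (red, blue, green), replicates border pixels outward as described in [0041], and walks over the resulting 3×3×3 cubes; the helper name extract_cubes and the generator-style iteration are illustrative assumptions, not part of the patent.

```python
# Sketch of the cube construction in Embodiment 2, assuming the LL sub-bands
# from Step 1 are equally sized NumPy arrays; extract_cubes is a hypothetical name.
import numpy as np

def extract_cubes(cA_R: np.ndarray, cA_G: np.ndarray, cA_B: np.ndarray):
    """Stack the planes in the order red, blue, green, pad borders by edge
    replication, and yield the 3x3x3 cube centred on every pixel position."""
    volume = np.stack([cA_R, cA_B, cA_G], axis=0)                    # plane order: R, B, G
    padded = np.pad(volume, ((0, 0), (1, 1), (1, 1)), mode='edge')   # border value = own pixel value
    _, rows, cols = volume.shape
    for i in range(rows):
        for j in range(cols):
            yield padded[:, i:i + 3, j:j + 3]                        # one 3x3 matrix per plane -> a cube
```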

Specific Embodiment 3

[0044] Specific Embodiment 3: This embodiment differs from Specific Embodiment 1 or 2 in that, in Step 3, the neighborhood rotation right-angle mode is computed for the local pattern extracted in Step 2. The specific process is as follows:

[0045] Compute the neighborhood rotation right-angle mode on the extracted local pattern to obtain local ternary patterns; decompose each local ternary pattern into two local binary patterns, forming a total of 10 local binary patterns encoded with 0 and 1; then perform a weighted operation on the 10 local binary patterns to obtain the values of the 10 neighborhood rotation right-angle modes. The feature vector extraction process is shown in Figure 1.
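
The rule by which the neighborhood rotation right-angle mode produces the ternary codes is specific to the invention and is not reproduced here; the sketch below only shows the standard decomposition of a local ternary code (one value in {-1, 0, +1} per neighbor) into an "upper" and a "lower" local binary pattern and their conversion to decimal values by weighting with powers of two, which is the operation the paragraph above describes. The function name ltp_to_two_lbp_values is hypothetical.

```python
# Hedged sketch of the ternary-to-binary decomposition and weighting in Step 3;
# the ternary codes themselves are assumed to come from the patent's neighborhood
# rotation right-angle mode, which is not reimplemented here.
import numpy as np

def ltp_to_two_lbp_values(ternary_code: np.ndarray):
    """ternary_code: 1-D array of -1 / 0 / +1 values, one entry per neighbor."""
    upper = (ternary_code == 1).astype(int)       # +1 -> 1, everything else -> 0
    lower = (ternary_code == -1).astype(int)      # -1 -> 1, everything else -> 0
    weights = 2 ** np.arange(ternary_code.size)   # weighted operation: binary -> decimal
    return int(upper @ weights), int(lower @ weights)
```

If each neighborhood yields five such ternary codes, as the count of ten binary patterns above suggests, this decomposition gives the ten neighborhood rotation right-angle mode values.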

[0046] Other steps and parameters are the same as those in Embodiment 1 or Embodiment 2.

Abstract

The invention relates to an image retrieval method based on a neighborhood rotation right-angle mode, and solves the problems of the prior art that recall and precision are low because threshold selection is limited and part of the image information is lost in gray-scale selection. The method comprises the following steps: I, separating the three color channels R, G and B of a color image, performing a two-dimensional discrete wavelet transform on each of the three channels, and taking the low-frequency sub-bands of the three channels as the selected planes; II, extracting a local pattern from the planes selected in Step I based on the VLBP mode; III, calculating the neighborhood rotation right-angle mode values of the local pattern according to the neighborhood rotation right-angle mode; and IV, performing feature similarity measurement between the feature vector formed by the neighborhood rotation right-angle mode values and an image database, and evaluating the image retrieval result using recall and precision. The invention is applicable to the field of image retrieval.
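
As a rough illustration of Step IV, the sketch below compares a query feature vector against database feature vectors and computes recall and precision; the d1 distance shown is a common choice in LBP-style retrieval but is only an assumption, since the abstract does not name the similarity measure, and both function names are hypothetical.

```python
# Sketch of similarity measurement and recall/precision evaluation, assuming every
# image is described by a non-negative feature vector (e.g. a histogram of the ten
# neighborhood rotation right-angle mode values); the d1 distance is an assumed
# choice, not stated in the patent.
import numpy as np

def d1_distance(query_vec: np.ndarray, db_vec: np.ndarray) -> float:
    """Smaller distance means higher similarity."""
    return float(np.sum(np.abs(query_vec - db_vec) / (1.0 + query_vec + db_vec)))

def precision_recall(retrieved_labels, query_label, total_relevant: int):
    """Precision = relevant retrieved / number retrieved;
    recall = relevant retrieved / number of relevant images in the database."""
    relevant_retrieved = sum(1 for label in retrieved_labels if label == query_label)
    return relevant_retrieved / len(retrieved_labels), relevant_retrieved / total_relevant
```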

Description

Technical Field

[0001] The invention relates to an image retrieval method based on a neighborhood rotation right-angle mode.

Background Technique

[0002] The concept of content-based image retrieval (CBIR) was proposed by T. Kato in 1992. In his work, he built an image database based on color and shape and provided certain retrieval functions for experiments. Since then, the term has been used to describe the process of retrieving desired images from a large collection on the basis of image features. The core of CBIR technology is to use the visual features of images to retrieve images; it is essentially a similarity matching technique. At present, many kinds of features are used as indexes, and they can be roughly divided into two categories: low-level visual features and high-level semantic features. Low-level visual features mainly include color, texture, and shape; high-level semantic features refer to the meaning of images, which involves human recognition and understanding...

Application Information

IPC(8): G06F17/30, G06K9/62
CPC: G06F16/5838, G06F16/5862, G06F18/2193
Inventor: 赵志杰, 孙华东, 洪天昊, 金雪松, 张立志, 张艳荣, 陈铭, 范智鹏
Owner: HARBIN UNIV OF COMMERCE