Image retrieval method, device and system

An image retrieval technology, applied in special data processing applications, instruments, electrical digital data processing, etc., which addresses the problems of low retrieval accuracy, low retrieval efficiency and long retrieval time, and achieves the effects of improving retrieval efficiency, eliminating erroneous local feature matches and defining similarity precisely.

Active Publication Date: 2012-03-07
UNIV OF SCI & TECH OF CHINA

Problems solved by technology

[0008] In order to solve the above technical problems, the object of the present invention is to provide an image retrieval method, device and system.



Examples


Embodiment 1

[0039] Referring to Figure 1, which is a schematic flow chart of the image retrieval method provided in this embodiment, the method includes the following steps:

[0040] Step S101, extracting local features of the query image;

[0041] Step S102, quantizing the local features into visual words;

[0042] Wherein, the visual words may be defined in the following manner: extract local features from training images, cluster these local features, and take the center of each cluster as a visual word. Multiple visual words are obtained from the multiple cluster centers, and a visual word codebook containing the local features, the visual words and their correspondences can be generated. Since visual words carry no explicit semantic information, the size of the visual word codebook, that is, the number of visual words it contains, is generally determined empirically using experimental test data. In addition, in this embodiment, a visual word codebook ...
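As a hedged illustration (not the patent's reference implementation), the sketch below shows one common way such a codebook could be built and used: local feature descriptors from training images are clustered with k-means, each cluster center serves as a visual word, and each query descriptor is quantized to its nearest center. The function names, codebook size and use of scikit-learn are assumptions for illustration only.

```python
# Minimal sketch, assuming k-means clustering and Euclidean nearest-neighbor
# quantization; names and parameters are illustrative, not the patent's API.
import numpy as np
from sklearn.cluster import KMeans

def build_codebook(training_descriptors: np.ndarray, num_words: int) -> np.ndarray:
    """Cluster training descriptors; each cluster center is one visual word."""
    kmeans = KMeans(n_clusters=num_words, n_init=10, random_state=0)
    kmeans.fit(training_descriptors)
    return kmeans.cluster_centers_              # shape: (num_words, descriptor_dim)

def quantize(descriptors: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """Map each local feature descriptor to the id of its nearest visual word."""
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)                    # one visual-word id per descriptor

# Example with random stand-ins for 128-D SIFT-like descriptors; the codebook
# size (number of visual words) is chosen empirically, as noted above.
train = np.random.rand(5000, 128).astype(np.float32)
codebook = build_codebook(train, num_words=1000)
word_ids = quantize(np.random.rand(300, 128).astype(np.float32), codebook)
```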

Embodiment 2

[0062] This embodiment provides an implementation of spatially encoding the relative spatial positions of matched local features in an image, specifically as follows:

[0063] The relative positional relationship of two matched local features in the horizontal direction is described to obtain the horizontal spatial code map;

[0064] The relative positional relationship of two matched local features in the vertical direction is described to obtain the vertical spatial code map.

[0065] For a query image or a matching image, two spatial code maps are generated, denoted X-map and Y-map respectively. The X-map describes the relative spatial relationship of two matched local features along the horizontal direction (X-axis), and the Y-map describes their relative spatial relationship along the vertical direction (Y-axis). For example, given a query image I, there are K matching local featu...
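Since the example in [0065] is truncated, the exact encoding rule is not reproduced here; the following is a minimal sketch under the assumption that each map is a K×K binary matrix whose (i, j) entry records the horizontal (or vertical) order of matched features i and j. The function name and the specific comparison rule are illustrative assumptions.

```python
# Minimal sketch of one possible X-map / Y-map construction; the (i, j) entry
# encodes the relative horizontal (resp. vertical) order of features i and j.
# The exact rule used by the patent may differ from this assumption.
import numpy as np

def spatial_code_maps(positions: np.ndarray):
    """positions: (K, 2) array of (x, y) coordinates of K matched local features."""
    x, y = positions[:, 0], positions[:, 1]
    x_map = (x[None, :] < x[:, None]).astype(np.uint8)   # 1 if feature j has a smaller x-coordinate than feature i
    y_map = (y[None, :] < y[:, None]).astype(np.uint8)   # 1 if feature j has a smaller y-coordinate than feature i
    return x_map, y_map

# Example: 4 matched features in a query image.
query_positions = np.array([[10, 40], [25, 12], [60, 55], [80, 30]], dtype=float)
x_map_q, y_map_q = spatial_code_maps(query_positions)
```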

Embodiment 3

[0086] This embodiment provides an implementation of the spatial consistency check performed on the spatial coordinate positions of the matched local feature pairs in the spatial code maps of the query image and of the matching image, which may be specifically described as follows:

[0087] Suppose that, after the local features are quantized, the query image I_q and the matching image I_m have N pairs of matched local features. Then, in step S104 described in Embodiment 1, the relative spatial positions of these matched local features in I_q and in I_m are spatially coded to obtain the query image spatial code maps (GX_q, GY_q) and the matching image spatial code maps (GX_m, GY_m). To compare the consistency of the relative spatial positions of the matched local features between the query image and the matching image, an XOR operation is performed on GX_q and GX_m, and on GY_q and GY_m, as shown in formula (10) and formula (11):

[008...
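Although formulas (10) and (11) are truncated above, the text states that the check XORs GX_q with GX_m and GY_q with GY_m. A minimal sketch of that comparison follows; how the resulting inconsistency counts are used afterwards (for example, to discard falsely matched features) is only an assumption, not the patent's exact procedure.

```python
# Minimal sketch of the XOR comparison of the query and matching-image code maps.
# Entries equal to 1 mark feature pairs whose relative spatial order disagrees.
import numpy as np

def spatial_consistency(gx_q, gy_q, gx_m, gy_m):
    vx = np.bitwise_xor(gx_q, gx_m)                  # horizontal disagreements
    vy = np.bitwise_xor(gy_q, gy_m)                  # vertical disagreements
    # Per-feature inconsistency count: how often feature i disagrees with the rest.
    inconsistency = vx.sum(axis=1) + vy.sum(axis=1)
    return vx, vy, inconsistency

# Hypothetical follow-up: features whose rows are all zero in both XOR maps are
# spatially consistent; the count of such matched pairs can feed the ranking step.
```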



Abstract

The invention discloses an image retrieval method, an image retrieval device and an image retrieval system. The image retrieval method comprises the following steps: extracting the local features of a query image and quantizing the local features into visual words; querying a preset visual-word inverted list in an image database with the visual words to obtain matched local-feature pairs and matched images; spatially encoding the relative spatial positions of the matched local features in the query image and in the matched images to obtain the spatial code maps of the query image and of the matched images; performing a spatial consistency check on the spatial code maps of the query image and of the matched images to obtain the number of matched local-feature pairs that conform to spatial consistency; and, according to the numbers of spatially consistent matched local-feature pairs of the different matched images, ranking and returning the matched images according to their similarity. By using the method provided by the invention, the image retrieval accuracy and the retrieval efficiency can be improved, and the time consumed by retrieval can be reduced.
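To make the final step of the abstract concrete, here is a minimal, hypothetical sketch of the ranking stage: given, for each candidate image, the number of matched local-feature pairs that passed the spatial consistency check, the candidates are returned in descending order of that count, used here as a stand-in similarity score (the patent may define similarity differently).

```python
# Hypothetical ranking sketch: rank candidate images by the number of matched
# local-feature pairs that passed the spatial consistency check. The count is
# used as a stand-in similarity score; names and signature are illustrative.
from typing import Dict, List

def rank_matches(consistent_pair_counts: Dict[str, int], top_k: int = 10) -> List[str]:
    ranked = sorted(consistent_pair_counts.items(), key=lambda kv: kv[1], reverse=True)
    return [image_id for image_id, _ in ranked[:top_k]]

# Example with hypothetical candidate images and counts.
print(rank_matches({"img_001": 17, "img_042": 5, "img_007": 23}, top_k=2))
```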

Description

Technical field:

[0001] The invention relates to the field of data retrieval, and in particular to an image retrieval method, device and system.

Background technique:

[0002] In recent years, with the rapid development of Internet technology and the rapid popularization of digital devices, images on the Internet have reached a scale of hundreds of billions and have been growing exponentially. Faced with such a large amount of image data, effectively managing it so that users can easily find the images they are interested in, that is, image retrieval, is a very practical and challenging task. In image retrieval, one research hotspot is partially duplicated image retrieval. Partially copied images generally arise when a user cuts a piece out of an original image and pastes it into another image, adds some text to the original image, or applies a simple projective transformation to the original image. Partial-copy image retrieval has a broad ...

Claims


Application Information

IPC(8): G06F17/30
Inventor: Zhou Wengang (周文罡), Li Houqiang (李厚强), Tian Qi (田奇), Lu Yijuan (卢亦娟)
Owner UNIV OF SCI & TECH OF CHINA