
Characteristic point matching method based on relativity measurement

A correlation-measurement and feature-point-matching technology, applied in image data processing, instrumentation, computing, etc. It addresses problems such as poor matching performance, high time and space complexity, and lack of robustness to image noise.

Inactive Publication Date: 2009-07-22
INST OF AUTOMATION CHINESE ACAD OF SCI
Cites: 0 · Cited by: 14

AI Technical Summary

Problems solved by technology

Among methods based on image gradients, relatively simple descriptors have poor matching performance, while descriptors with better matching performance have high time and space complexity in their computation. Descriptors based on image differentiation are structurally more complex, and their outstanding shortcoming is that they are not robust to image noise; since real images often contain noise, such methods are difficult to satisfy in practice.



Examples


Example 1

[0069] Example 1 shows the matching result for two images of a Buddha statue scene, as shown in Figure 5. The two images involve a relatively large relative rotation, i.e., the second image was obtained by rotating the camera during capture. The NNDR criterion is used in the matching; based on experience, the NNDR threshold is set to 0.75. After removing candidate matching pairs whose NNDR value is greater than 0.75, 214 matching pairs are obtained with 0 incorrect pairs, for a matching accuracy of 100%.
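The NNDR (nearest-neighbor distance ratio) filtering used in the examples can be sketched as follows. This is a minimal NumPy illustration under the assumption that feature descriptors are rows of a 2-D array; it is not the patent's implementation, and `nndr_match` is a hypothetical helper name.

```python
import numpy as np

def nndr_match(desc1, desc2, threshold=0.75):
    """Match descriptors with the nearest-neighbor distance ratio (NNDR).

    A candidate match is kept only if the Euclidean distance to the
    nearest neighbor is at most `threshold` times the distance to the
    second-nearest neighbor.
    """
    matches = []
    for i, d in enumerate(desc1):
        # Euclidean distances from descriptor i to every descriptor in desc2
        dists = np.linalg.norm(desc2 - d, axis=1)
        nn, nn2 = np.argsort(dists)[:2]
        if dists[nn] <= threshold * dists[nn2]:
            matches.append((i, int(nn)))
    return matches
```

With the threshold of 0.75 used in the examples, ambiguous matches (whose best and second-best distances are close) are discarded, which is what keeps the reported error counts low.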

Example 2

[0070] Example 2 shows the matching result for a rocky scene, as shown in Figure 6. The two images involve a relatively large viewpoint change. The NNDR criterion is used with a threshold of 0.75; 449 matching pairs are obtained, 4 of which are incorrect, for a matching accuracy of 98.89%.



Abstract

The invention relates to a feature point matching method based on correlation measurement, which comprises: shooting multiple scene images to be matched and inputting them into a computer; calculating the gradient of each pixel and extracting the feature point information of the images; for each extracted feature point, dividing a circular neighborhood centered on the feature point into blocks and calculating the gradient mean of each resulting subregion; using the gradient of each pixel in a subregion and the subregion's gradient mean to establish a Harris correlation matrix for the subregion, and calculating its determinant and trace; using the determinant and trace to construct a Harris correlation measure, and using that measure to construct a Harris correlation descriptor; and calculating the Euclidean distance between feature point descriptors and applying matching criteria to carry out the matching. The method does not need camera calibration or human participation; matching is completed automatically. It is simple and practical, produces dense matching points, and offers high matching precision and good robustness.
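The per-subregion computation described in the abstract can be sketched roughly as follows. This assumes the classic Harris-style response det(M) − k·tr(M)² as the measure built from the determinant and trace; the patent's exact correlation measure and block layout may differ, and `region_harris_measure` and `descriptor` are hypothetical names for illustration.

```python
import numpy as np

def region_harris_measure(gx, gy, k=0.04):
    """Harris-style measure for one subregion from its pixel gradients.

    gx, gy: 1-D arrays of the x/y gradient components of the pixels in
    the subregion. Gradients are centered on the subregion's gradient
    mean before forming the 2x2 correlation matrix.
    """
    dx = gx - gx.mean()
    dy = gy - gy.mean()
    # 2x2 Harris correlation matrix of the centered gradients
    M = np.array([[np.sum(dx * dx), np.sum(dx * dy)],
                  [np.sum(dx * dy), np.sum(dy * dy)]])
    det = np.linalg.det(M)
    tr = np.trace(M)
    # Measure built from the determinant and trace of M
    return det - k * tr ** 2

def descriptor(subregions, k=0.04):
    """Concatenate per-subregion measures and L2-normalize.

    subregions: list of (gx, gy) gradient arrays, one per block of the
    circular neighborhood around the feature point.
    """
    v = np.array([region_harris_measure(gx, gy, k) for gx, gy in subregions])
    n = np.linalg.norm(v)
    return v / n if n > 0 else v
```

The resulting descriptor vectors are then compared by Euclidean distance and filtered with a matching criterion such as NNDR, as described above.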

Description

Technical Field

[0001] The invention belongs to the technical field of computer-aided automatic image processing, and in particular relates to a technology for matching image features.

Background

[0002] In daily life and in many fields, it is essential to process electronic images, in particular to extract and retrieve specific information from them, for example in image retrieval, object segmentation and recognition, 3D reconstruction, and augmented reality. These tasks involve the matching of image features, especially feature points. The traditional method is based on the gray values of image pixels; although simple and easy to implement, it uses too little information and the matching results are not ideal.

[0003] In recent years, with the development and application of computer vision, new methods have been explored, among which the most typical are methods based on image gradients and methods based on image differentiation. I...

Claims


Application Information

IPC(8): G06T7/00; G06T7/33
Inventors: 王旭光 (Wang Xuguang), 吴福朝 (Wu Fuchao), 胡占义 (Hu Zhanyi)
Owner INST OF AUTOMATION CHINESE ACAD OF SCI