
Projective transformation image matching method based on transformation-invariant low-rank texture

A projective transformation image matching method, applied in the field of projective-transformation image matching. It solves the problem that prior methods cannot complete matching between projectively transformed images, and achieves fewer mismatched point pairs together with a high feature-point repetition rate and a high correct matching rate.

Active Publication Date: 2019-03-26
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to address the shortcoming of the above-mentioned prior art, namely its inability to complete projective transformation image matching, by proposing a projective transformation image matching method based on transformation-invariant low-rank texture. The method eliminates the projective distortion of the input images through the TILT transformation, thereby reducing the projectively transformed image matching problem to a similarity-transformation matching problem and obtaining more accurate matching point pairs.




Embodiment Construction

[0030] Specific implementation.

[0031] The present invention will be further described below in conjunction with the accompanying drawings.

[0032] Referring to Figure 1, the implementation steps of the present invention are as follows:

[0033] Step 1, input the reference image and the image to be matched.

[0034] Input two images taken from two different viewing angles and related by a projective transformation: one as the reference image A, the other as the image to be matched B.

[0035] Step 2: perform low-rank texture region detection on the two input images respectively, obtaining the low-rank texture region U_A in the reference image A and the low-rank texture region U_B in the image to be matched B.
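The "low rank" in step 2 refers to the rank of an image patch viewed as a 2-D matrix: regular textures (stripes, grids, building facades) have few independent rows or columns, while unstructured content does not. A minimal numpy sketch of this idea (the 64x64 test patches are illustrative, not from the patent):

```python
import numpy as np

def numerical_rank(patch, rel_tol=1e-6):
    """Count singular values above rel_tol times the largest one."""
    s = np.linalg.svd(patch, compute_uv=False)
    return int(np.sum(s > rel_tol * s[0]))

# a regular stripe texture: all rows identical, so the matrix has rank 1
stripes = np.tile(np.sin(np.linspace(0, 4 * np.pi, 64)), (64, 1))

# unstructured noise is (numerically) full rank
noise = np.random.default_rng(0).standard_normal((64, 64))
```

A detector along these lines would scan candidate regions and keep those whose leading singular values dominate the rest.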

[0036] 2a) Rotate the reference image A and the image to be matched B by three different angles respectively, obtaining three sets of images at different rotation angles.

[0037] 2b) For the rotated reference images, carry out Canny edge detection and Hough tr...
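Step 2b relies on standard Canny edge detection followed by a Hough transform for line detection. The Hough voting scheme can be sketched in plain numpy (a toy edge map with one vertical line; a real pipeline would use a library implementation such as OpenCV's):

```python
import numpy as np

def hough_lines(edges, n_theta=180):
    """Vote in (rho, theta) space: each edge pixel votes for every
    line rho = x*cos(theta) + y*sin(theta) that passes through it."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))          # max possible |rho|
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for j, t in enumerate(thetas):
        rhos = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc, (rhos, j), 1)             # accumulate votes
    return acc, diag

# synthetic edge map containing a single vertical line x = 5
edges = np.zeros((20, 20), dtype=bool)
edges[:, 5] = True
acc, diag = hough_lines(edges)
# the cell (rho = 5, theta = 0) collects a vote from all 20 edge pixels
```

Peaks in the accumulator correspond to dominant straight lines, which is what the rotated-image analysis in step 2b presumably exploits.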



Abstract

The invention discloses a projective transformation image matching method based on transformation-invariant low-rank texture, and mainly addresses the defect in the prior art that projectively transformed images cannot be matched. The scheme is: 1. input two images related by a projective transformation and automatically detect and extract their low-rank texture regions; 2. perform the TILT transformation on the detected low-rank texture regions to obtain the respective local transformation matrices, and correct the two input images with these matrices; 3. perform feature point detection on the two corrected images, and build scale-invariant feature descriptors and geometric shape descriptors for the feature points; 4. combine the scale-invariant feature descriptors and the geometric shape descriptors into new feature descriptors, and measure the similarity of the new descriptors with the Euclidean distance to complete the matching. The method extracts feature points with a relatively high repetition rate and correct matching rate, improves computational efficiency, and can be applied to image fusion, image stitching and three-dimensional reconstruction.
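The final step in the abstract measures descriptor similarity with the Euclidean distance. A sketch of nearest-neighbour matching under that metric, with a Lowe-style ratio test to reject ambiguous matches (the ratio threshold and toy descriptors are illustrative assumptions, not values from the patent):

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """For each descriptor in desc_a, find its Euclidean nearest
    neighbour in desc_b; keep the match only if the best distance is
    clearly smaller than the second best (ratio test)."""
    dists = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    matches = []
    for i, row in enumerate(dists):
        nn1, nn2 = np.argsort(row)[:2]
        if row[nn1] < ratio * row[nn2]:
            matches.append((i, int(nn1)))
    return matches

# toy 3-D "descriptors": a[0] pairs with b[1], a[1] pairs with b[0]
desc_a = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
desc_b = np.array([[0.0, 0.9, 0.0], [0.9, 0.0, 0.0], [5.0, 5.0, 5.0]])
```

The patent's combined descriptors would replace the toy vectors here; the distance computation and ratio test are unchanged.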

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a projective transformation image matching method, which can be applied to target recognition and tracking, image stitching and three-dimensional reconstruction.

Background technique

[0002] In the fields of target recognition, image stitching and 3D reconstruction, it is necessary to match multiple views of the same scene. In general, feature-based image matching methods can be used, mainly because some image features are invariant to image scale, rotation and affine transformation, and using only feature information to find the geometric relationship between images has the advantage of high computational efficiency. However, when there is a large degree of projective distortion between the two images, existing techniques often find it difficult to extract features with projective invariance, resulting in insufficient ...
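For background, the projective transformation relating two views of a planar scene is a 3x3 homography acting on homogeneous image coordinates. A minimal numpy sketch of applying one to points (the translation-only matrix is a toy example, not from the patent):

```python
import numpy as np

def apply_homography(H, pts):
    """Apply a 3x3 homography to an Nx2 array of points:
    lift to homogeneous coordinates, multiply, renormalize."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # (x, y) -> (x, y, 1)
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # divide by w

# a translation-only homography: shifts every point by (3, 5)
H = np.array([[1.0, 0.0, 3.0],
              [0.0, 1.0, 5.0],
              [0.0, 0.0, 1.0]])
```

A general homography has nonzero entries in the bottom row, which is what produces the perspective distortion that the TILT-based correction in this patent removes.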


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T3/00
CPC: G06T3/0068
Inventors: 张强 (Zhang Qiang), 李亚军 (Li Yajun), 朱韵茹 (Zhu Yunru), 相朋 (Xiang Peng), 王龙 (Wang Long)
Owner: XIDIAN UNIV