
View-angle-invariant local region constraint-based slanted image linear feature matching method

A technology relating to straight-line features and local regions, applied to computer parts, instruments, and character and pattern recognition; it addresses problems such as the difficulty of matching the intersections of straight-line features.

Active Publication Date: 2017-08-08
SOUTHWEST JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

However, for oblique images with large viewing angle changes, the matching between line feature intersections is still a difficult problem, which limits the application of such methods in line feature matching of oblique images.



Examples


Embodiment Construction

[0069] The present invention will be described in detail below in conjunction with the accompanying drawings.

[0070] As shown in Figure 7, a straight-line feature matching method for oblique images constrained by viewpoint-invariant local regions comprises the following steps in order:

[0071] Step 1: Extract straight-line features from the reference image and the image to be matched, and calculate the feature saliency of each straight-line feature according to formula (1):

[0072] saliency = a·l + b·ḡ (1)

[0073] In formula (1), saliency represents the saliency value of the straight-line feature, l represents the length of the straight-line feature, ḡ represents the mean gradient magnitude of all pixels on the line feature, and a and b are weight coefficients that control the relative importance of the line length and of the mean gradient magnitude in the saliency calculation. In a specific implementation, parameters a and b may take empirical values.
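
Below is a minimal sketch of how the Step 1 saliency score could be computed for a single extracted line segment, assuming formula (1) is the weighted sum of line length and mean gradient magnitude suggested by the surrounding description. The sampling scheme, the gradient image, and the weight values are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def line_saliency(gradient_magnitude, endpoints, a=0.5, b=0.5):
    """Saliency of one straight-line feature, assuming formula (1) is
    saliency = a * length + b * mean gradient magnitude along the line."""
    (x0, y0), (x1, y1) = endpoints
    length = float(np.hypot(x1 - x0, y1 - y0))  # l: length of the line feature

    # Sample the gradient-magnitude image at roughly one point per pixel
    # along the segment (nearest-pixel lookup; a placeholder for whatever
    # sampling the patented method actually uses).
    n = max(int(np.ceil(length)), 2)
    xs = np.linspace(x0, x1, n).round().astype(int)
    ys = np.linspace(y0, y1, n).round().astype(int)
    mean_grad = float(gradient_magnitude[ys, xs].mean())  # ḡ

    return a * length + b * mean_grad

# Toy usage: a synthetic gradient image and one line segment (hypothetical data).
grad = np.abs(np.random.default_rng(0).normal(size=(200, 200)))
score = line_saliency(grad, ((10, 20), (150, 40)))
```

Lines would then be ranked by this score, with the top t% treated as salient features in the later matching stages described in the abstract.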



Abstract

The invention relates to a view-angle-invariant local region constraint-based slanted image linear feature matching method. The method comprises the following steps in order: extracting the linear features of a reference image and of an image to be matched, and calculating the feature saliency of each linear feature; constructing a view-angle-invariant local region based on a polynomial mapping function, and calculating the feature region of each linear feature from this view-angle-invariant local region; for each linear feature, calculating the phase congruency values and orientations within its feature region and constructing a phase congruency feature descriptor; according to feature saliency, taking the linear features ranked in the top t% as salient linear features and all remaining linear features as non-salient linear features; matching the salient linear features; reclassifying salient linear features that were not successfully matched as non-salient linear features; taking the successfully matched salient linear features in the reference image and in the image to be matched as clustering centers and clustering the non-salient linear features around them; and finally matching the non-salient linear features using a non-exhaustive search method.
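
To make the later stages of the abstract concrete, here is a small, hedged illustration of the salient/non-salient split (top t% by saliency) and of clustering non-salient lines around matched salient lines. Assigning lines to the nearest salient-line midpoint is an assumption introduced for illustration; the patent may cluster on a different criterion, and all function names here are placeholders rather than APIs defined by the patent.

```python
import numpy as np

def split_by_saliency(saliencies, t=20.0):
    """Indices of the top t% most salient lines vs. the rest (salient / non-salient split)."""
    order = np.argsort(saliencies)[::-1]                 # most salient first
    k = max(1, int(np.ceil(len(saliencies) * t / 100.0)))
    return order[:k], order[k:]

def cluster_to_salient(non_salient_midpoints, salient_midpoints):
    """Assign each non-salient line to its nearest matched salient line, so that
    non-salient matching only needs to search within corresponding clusters
    (a stand-in for the non-exhaustive search described in the abstract)."""
    d = np.linalg.norm(
        non_salient_midpoints[:, None, :] - salient_midpoints[None, :, :], axis=2)
    return d.argmin(axis=1)                              # cluster label per non-salient line

# Toy usage with random line midpoints and saliency scores (hypothetical data).
rng = np.random.default_rng(0)
saliencies = rng.random(50)
midpoints = rng.random((50, 2)) * 1000
salient_idx, other_idx = split_by_saliency(saliencies, t=20)
labels = cluster_to_salient(midpoints[other_idx], midpoints[salient_idx])
```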

Description

Technical field

[0001] The invention relates to the field of image matching in remote sensing image processing, and in particular to a straight-line feature matching method for oblique images under viewpoint-invariant local region constraints.

Background technique

[0002] Oblique photogrammetry can simultaneously acquire high-resolution images of both the top surfaces and the facades of ground objects, and has been widely used in the automatic reconstruction and texture mapping of 3D city building models, urban planning and monitoring, emergency response, cadastral data verification and updating, and related fields. Image matching is one of the key scientific issues in oblique photogrammetry data processing; it is the foundation and core of image registration, stitching, 3D reconstruction, and target detection and tracking, and has important application value in both military and civilian fields. Compared with traditional remote sensing images, there is a large degree of...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46
CPC: G06V20/13; G06V10/44; G06V10/462
Inventors: 陈敏, 严少华, 朱庆
Owner: SOUTHWEST JIAOTONG UNIV