
SAR image and optical image matching method based on feature matching and position matching

A SAR-and-optical-image feature-matching technology in the field of image processing, designed to achieve strong matching capability, high matching accuracy, and high practical value.

Inactive Publication Date: 2022-05-17
云南览易网络科技有限责任公司
Problems solved by technology

However, learning-based feature descriptors also face many difficulties.



Examples


Embodiment 2

[0165] This embodiment discloses a SAR image and optical image matching method (MatchosNet) that realizes position matching; it comprises the following steps:

[0166] A. Design a two-dimensional Gaussian distribution voting algorithm to achieve position matching between SAR images and optical images.

[0167] The coordinates of the upper-left pixel of the SAR image on the optical image can be obtained from each matched pair of feature points. As shown in Figure 10, since each set of images yields many different feature matches, multiple candidate location coordinates can be obtained.
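The candidate-coordinate step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes each matched SAR keypoint maps rigidly (translation only) to its optical counterpart, so each pair predicts one upper-left corner position.

```python
import numpy as np

def candidate_positions(sar_keypoints, optical_keypoints):
    """Predict the SAR image's upper-left corner on the optical image.

    sar_keypoints, optical_keypoints: (N, 2) arrays of (x, y) coordinates,
    where row i of each array is one matched feature pair.
    """
    sar = np.asarray(sar_keypoints, dtype=float)
    opt = np.asarray(optical_keypoints, dtype=float)
    # A keypoint at (xs, ys) in the SAR image that matches (xo, yo) in the
    # optical image implies the SAR upper-left corner lies at (xo - xs, yo - ys).
    return opt - sar
```

Because real matches contain outliers, these candidates disagree, which is what the voting step below resolves.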

[0168] A location-matching voting algorithm is designed using a two-dimensional Gaussian distribution. Let x and y be the horizontal and vertical coordinates of the predicted point; the two-dimensional Gaussian function of the present invention can be expressed as:

[0169]

[0170] Among them, σ1 and σ2 are the variances of...
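The Gaussian formula itself is elided in this excerpt ([0169]); a standard axis-aligned bivariate Gaussian, g(dx, dy) = exp(-(dx²/(2σ1²) + dy²/(2σ2²))), is one plausible form. Under that assumption, the voting step can be sketched as below, where every candidate position votes for every other candidate with Gaussian weight, so the densest cluster of candidates wins. The sigma defaults are illustrative, not values from the patent.

```python
import numpy as np

def gaussian_vote(candidates, sigma1=5.0, sigma2=5.0):
    """Pick the most consistent candidate position by 2-D Gaussian voting.

    candidates: (N, 2) array of candidate (x, y) positions. Each candidate's
    score is the sum of Gaussian-weighted votes from all candidates; outliers
    far from the cluster contribute almost nothing.
    """
    pts = np.asarray(candidates, dtype=float)          # (N, 2)
    diff = pts[:, None, :] - pts[None, :, :]           # (N, N, 2) pairwise offsets
    w = np.exp(-(diff[..., 0] ** 2 / (2 * sigma1 ** 2)
                 + diff[..., 1] ** 2 / (2 * sigma2 ** 2)))
    scores = w.sum(axis=1)                             # total vote mass per candidate
    return pts[np.argmax(scores)]
```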

Embodiment 3

[0182] In this embodiment, different deep convolutional network structures are used to experiment with the feature matching and position matching methods in Embodiment 1 and Embodiment 2, so as to verify the performance of the network structure designed in the present invention.

[0183] Table 8: the average errors xrmse, yrmse, and xyrmse, and the number of correctly position-matched images, for the method using different deep convolutional networks.

[0184]

[0185] By comparing the results of different models, the influence of the network structure on the feature-detection model and the validity of the MatchosNet structure designed in the present invention are verified. Table 8 shows the position-matching results of MatchosNet and the other two methods. MatchosNet has the lowest xrmse, yrmse, and xyrmse, and the largest number of correctly position-matched images in the batch. The above experiments prove that the network architecture design...
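The error metrics above are not formally defined in this excerpt; a plausible reading is a per-axis root-mean-square error plus a joint Euclidean RMSE, sketched here as an assumption rather than the patent's exact definition:

```python
import numpy as np

def position_errors(pred, true):
    """Compute xrmse, yrmse, xyrmse over a batch of predicted positions.

    pred, true: (N, 2) arrays of predicted and ground-truth (x, y) positions.
    xyrmse is read here as the RMS of the per-image Euclidean error.
    """
    pred = np.asarray(pred, dtype=float)
    true = np.asarray(true, dtype=float)
    err = pred - true
    xrmse = np.sqrt(np.mean(err[:, 0] ** 2))               # RMS error along x
    yrmse = np.sqrt(np.mean(err[:, 1] ** 2))               # RMS error along y
    xyrmse = np.sqrt(np.mean(np.sum(err ** 2, axis=1)))    # joint Euclidean RMS
    return xrmse, yrmse, xyrmse
```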

Embodiment 4

[0187] In this embodiment, different loss functions are used to conduct experiments on the feature matching and position matching methods in Embodiment 1 and Embodiment 2, so as to verify the performance of the loss function designed in the present invention.

[0188] Table 9: the average errors xrmse, yrmse, and xyrmse, and the number of correctly position-matched images, for the method using different loss functions.

[0189]

[0190] The effect of the loss function on the feature-detection model is verified by comparing the results of different models. Table 9 shows the position-matching results of MatchosNet and the other two methods. MatchosNet has the lowest xrmse, yrmse, and xyrmse, and the largest number of correctly position-matched images in the batch. The above experiments prove that the loss function designed for MatchosNet in the present invention is very effective for feature matching and position matching.
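The composite loss function itself is not detailed in this excerpt. Purely as an illustration of the kind of objective used for cross-modal descriptor learning, the sketch below is a generic triplet margin loss with in-batch hardest-negative mining; it is not the patent's loss.

```python
import numpy as np

def hardest_negative_triplet_loss(opt_desc, sar_desc, margin=1.0):
    """Generic triplet margin loss with in-batch hardest-negative mining.

    opt_desc, sar_desc: (N, D) descriptors; row i of each is a matching
    optical/SAR pair, and all other rows are treated as non-matching.
    """
    # Pairwise L2 distances between all optical and SAR descriptors: (N, N).
    d = np.linalg.norm(opt_desc[:, None, :] - sar_desc[None, :, :], axis=-1)
    pos = np.diag(d)                             # matching-pair distances
    neg = d + np.eye(len(d)) * 1e6               # mask out the positives
    hardest = np.minimum(neg.min(axis=0), neg.min(axis=1))  # closest non-match
    # Penalise pairs whose positive distance is not smaller than the hardest
    # negative by at least `margin`.
    return np.mean(np.maximum(0.0, margin + pos - hardest))
```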



Abstract

The invention provides a SAR image and optical image matching method based on feature matching and position matching. The method comprises the following steps: performing preliminary keypoint detection on an optical image and a SAR image using a difference-of-Gaussians algorithm; extracting the area around each detected keypoint of the optical image and the SAR image and reconstructing image blocks; designing a deep convolutional neural network comprising dense blocks and transition layers, designing a composite loss function, and generating deep feature descriptors by training and running the network; performing feature matching between the optical image and the SAR image using an L2-distance algorithm over the deep feature descriptors, and evaluating the distance error of matched points; and realizing position matching of the SAR image and the optical image through a two-dimensional Gaussian voting algorithm. The method solves the problem of feature matching between SAR images and optical images, achieves better matching capability and accuracy, and also realizes position matching between the two.
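The L2-distance matching step in the abstract can be sketched as nearest-neighbour matching over the deep descriptors. The mutual-consistency check below is a common outlier filter and an assumption here; the abstract specifies only L2-distance matching.

```python
import numpy as np

def l2_match(desc_a, desc_b):
    """Mutual nearest-neighbour matching under L2 distance.

    desc_a: (Na, D) and desc_b: (Nb, D) feature descriptors.
    Returns index pairs (i, j) where desc_a[i] and desc_b[j] are each
    other's nearest neighbour.
    """
    # Full pairwise L2 distance matrix: (Na, Nb).
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=-1)
    nn_ab = d.argmin(axis=1)    # best b for each a
    nn_ba = d.argmin(axis=0)    # best a for each b
    return [(i, j) for i, j in enumerate(nn_ab) if nn_ba[j] == i]
```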

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular to a method for matching SAR images and optical images based on feature matching and position matching.

Background technique

[0002] In earth observation, optical and synthetic aperture radar (SAR) images can be compared and analyzed, and more valuable information can be obtained through their complementarity. Feature matching between SAR images and optical images is very important in image registration, image fusion, and change detection. However, since the imaging mechanisms of optical and SAR images are very different, matching their features is difficult. Moreover, speckle noise is widespread in SAR images, which degrades image features and makes them hard to identify. Furthermore, the distance dependence along the range axis and the properties of the radar signal wavelength lead to geo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06V20/10; G06N3/04; G06N3/08; G06V10/46; G06V10/764; G06V10/75; G06V10/82
CPC: G06N3/04; G06N3/08; G06F18/241
Inventor: 廖赟, 邸一得
Owner: 云南览易网络科技有限责任公司