[0026] Example 1
[0027] Figure 1 shows a flowchart of the remote sensing image matching method according to the embodiment of the present invention. As shown in Figure 1, the method may include the following steps:
[0028] S101: Correct the remote sensing image to be matched based on the reference remote sensing image, so as to eliminate the differences in geometric deformation, scale and rotation between the remote sensing image to be matched and the reference remote sensing image.
[0029] In the embodiment of the present invention, the reference remote sensing image refers to the image, among the two or more remote sensing images to be matched, whose shooting angle of view is closest to the angle of view perpendicular to the ground being photographed.
[0030] In the embodiment of the present invention, the remote sensing image may be a remote sensing image captured by a low-altitude aircraft such as an unmanned aerial vehicle. Correspondingly, the remote sensing image to be matched can be corrected based on the reference remote sensing image according to the exterior orientation elements of the image recorded in the POS system of the unmanned aerial vehicle, or according to files of other aircraft that record the attitude angle, position and other information of the aircraft in flight. The specific correction methods belong to the existing technology and will not be repeated here.
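As a non-limiting illustration only, the following Python sketch (using OpenCV) shows one simplified way such a correction could be approximated from POS-recorded values. The function name, the choice to use only the relative yaw angle and the altitude ratio, and the omission of the pitch-induced geometric deformation are illustrative assumptions, not the specific correction method of the embodiment.

```python
import cv2

def correct_to_reference(img_to_match, yaw_deg, ref_yaw_deg, altitude_m, ref_altitude_m):
    """Approximate correction of the image to be matched toward the reference image.

    The relative yaw angle removes the rotation difference and the altitude ratio
    removes the scale difference; the geometric deformation caused by pitch/roll
    would normally require a full homography from the exterior orientation elements.
    """
    h, w = img_to_match.shape[:2]
    angle = ref_yaw_deg - yaw_deg        # rotation needed to align the two headings
    scale = altitude_m / ref_altitude_m  # higher altitude -> smaller ground objects, so scale up
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, scale)
    return cv2.warpAffine(img_to_match, M, (w, h))
```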
[0031] S102: Determine a matching area between the remote sensing image to be matched and the reference remote sensing image.
[0032] In the embodiment of the present invention, the matching area refers to the overlapping area between the remote sensing image to be matched and the reference remote sensing image.
[0033] In the embodiment of the present invention, the overlapping area is calculated according to the position information (including latitude and longitude information, altitude information, etc.) and the shooting angles (including rotation angle and pitch angle, etc.) recorded when the aircraft captured the remote sensing image to be matched and the reference remote sensing image. Specifically, the distance between the center points of the remote sensing image to be matched and the reference remote sensing image can be obtained from the latitude and longitude information, the rotation angle between the remote sensing image to be matched and the reference remote sensing image can be determined from the rotation angle information, and the ground coverage size of each image can be calculated from the pitch angle information and the altitude information, thereby calculating the overlapping area between the remote sensing image to be matched and the reference remote sensing image. Of course, in the above calculation of the overlapping area, the information of the remote sensing image to be matched that has been corrected in step S101 shall prevail.
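Purely as an illustrative sketch of the kind of calculation described above, the following Python code approximates each image's ground footprint as an axis-aligned rectangle derived from its altitude and an assumed camera field of view, and intersects the two footprints. The field-of-view parameters and the neglect of the rotation and pitch angles are simplifying assumptions for illustration only.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, used for the small-offset approximation

def footprint_half_size(altitude_m, fov_h_deg, fov_v_deg):
    """Half-width and half-height (metres) of the ground footprint of a near-nadir image."""
    half_w = altitude_m * math.tan(math.radians(fov_h_deg) / 2.0)
    half_h = altitude_m * math.tan(math.radians(fov_v_deg) / 2.0)
    return half_w, half_h

def overlap_area_m2(lat1, lon1, alt1, lat2, lon2, alt2, fov_h_deg, fov_v_deg):
    """Approximate overlapping ground area (square metres) of two images.

    The centre-point offset comes from the latitude/longitude difference and the
    footprint size from the altitude and field of view; rotation and pitch are
    ignored in this simplified sketch.
    """
    # Offset of image 2's centre relative to image 1's centre, in metres east/north.
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
    dx = math.radians(lon2 - lon1) * EARTH_RADIUS_M * math.cos(math.radians(lat1))

    hw1, hh1 = footprint_half_size(alt1, fov_h_deg, fov_v_deg)
    hw2, hh2 = footprint_half_size(alt2, fov_h_deg, fov_v_deg)

    # Intersection of the two axis-aligned footprint rectangles.
    overlap_w = min(hw1, dx + hw2) - max(-hw1, dx - hw2)
    overlap_h = min(hh1, dy + hh2) - max(-hh1, dy - hh2)
    return max(overlap_w, 0.0) * max(overlap_h, 0.0)
```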
[0034] S103: Use the SIFT algorithm to perform feature extraction on the matching area in the remote sensing image to be matched and the matching area in the reference remote sensing image, and take the points whose extreme values are greater than the extreme point threshold among the extracted extreme points as the extracted feature points.
[0035] In the embodiment of the present invention, since the flight altitude, roll angle and pitch angle of the low-altitude aircraft change continuously during flight and shooting, there are scale and rotation differences between the remote sensing image to be matched and the reference remote sensing image. Therefore, in the embodiment of the present invention, the SIFT algorithm, which has the characteristics of rotation invariance and scale invariance, is used to perform feature extraction on the matching area in the remote sensing image to be matched and the matching area in the reference remote sensing image.
[0036] In the embodiment of the present invention, when the SIFT algorithm is used for feature extraction, the number of extracted feature points is generally large. Therefore, in order to reduce the number of finally extracted feature points, the points whose extreme values are greater than the extreme point threshold among the extracted extreme points are used as the extracted feature points. Specifically, the extreme point threshold can be set according to the sizes of the matching area in the remote sensing image to be matched and the matching area in the reference remote sensing image. When the matching areas are large (the number of extracted extreme points is generally large), the extreme point threshold can be set larger, for example to 10 or 13, so as to reduce the number of finally extracted feature points to a greater extent; when the matching areas are small (the number of extracted extreme points is generally small), the extreme point threshold can be set smaller, for example to 5 or 8, so as to retain more extreme points as the extracted feature points and ensure a sufficient number of feature points for feature point matching.
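A minimal sketch of this extraction step, assuming OpenCV's SIFT implementation, is given below. OpenCV exposes the strength of each detected extremum through the keypoint's `response` field, which is used here in place of the extreme value, so the numeric scale of the threshold (e.g. the values 10 or 13 above) depends on the implementation used; the function name and parameter are illustrative.

```python
import cv2
import numpy as np

def extract_sift_features(region_gray, extremum_threshold):
    """Run SIFT on one matching area and keep only keypoints whose detected
    extremum strength (OpenCV's `response` field) exceeds the given threshold."""
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(region_gray, None)
    if descriptors is None:
        return [], np.empty((0, 128), dtype=np.float32)

    kept = [(kp, desc) for kp, desc in zip(keypoints, descriptors)
            if kp.response > extremum_threshold]
    if not kept:
        return [], np.empty((0, 128), dtype=np.float32)
    kept_kps, kept_descs = zip(*kept)
    return list(kept_kps), np.array(kept_descs, dtype=np.float32)
```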
[0037] S104: Calculate feature matching point pairs in the reference remote sensing image and the remote sensing image to be matched based on the extracted features, and obtain an image matching result of the remote sensing image to be matched and the reference remote sensing image.
[0038] In the embodiment of the present invention, the feature matching point pairs in the reference remote sensing image and the remote sensing image to be matched can be obtained by calculating the 128-dimensional Euclidean distance between the feature points extracted from the matching area in the remote sensing image to be matched and the feature points extracted from the matching area in the reference remote sensing image. Specifically, the Euclidean distance between a feature point extracted from the remote sensing image to be matched (assumed to be the first feature point) and each feature point extracted from the reference remote sensing image can be calculated to obtain the feature point with the closest Euclidean distance to the first feature point (assumed to be the second feature point), and the closest distance is compared with a preset distance threshold; when the closest distance is less than the preset distance threshold, the pair of feature points with the closest Euclidean distance (the first feature point and the second feature point) is determined as a feature matching point pair. Alternatively, the Euclidean distance between a feature point extracted from the remote sensing image to be matched (assumed to be the third feature point) and each feature point extracted from the reference remote sensing image can be calculated to obtain the feature point with the closest Euclidean distance to the third feature point (assumed to be the fourth feature point) and the feature point with the second closest Euclidean distance; the closest distance and the second closest distance are then compared, and when the difference between the closest distance and the second closest distance is greater than a preset difference threshold, the pair of feature points with the closest Euclidean distance (the third feature point and the fourth feature point) is determined as a feature matching point pair. Of course, the feature matching point pairs can also be determined by a combination of the above two methods, which will not be repeated here.
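The following Python sketch illustrates the combination of the two criteria above using brute-force 128-dimensional Euclidean distances; the function name and the two threshold parameters are illustrative only.

```python
import numpy as np

def match_feature_points(desc_to_match, desc_reference,
                         distance_threshold, difference_threshold):
    """Brute-force matching of 128-dimensional descriptors by Euclidean distance.

    A pair is accepted when the nearest distance is below `distance_threshold`
    AND the gap between the nearest and second-nearest distances exceeds
    `difference_threshold` (the combination of the two criteria in the text).
    Returns (index_in_image_to_match, index_in_reference_image) pairs.
    """
    matches = []
    for i, d in enumerate(desc_to_match):
        dists = np.linalg.norm(desc_reference - d, axis=1)  # Euclidean distances
        if dists.size < 2:
            continue
        order = np.argsort(dists)
        nearest, second = dists[order[0]], dists[order[1]]
        if nearest < distance_threshold and (second - nearest) > difference_threshold:
            matches.append((i, int(order[0])))
    return matches
```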
[0039] In the embodiment of the present invention, the feature matching point pairs in the reference remote sensing image and the remote sensing image to be matched can also be obtained by calculating the 128-dimensional Mahalanobis distance between the feature points extracted from the matching area in the remote sensing image to be matched and the feature points extracted from the matching area in the reference remote sensing image. The specific method of obtaining feature matching point pairs based on the Mahalanobis distance can be understood with reference to the above method of obtaining feature matching point pairs based on the Euclidean distance, and will not be repeated here.
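A corresponding sketch for the Mahalanobis distance variant is shown below. The covariance matrix is estimated here from the pooled descriptors of the two matching areas, which is one possible choice and not fixed by the embodiment; the pseudo-inverse is used for numerical stability.

```python
import numpy as np

def mahalanobis_distances(desc_to_match, desc_reference):
    """Pairwise 128-dimensional Mahalanobis distances between two descriptor sets."""
    pooled = np.vstack([desc_to_match, desc_reference])
    cov_inv = np.linalg.pinv(np.cov(pooled, rowvar=False))  # inverse covariance estimate
    dists = np.empty((len(desc_to_match), len(desc_reference)))
    for i, u in enumerate(desc_to_match):
        diff = desc_reference - u
        # diff[j] @ cov_inv @ diff[j] for every reference descriptor j
        sq = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
        dists[i] = np.sqrt(np.maximum(sq, 0.0))
    return dists
```

The resulting distance matrix can then be filtered with the same nearest-distance and nearest/second-nearest criteria as in the Euclidean case.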
[0040] In the embodiment of the present invention, by correcting the remote sensing image to be matched based on the reference remote sensing image so as to eliminate the differences in geometric deformation, scale and rotation between the remote sensing image to be matched and the reference remote sensing image, the influence of the rotation angle, flight altitude and pitch angle of the low-altitude aircraft on its images can be eliminated. Further, because the aircraft is constantly flying and moving, the scene it photographs is constantly changing; when performing image matching, only the overlapping area between the reference remote sensing image and the remote sensing image to be matched is meaningful for matching. Therefore, the matching area between the remote sensing image to be matched and the reference remote sensing image is limited to the overlapping area, which excludes the areas of the reference remote sensing image and the remote sensing image to be matched that do not require feature extraction and feature matching, and improves the matching efficiency of the remote sensing image matching method. Also because the aircraft is constantly flying and moving, its flight altitude, roll angle and pitch angle change greatly, resulting in large scale differences between images; therefore, using the SIFT algorithm to perform feature extraction on the matching area in the remote sensing image to be matched and the matching area in the reference remote sensing image can improve the accuracy of feature point extraction, thereby improving the accuracy of the final image matching result.
[0041] At the same time, by using the extracted extreme points whose extreme values are greater than the extreme point threshold as the extracted feature points, the number of feature points extracted by the SIFT algorithm can be reduced, the time overhead of subsequent feature point matching can be reduced, and the matching efficiency of the remote sensing image matching method is improved.
[0042] As an optional implementation manner of this embodiment, as shown in Figure 2, step S102 may include the following steps:
[0043] S201: Calculate the overlapping area of the reference remote sensing image and the remote sensing image to be matched according to the shooting angle and shooting position of the reference remote sensing image and the remote sensing image to be matched.
[0044] In this embodiment of the present invention, the specific content of this step S201 can be understood with reference to the specific content of the foregoing step S102.
[0045] S202: Extract the minimum rectangular area including the overlapping area in the reference remote sensing image and the remote sensing image to be matched, respectively, to obtain the matching area in the reference remote sensing image and the matching area in the remote sensing image to be matched.
[0046] In this embodiment of the present invention, since the overlapping area obtained in step S201 is likely to be a non-rectangular area, and in image processing the most common and easiest image to process is a rectangular image, step S202 is performed so that the finally obtained matching areas are rectangular. Specifically, as shown in Figure 3, if the overlapping area of the reference remote sensing image and the remote sensing image to be matched obtained in step S201 is the area S, then the matching area in the reference remote sensing image is the rectangular area S1, and the matching area in the remote sensing image to be matched is the rectangular area S2.
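As an illustration, given the pixel coordinates of the corners of the overlapping area S in one of the images, the minimum enclosing axis-aligned rectangle can be obtained as follows; this is a sketch, and the function and variable names are illustrative.

```python
import numpy as np

def min_bounding_rectangle(overlap_corners_px):
    """Axis-aligned minimum rectangle enclosing the (possibly non-rectangular)
    overlapping area, given its corner coordinates in one image's pixel frame.

    Returns (x_min, y_min, x_max, y_max); cropping the image with these bounds
    yields that image's rectangular matching area (S1 or S2 in Figure 3).
    """
    pts = np.asarray(overlap_corners_px, dtype=np.float64)
    x_min, y_min = pts.min(axis=0)
    x_max, y_max = pts.max(axis=0)
    return (int(np.floor(x_min)), int(np.floor(y_min)),
            int(np.ceil(x_max)), int(np.ceil(y_max)))

# Usage: matching_area = image[y_min:y_max, x_min:x_max]
```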
[0047] As an optional implementation of this embodiment, the embodiment of the present invention provides another specific implementation of the remote sensing image matching method, as shown in Figure 4, including the following steps:
[0048] S401: Correct the remote sensing image to be matched based on the reference remote sensing image, so as to eliminate the differences in geometric deformation, scale and rotation between the remote sensing image to be matched and the reference remote sensing image.
[0049] S402: Determine a matching area between the remote sensing image to be matched and the reference remote sensing image.
[0050] In the embodiment of the present invention, the matching area refers to the overlapping area between the remote sensing image to be matched and the reference remote sensing image.
[0051] S403: Divide the matching area in the remote sensing image to be matched and the matching area in the reference remote sensing image into a plurality of matching blocks respectively.
[0052] In the embodiment of the present invention, the number of matching blocks into which the matching area in the remote sensing image to be matched and the matching area in the reference remote sensing image are respectively divided may be determined according to the size of the corresponding matching area and the computing capability of the computing device used to perform the remote sensing image matching method of the embodiment of the present invention, and is not limited here.
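A simple sketch of such a block division is given below; the choice of the number of rows and columns is left to the considerations described above, and the function name is illustrative.

```python
def split_into_blocks(matching_area, rows, cols):
    """Split a rectangular matching area (a NumPy image array) into rows x cols blocks."""
    h, w = matching_area.shape[:2]
    blocks = []
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            blocks.append(matching_area[y0:y1, x0:x1])
    return blocks
```

Each block returned here would then be passed to the SIFT feature extraction of step S404.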
[0053] S404: Use the SIFT algorithm to perform feature extraction on each matching block in turn, and take the points whose extreme values are greater than the extreme point threshold among the extracted extreme points as the extracted feature points.
[0054] S405: Calculate feature matching point pairs in the reference remote sensing image and the remote sensing image to be matched based on the extracted features, and obtain an image matching result of the remote sensing image to be matched and the reference remote sensing image.
[0055] The specific content of this embodiment of the present invention may be understood with reference to the specific content of steps S101 to S104 above.
[0056] In the embodiment of the present invention, after the matching area between the remote sensing image to be matched and the reference remote sensing image is determined, the matching area in the remote sensing image to be matched and the matching area in the reference remote sensing image are respectively divided into multiple matching blocks. The division can be made according to the computing capability of the computing device that executes the remote sensing image matching method, so that feature extraction can be performed faster on each block in the matching area and the feature extraction efficiency of the entire matching area is thereby improved; the performance requirements of the remote sensing image matching method on its executing device can also be reduced.