
Image feature point mismatching elimination method based on local transformation model

A technology concerning image features and local transformation models, applied in image analysis, image data processing, computer components, etc. It addresses the problems of the high mismatch rate of feature points and the low accuracy of existing mismatch-elimination methods, achieving the effect of improved registration accuracy.

Active Publication Date: 2020-01-31
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the problems of the high mismatch rate of feature points and the low accuracy of existing methods for removing mismatched feature points in feature-point-based image registration, and to propose an image feature point mismatch elimination method based on a local transformation model.


Examples


Specific Embodiment 1

[0034] Specific Embodiment 1: In this embodiment, an image feature point mismatch elimination method based on a local transformation model proceeds as follows:

[0035] Step 1: Use the SIFT (Scale-Invariant Feature Transform) algorithm to detect and describe feature points in two images I_p and I_q of the same scene captured from different viewing angles;

[0036] Step 2: Use FLANN (Fast Library for Approximate Nearest Neighbors) to search among the feature points of image I_q for each feature point of image I_p, obtaining for each feature point in I_p its nearest-neighbor and second-nearest-neighbor feature points in I_q;

[0037] When the distance from a feature point in image I_p to its nearest-neighbor feature point in image I_q, multiplied by A, ...

Specific Embodiment 2

[0050] Specific Embodiment 2: This embodiment differs from Specific Embodiment 1 in that, in Step 2, the factor A is 1.2.

[0051] Other steps and parameters are the same as those in Embodiment 1.

Specific Embodiment 3

[0052] Specific Embodiment 3: This embodiment differs from Specific Embodiments 1 and 2 in that, in Step 3, the RANSAC algorithm is used to classify and preliminarily screen the initial feature point pair set obtained in Step 2; the specific process is:

[0053] Step 31: Let S′ denote the point pair set remaining after preliminary screening, initialized to the initial feature point pair set S obtained in Step 2, and initialize the number of categories n to 0;

[0054] Step 32: Use the RANSAC (random sample consensus) algorithm to extract inliers from the point pair set S′; denote the inlier feature point pair set as s_{n+1}, and update S′ to the set of point pairs remaining after excluding s_{n+1};

[0055] Steps 31 and 32 form a loop that screens point pairs: each pass of Step 32 extracts a set of inliers, namely s_{n+1}; if the number of inliers is greater than or equal ...



Abstract

The invention discloses an image feature point mismatch elimination method based on a local transformation model, and relates to the elimination of mismatched image feature points. The objective of the invention is to solve the problems of the high mismatch rate of feature points and the low accuracy of existing mismatched-feature-point elimination methods in feature-point-based image registration. The method comprises the following steps: 1, detecting and describing feature points of two images of the same scene shot from different viewing angles; 2, obtaining an initial feature point pair set; 3, classifying and preliminarily screening the initial feature point pair set; 4, calculating the Euclidean distances between each feature point in the feature point set and all other feature points; 5, if the category numbers of its neighbors are the same as the category number of the feature point, determining the point pair to be an inlier, otherwise carrying out step 6 to judge whether the point pair is a mismatch; 6, if the error is greater than 10, determining the point pair to be a mismatch and removing it from the point pair set, obtaining the screened feature point pair set. The method is applied in the field of image feature point matching.
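Steps 4 to 6 of the abstract read as a neighborhood-consistency check: a matched point whose spatial neighbors share its category is accepted directly, and an ambiguous point is re-checked against its category's local model, with error above 10 (pixels, presumably) marking a mismatch. A minimal numpy sketch under those assumptions; the function, the k-nearest-neighbor rule, and the 3x2 affine model representation are illustrative, not from the patent:

```python
import numpy as np

def screen_pairs(src, dst, labels, models, err_thresh=10.0, k=3):
    """Keep pair i if its k nearest neighbors in src share its category
    (Step 5); otherwise keep it only if its category's local affine
    model M (3x2, dst ~= [x, y, 1] @ M) reprojects it within
    err_thresh (Step 6)."""
    keep = []
    for i in range(len(src)):
        d = np.linalg.norm(src - src[i], axis=1)   # Step 4: Euclidean distances
        nbrs = np.argsort(d)[1:k + 1]              # k nearest, excluding self
        if np.all(labels[nbrs] == labels[i]):
            keep.append(i)                          # Step 5: consistent category
        else:
            pred = np.append(src[i], 1.0) @ models[int(labels[i])]
            if np.linalg.norm(pred - dst[i]) <= err_thresh:
                keep.append(i)                      # Step 6: small model error
    return keep
```

A point pair that neither sits in a category-consistent neighborhood nor fits its category's model within the threshold is removed as a mismatch.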

Description

Technical field

[0001] The invention relates to a method for eliminating mismatched image feature points.

Background technique

[0002] Image registration is a research hotspot in image processing and computer vision, used in image stitching, video surveillance, 3D reconstruction, and other applications. Feature-based image registration is widely used due to its high computational efficiency, and feature point matching is its key step. The accuracy of feature point matching determines the accuracy of the estimated transformation model between the images, but matching feature points by feature descriptors alone often produces mismatches. It is therefore important to study methods for eliminating mismatched feature points.

[0003] The main current screening method for feature point matching is the RANSAC algorithm, which uses repeated random sampling and computes a global homogra...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/33, G06K9/62
CPC: G06T7/33, G06V10/757
Inventor: 张智浩, 杨宪强, 高会军
Owner: HARBIN INST OF TECH