
Image matching method based on self-attention deep neural network

A deep-neural-network matching technology, applied in the field of image matching based on self-attention deep neural networks, which addresses the problems that parametric methods cannot work effectively, that non-parametric methods cannot express complex models or mine local information, and that spatial information is often ignored.

Active Publication Date: 2021-01-29
MINJIANG UNIV

AI Technical Summary

Problems solved by technology

However, these methods have two fundamental drawbacks: 1) parametric methods cannot work effectively when the ratio of inliers among the putative matches is low; 2) non-parametric methods cannot express complex models or mine the local information needed to select corresponding feature pairs.
The network model of Moo Yi et al. uses context normalization to capture global context information and embed it into each node, but because context normalization is easily affected by the other matching pairs, its robustness is limited.
Although deep-learning-based methods have achieved good results on various datasets, the spatial information of the local feature points is often ignored in the network layers. Making good use of both local and global relationships effectively improves matching accuracy; therefore, how to exploit the relationship between global context and local context is an important and challenging task.
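The context normalization discussed above can be sketched in a few lines. This is an illustrative reimplementation of the general technique from Moo Yi et al., not the patented network; the function name and interface are our own:

```python
import numpy as np

def context_norm(features: np.ndarray, eps: float = 1e-3) -> np.ndarray:
    """Context normalization: normalize each feature channel across all N
    putative correspondences, so every node's feature carries global context.

    features: (N, C) array with one row per putative match.
    """
    mean = features.mean(axis=0, keepdims=True)
    std = features.std(axis=0, keepdims=True)
    return (features - mean) / (std + eps)

# The mean/std statistics are shared by all correspondences, so a few gross
# outlier matches shift them for every node -- the robustness weakness
# noted above.
feats = np.random.randn(200, 8)
normalized = context_norm(feats)
```

Because the statistics are computed over the whole set of matches, each normalized feature implicitly encodes global context, which is exactly what makes it sensitive to outlier-heavy inputs.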



Embodiment Construction

[0044] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0045] It should be pointed out that the following detailed description is exemplary and intended to provide further explanation to the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.

[0046] It should be noted that the terminology used here is only for describing specific implementations, and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly dictates otherwise, the singular is intended to include the plural, and it should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of features, steps, operations, means, components and/or combinatio...



Abstract

The invention relates to an image matching method based on a self-attention deep neural network. The method comprises the following steps: first, constructing a data set and carrying out feature enhancement on the data in it; for the input features, first using a PointCN module to extract initial global features, downsampling them through a differentiable pooling layer, passing them to an order-aware network to better learn global information, and then passing them to a differentiable unpooling layer for upsampling to obtain refined global information; also passing the initial global features extracted by the PointCN module to a self-attention layer to obtain enhanced feature information; splicing the resulting global information and obtaining a preliminary prediction result through a PointCN module; and computing an essential matrix from the preliminary prediction result via a weighted 8-point algorithm. The network can replace RANSAC (Random Sample Consensus) in post-processing the matching points extracted by the SIFT (Scale-Invariant Feature Transform) algorithm, so as to improve matching precision.
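The final step of the pipeline, recovering an essential matrix from weighted correspondences, is the classical weighted 8-point algorithm. A minimal NumPy sketch, assuming normalized image coordinates and per-match weights such as the network's predicted inlier scores (the function name and interface are illustrative, not taken from the patent):

```python
import numpy as np

def weighted_eight_point(pts1: np.ndarray, pts2: np.ndarray,
                         w: np.ndarray) -> np.ndarray:
    """Weighted 8-point algorithm.

    pts1, pts2: (N, 2) matched points in normalized image coordinates.
    w: (N,) non-negative weights (e.g. predicted inlier scores).
    Returns a 3x3 essential matrix (up to scale and sign).
    """
    x1, y1 = pts1[:, 0], pts1[:, 1]
    x2, y2 = pts2[:, 0], pts2[:, 1]
    # Each match contributes one epipolar constraint x2^T E x1 = 0,
    # linear in the 9 entries of E.
    A = np.stack([x2 * x1, x2 * y1, x2, y2 * x1, y2 * y1, y2,
                  x1, y1, np.ones_like(x1)], axis=1)      # (N, 9)
    A = A * w[:, None]                                     # weight constraints
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)                               # null-space solution
    # Project onto the essential-matrix manifold: singular values (1, 1, 0).
    U, _, Vt = np.linalg.svd(E)
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt
```

Down-weighting a correspondence shrinks its row of the constraint matrix, so confidently-rejected matches barely influence the recovered geometry, which is why a learned weighting can stand in for RANSAC here.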

Description

technical field [0001] The invention relates to the technical field of computer vision, in particular to an image matching method based on a self-attention deep neural network. Background technique [0002] Image matching is an important research field in computer vision. It is widely used as a preprocessing step in many areas, such as target recognition, target tracking, super-resolution image reconstruction, visual navigation, image stitching, 3D reconstruction, visual positioning, scene depth calculation, etc. It mainly consists of two parts: constructing initial matching pairs and removing false matches. [0003] Existing image matching methods can be divided into parametric methods, non-parametric methods and learning-based methods. Parametric methods are popular strategies for solving matching problems, such as RANSAC and its variants PROSAC and USAC. Specifically, such a method first samples a random minimal subset of the dataset, generates a homography ...
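The RANSAC loop described above (sample a minimal subset, fit a model hypothesis, count inliers, keep the best) can be sketched generically. This is a textbook-style illustration, not the patent's method; the caller supplies the model-fitting and residual functions:

```python
import numpy as np

def ransac(data, fit_model, residual, min_samples, thresh,
           iters=1000, rng=None):
    """Generic RANSAC: repeatedly fit a model to a random minimal subset,
    keep the hypothesis with the most inliers, then refit on those inliers.

    fit_model: callable mapping a (k, d) subset of rows to a model.
    residual: callable mapping (model, data) to per-row residuals.
    """
    if rng is None:
        rng = np.random.default_rng()
    best_model, best_inliers = None, np.zeros(len(data), dtype=bool)
    for _ in range(iters):
        sample = rng.choice(len(data), size=min_samples, replace=False)
        model = fit_model(data[sample])
        inliers = residual(model, data) < thresh
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = model, inliers
    if best_inliers.any():
        best_model = fit_model(data[best_inliers])  # refit on all inliers
    return best_model, best_inliers
```

For a homography, `min_samples` would be 4 point pairs; the sketch below uses 2D line fitting (`min_samples=2`) only because it keeps the example short.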


Application Information

IPC(8): G06K 9/62; G06K 9/46
CPC: G06V 10/462; G06F 18/22; G06F 18/2411
Inventor: 肖国宝, 陈顺兴, 钟振, 陈煜楷
Owner: MINJIANG UNIV