
Image matching method based on deep semantic alignment network model

A network-model and matching-method technology, applied to biological neural network models, neural learning methods, and character and pattern recognition. It addresses the problems that labeling image data with dense correspondences is time-consuming and labor-intensive and that matching accuracy is low, thereby improving the image alignment effect, achieving high accuracy, and improving robustness.

Active Publication Date: 2021-08-27
PEKING UNIV
Cites: 6 · Cited by: 4

AI Technical Summary

Problems solved by technology

[0017] To overcome the above deficiencies of the prior art, the present invention provides an image matching method based on a deep semantic alignment network model. By establishing an object position-aware semantic alignment network and training it with a triple sampling strategy, the alignment relationship between images is established and optimized hierarchically. This solves the technical problems in the existing technology that a dense correspondence between images is difficult to establish directly and that labeling image data is time-consuming, labor-intensive, and of low accuracy, and it improves the accuracy of image matching.




Embodiment Construction

[0078] The present invention is further described below through embodiments in conjunction with the accompanying drawings, but these do not limit the scope of the present invention in any way.

[0079] The image matching method based on the deep semantic alignment network model proposed by the present invention is built on a semantic-alignment deep neural network model, OLASA. The inputs to the model are a source image, a target image, and a reference image. Through in-depth semantic analysis, the model estimates deformation parameters for the source image according to the internal alignment relationship among the inputs. The deformed source image is the output of the method, and the target objects it contains can be matched to the corresponding objects in the target image. Internally, OLASA is implemented through the joint learning of three sub-networks: potential object co-localization (POCL), affine transformation regression (ATR), and two-way thin-plate spline regression (TTPS)...
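The excerpt names the three sub-networks but gives no concrete layers or training details. The following is a minimal PyTorch-style sketch of such a hierarchical alignment pipeline (translation, then affine, then thin-plate spline refinement), offered purely as illustration; every module name, layer size, and the assumed 5x5 TPS control grid are hypothetical, not the patented OLASA implementation.

```python
# Illustrative sketch only: a hierarchical alignment pipeline in the spirit of the
# description above (translation -> affine -> thin-plate spline). All names, layer
# sizes and parameterisations are assumptions, not the patented OLASA model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StageRegressor(nn.Module):
    """One alignment stage: CNN features of both images -> transformation parameters."""

    def __init__(self, n_params):
        super().__init__()
        self.features = nn.Sequential(                      # stand-in feature extractor
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),
        )
        self.head = nn.Linear(2 * 128 * 8 * 8, n_params)

    def forward(self, src, tgt):
        f = torch.cat([self.features(src), self.features(tgt)], dim=1)
        return self.head(f.flatten(1))


class HierarchicalAligner(nn.Module):
    """Chains three regressors, loosely mirroring N_tran / N_affi / N_ttps."""

    def __init__(self):
        super().__init__()
        self.n_tran = StageRegressor(2)        # x/y translation
        self.n_affi = StageRegressor(6)        # 2x3 affine matrix
        self.n_ttps = StageRegressor(2 * 25)   # offsets of an assumed 5x5 TPS grid

    @staticmethod
    def warp_affine(img, theta):
        grid = F.affine_grid(theta.view(-1, 2, 3), img.size(), align_corners=False)
        return F.grid_sample(img, grid, align_corners=False)

    def forward(self, src, tgt):
        # Stage 1: coarse translation, expressed as an identity affine plus a shift.
        shift = torch.tanh(self.n_tran(src, tgt))            # keep the shift in [-1, 1]
        theta = torch.eye(2, 3, device=src.device).repeat(src.size(0), 1, 1)
        theta[:, :, 2] = shift
        src = self.warp_affine(src, theta)
        # Stage 2: affine refinement of the translated source.
        src = self.warp_affine(src, self.n_affi(src, tgt))
        # Stage 3: fine TPS control-point offsets (the TPS warp itself is omitted here).
        tps_offsets = self.n_ttps(src, tgt)
        return src, tps_offsets


# Toy usage with random images standing in for real inputs.
model = HierarchicalAligner()
warped, offsets = model(torch.randn(1, 3, 128, 128), torch.randn(1, 3, 128, 128))
print(warped.shape, offsets.shape)  # torch.Size([1, 3, 128, 128]) torch.Size([1, 50])
```

The point of the sketch is the coarse-to-fine chaining: each stage regresses parameters from the current source/target pair and re-warps the source before the next, finer stage, which is the hierarchical establishment of alignment the patent text describes.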



Abstract

The invention discloses an image matching method based on a deep semantic alignment network model. The method comprises the steps of: progressively estimating the alignment between two semantically similar images by building an object position-aware semantic alignment network model, OLASA; training the network model OLASA with a triple sampling strategy, and estimating translation, affine transformation, and thin-plate spline transformation respectively through three sub-networks Ntran, Naffi, and Nttps for potential object co-localization (POCL), affine transformation regression (ATR), and bidirectional thin-plate spline regression (TTPS); and obtaining an image matching result by establishing and optimizing the alignment relationship between the images in a hierarchical manner. The technical scheme of the invention improves the alignment of images with relatively large position differences and increases image matching accuracy. The invention can be applied to target tracking, semantic segmentation, multi-view three-dimensional reconstruction, and other tasks in the field of computer vision.
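The abstract names thin-plate spline (TPS) regression as the finest alignment stage. For readers unfamiliar with TPS warping, the sketch below implements the textbook TPS interpolation (solving for the radial-basis and affine coefficients that carry a set of control points to their displaced positions). It is standard TPS shown for illustration only, not code from the patent, and the 3x3 control grid in the usage example is an arbitrary assumption.

```python
# Textbook thin-plate-spline (TPS) interpolation, shown only to illustrate the kind of
# non-rigid warp the TTPS stage regresses; it is not code from the patent.
import torch


def tps_warp_points(src_ctrl, dst_ctrl, query):
    """Fit a 2-D TPS mapping src_ctrl -> dst_ctrl and apply it to 'query' points.

    src_ctrl, dst_ctrl: (K, 2) control points; query: (Q, 2). Returns (Q, 2).
    """
    k = src_ctrl.shape[0]

    def radial(a, b):
        # TPS radial basis U(r) = r^2 log(r^2), defined as 0 at r = 0.
        d2 = torch.cdist(a, b).pow(2)
        return d2 * torch.log(d2 + 1e-9)

    # Standard TPS linear system [[U, P], [P^T, 0]] [w; a] = [dst; 0].
    u = radial(src_ctrl, src_ctrl)                           # (K, K)
    p = torch.cat([torch.ones(k, 1), src_ctrl], dim=1)       # (K, 3)
    a_mat = torch.zeros(k + 3, k + 3)
    a_mat[:k, :k], a_mat[:k, k:], a_mat[k:, :k] = u, p, p.t()
    rhs = torch.zeros(k + 3, 2)
    rhs[:k] = dst_ctrl
    params = torch.linalg.solve(a_mat, rhs)                  # (K+3, 2): w then affine part

    w, affine = params[:k], params[k:]
    p_query = torch.cat([torch.ones(query.shape[0], 1), query], dim=1)
    return radial(query, src_ctrl) @ w + p_query @ affine


# Example: a 3x3 control grid whose centre point is displaced by 0.3 along one axis;
# query points near the centre are dragged along smoothly, distant ones barely move.
xs = torch.linspace(-1.0, 1.0, 3)
ctrl = torch.stack(torch.meshgrid(xs, xs, indexing="ij"), dim=-1).reshape(-1, 2)
dst = ctrl.clone()
dst[4, 0] += 0.3
print(tps_warp_points(ctrl, dst, torch.tensor([[0.1, 0.0], [1.0, 1.0]])))
```

Because the TPS warp interpolates the control points exactly while staying smooth everywhere else, regressing its control-point offsets (as the abstract describes for Nttps) lets the model correct local, non-rigid misalignment that translation and affine stages cannot express.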

Description

Technical Field

[0001] The invention belongs to the technical fields of computer vision and digital image processing, and relates to image matching technology, in particular to a method, based on an image deep semantic alignment network model, for establishing an accurate corresponding matching relationship between the main target objects in similar images.

Background Technique

[0002] Image semantic alignment aims to establish an accurate correspondence between similar target objects across images, that is, a point-to-point feature matching relationship between similar target objects in different images. In a concrete scenario, under the premise that the image content is the same or similar, image feature information is used to analyze and quantify the similarity between features, and the matching relationship of feature points on similar objects in the images is then determined. This is a fundamental problem in computer vision and has a wide range of applications in the fie...
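Paragraph [0002] describes establishing point-to-point matches by quantifying the similarity between image features. A common baseline for this, shown below purely as illustration (it is not the method of the invention), is nearest-neighbour matching over a cosine-similarity correlation between two dense feature maps; the feature shapes and the random inputs in the usage line are hypothetical.

```python
# Baseline dense semantic matching: cosine-similarity correlation between two dense
# feature maps followed by nearest-neighbour assignment. Illustration only.
import torch
import torch.nn.functional as F


def dense_nearest_neighbour_matches(feat_a, feat_b):
    """For each cell of feat_a, find the most similar cell of feat_b.

    feat_a, feat_b: (C, H, W) dense descriptors (e.g. CNN activations).
    Returns an (H, W, 2) tensor of matched (row, col) positions in feat_b.
    """
    c, h, w = feat_a.shape
    a = F.normalize(feat_a.reshape(c, -1), dim=0)     # unit-length descriptors
    b = F.normalize(feat_b.reshape(c, -1), dim=0)
    similarity = a.t() @ b                             # (H*W, H*W) cosine similarities
    best = similarity.argmax(dim=1)                    # index of the best match per cell
    rows = torch.div(best, w, rounding_mode="floor")
    cols = best % w
    return torch.stack([rows, cols], dim=1).reshape(h, w, 2)


# Toy usage with random tensors standing in for real feature maps.
matches = dense_nearest_neighbour_matches(torch.randn(64, 16, 16), torch.randn(64, 16, 16))
print(matches.shape)  # torch.Size([16, 16, 2])
```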


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/62, G06K 9/46, G06N 3/04, G06N 3/08
CPC: G06N 3/08, G06V 10/443, G06V 10/751, G06V 2201/07, G06N 3/045, G06F 18/22
Inventor: 吕肖庆, 瞿经纬, 王天乐
Owner: PEKING UNIV