Heterogeneous twin region selection network and image matching method based on same

A region matching and twin network technology, applied in the field of computer vision. It addresses the problems that existing image matching algorithms cannot adapt to changes in imaging viewpoint and scale, have weak scene adaptability and anti-interference ability, and achieve a low matching success rate, so as to improve anti-interference ability, raise the matching success rate, and reduce labor cost.

Active Publication Date: 2019-09-17
HUAZHONG UNIV OF SCI & TECH


Problems solved by technology

[0004] Aiming at the defects of the prior art, the purpose of the present invention is to solve the technical problems that prior-art image matching algorithms are largely unable to adapt to changes in imaging viewpoint and scale, have weak scene adaptability and anti-interference ability, and suffer from a low matching success rate.




Embodiment Construction

[0050] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0051] Sample generation

[0052] (1) Prepare n different scenes {P_1, ..., P_i, ..., P_n}. The same scene P_i contains several visible-light images {P_i1, ..., P_ij, ..., P_iM}, where P_ij is the j-th image of scene P_i.

[0053] (2) For each image of the same scene, manually select regions in the scene {P_ij1, ..., P_ijk, ..., P_ijs}, where P_ijk is the k-th region of the j-th image of scene P_i. Across different images of the same scene, the selected regions differ in size, brightness, and viewing angle, and may be somewhat deformed (a sketch of how such samples could be paired follows these steps).

[0054] (3) In the same scene P_i, several visible...
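The steps above describe how template/search training samples arise from manually annotated regions. Below is a minimal, hypothetical sketch of how such pairs could be assembled; the in-memory annotation layout, field names, and helper functions are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from itertools import combinations
from typing import Dict, List, Tuple

# Hypothetical annotation record: region k of image j in scene i,
# stored as a pixel box (x, y, w, h). Names are illustrative only.
Box = Tuple[int, int, int, int]

@dataclass
class SceneAnnotation:
    scene_id: str
    # image_id -> {region_id -> box}; the same region appears in several
    # images of the scene, but with different size / brightness / viewpoint.
    regions_per_image: Dict[str, Dict[str, Box]]

def make_training_pairs(scene: SceneAnnotation) -> List[dict]:
    """Pair the same region across two different images of one scene:
    the crop from one image serves as the template, the other image is
    the map to be matched, and its box is the ground-truth answer."""
    pairs = []
    for img_a, img_b in combinations(scene.regions_per_image, 2):
        shared = set(scene.regions_per_image[img_a]) & set(scene.regions_per_image[img_b])
        for region_id in shared:
            pairs.append({
                "template_image": img_a,
                "template_box": scene.regions_per_image[img_a][region_id],
                "search_image": img_b,
                "target_box": scene.regions_per_image[img_b][region_id],
            })
    return pairs

if __name__ == "__main__":
    demo = SceneAnnotation(
        scene_id="P1",
        regions_per_image={
            "P11": {"r1": (10, 20, 64, 48), "r2": (100, 80, 32, 32)},
            "P12": {"r1": (14, 25, 70, 52), "r2": (95, 78, 30, 34)},
        },
    )
    for pair in make_training_pairs(demo):
        print(pair)
```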



Abstract

The invention discloses a heterogeneous twin region selection network and an image matching method based on the same, and belongs to the field of computer vision. The system comprises a heterogeneous twin network and a region matching network connected in series: the heterogeneous twin network extracts a feature map of the template image and a feature map of the image to be matched, and the region matching network obtains a region matching result from these two feature maps. The heterogeneous twin network comprises a sub-network A and a sub-network B connected in parallel; each sub-network comprises a feature extraction module, a feature fusion module and a maximum pooling module connected in series. The two sub-networks have identical modules and differ only in the convolution kernels of the first convolution layer of their feature extraction modules. Applied to image matching, the heterogeneous twin region selection network accepts template images and images to be matched of non-fixed scale, makes full use of multi-layer image features, effectively improves the performance of the matching method, and increases the matching success rate and speed.
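As a rough illustration of the architecture described in the abstract, the sketch below (in PyTorch) shows two parallel sub-networks, each a feature extraction module, a feature fusion module and a max pooling module in series, identical except for the first-layer convolution kernel. All channel counts, kernel sizes and the fusion scheme are assumptions for illustration only; the abstract does not fix them, and the downstream region matching network is not sketched here.

```python
import torch
import torch.nn as nn

class SubNetwork(nn.Module):
    """One branch of the heterogeneous twin network: feature extraction,
    feature fusion, and max pooling in series. Channel counts and kernel
    sizes are illustrative assumptions, not taken from the patent."""
    def __init__(self, first_kernel: int):
        super().__init__()
        pad = first_kernel // 2
        # Only the first convolution differs between the two branches.
        self.conv1 = nn.Conv2d(3, 32, first_kernel, padding=pad)
        self.conv2 = nn.Conv2d(32, 64, 3, padding=1)
        self.conv3 = nn.Conv2d(64, 64, 3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        # Feature fusion: concatenate shallow and deep features, then mix.
        self.fuse = nn.Conv2d(32 + 64, 64, 1)
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        f1 = self.relu(self.conv1(x))          # shallow features
        f2 = self.relu(self.conv2(f1))
        f3 = self.relu(self.conv3(f2))         # deep features
        fused = self.relu(self.fuse(torch.cat([f1, f3], dim=1)))
        return self.pool(fused)

class HeterogeneousTwinNetwork(nn.Module):
    """Two parallel sub-networks with identical structure except for the
    first-layer convolution kernel; branch A embeds the template image,
    branch B embeds the image to be matched (kernel sizes are assumed)."""
    def __init__(self):
        super().__init__()
        self.branch_a = SubNetwork(first_kernel=3)   # template branch
        self.branch_b = SubNetwork(first_kernel=5)   # search branch

    def forward(self, template, search):
        return self.branch_a(template), self.branch_b(search)

if __name__ == "__main__":
    net = HeterogeneousTwinNetwork()
    t = torch.randn(1, 3, 64, 64)      # template image (variable size)
    s = torch.randn(1, 3, 256, 256)    # image to be matched
    ft, fs = net(t, s)
    print(ft.shape, fs.shape)
```

The feature maps from the two branches would then be passed to the region matching network described in the abstract to produce the region matching result.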

Description

Technical field

[0001] The invention belongs to the field of computer vision, and more specifically relates to a heterogeneous twin region selection network and an image matching method based on the network.

Background technique

[0002] Image matching refers to the process of finding, in one image (or a batch of images), images or image regions (sub-images) similar to a given scene-region image. Usually, the image of the known scene region is called the template image, and the sub-image that may correspond to it in the image to be searched is called the scene-region image to be matched against the template. Image matching thus establishes correspondences between two or more images of the same scene taken at different times or from different viewpoints. Specific applications include target or scene recognition, recovering 3D structure from multiple images, stereo correspondence, and motion tracking.

[0003] At present, most image matching algorithms only use shallow artificial features, such as grayscale feature...
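For context on the shallow, hand-crafted features mentioned in paragraph [0003], the sketch below shows the kind of grayscale normalized cross-correlation template matching that such methods rely on (using OpenCV; the synthetic data is purely illustrative). Approaches of this kind are what the invention contrasts itself with, since they degrade under viewpoint and scale change.

```python
import cv2
import numpy as np

# Build a synthetic grayscale scene and cut a template out of it so the
# example is self-contained (no file paths are assumed).
scene = (np.random.rand(240, 320) * 255).astype(np.uint8)
template = scene[60:110, 100:170].copy()

# Normalized cross-correlation on raw grayscale values: the shallow
# feature the background section refers to.
response = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(response)

x, y = max_loc
h, w = template.shape
print(f"best match at x={x}, y={y}, score={max_val:.3f}, box={w}x{h}")
```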


Application Information

Patent Type & Authority: Applications (China)
IPC (8): G06K9/62, G06K9/46, G06N3/04, G06N3/08
CPC: G06N3/08, G06V10/40, G06N3/045, G06F18/24
Inventor: 杨卫东, 蒋哲兴, 王祯瑞, 姜昊, 王公炎
Owner: HUAZHONG UNIV OF SCI & TECH