
Feature point matching and screening method for adaptive region motion statistics

A feature point matching and screening method in the field of computer vision. It addresses the low accuracy of existing feature point matching and screening methods, screens matches more accurately, and eases the computations in the stages that follow feature matching.

Pending Publication Date: 2022-08-02
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

[0005] In order to solve the problem that current feature point matching and screening methods have low accuracy, the present invention provides a feature point matching and screening method based on adaptive region motion statistics, comprising the steps described in the embodiments below.



Examples


Embodiment 1

[0054] This embodiment provides a feature point matching and screening method based on adaptive region motion statistics, comprising the following steps:

[0055] Step 1: According to the pixel gray value relationships in the images where the feature points to be matched are located, use the watershed algorithm to divide each image into several regions;

[0056] Step 2: For the regions obtained in Step 1, apply the maximally stable extremal region constraint to retain the regions whose area changes little as the pixel grayscale threshold changes;

[0057] Step 3: Remove the overlapping parts of the maximally stable extremal regions and assign each region a serial number;

[0058] Step 4: Count the number of feature point matches in each region and record the correspondences between regions;

[0059] Step 5: According to the region motion statistics constraint, select the correct region correspondences and retain the feature point matches within them. A high-level sketch of these five steps is given below.
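The five steps above can be prototyped with common computer vision tools. The following Python sketch is illustrative only: it substitutes OpenCV's built-in MSER detector for the patent's watershed-based division with region merging, uses ORB features for the putative matches, and applies a GMS-style threshold (alpha times the square root of the mean matches per region pair) in place of the patent's exact region motion statistics rule. All function names and parameters (label_regions, screen_matches, alpha, and so on) are assumptions, not taken from the patent.

```python
import cv2
import numpy as np

def detect_and_match(img1, img2, n_features=5000):
    """Putative matching: ORB features plus brute-force Hamming matching
    with cross-check (the detector choice is an assumption, not the patent's)."""
    orb = cv2.ORB_create(nfeatures=n_features)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    return kp1, kp2, matches

def label_regions(gray):
    """Steps 1-3 (approximation): stable extremal regions via OpenCV's MSER
    as a stand-in for the patent's watershed-based division; overlapping
    regions are painted in order, so later regions overwrite earlier ones,
    which crudely removes overlaps and yields one serial number per pixel."""
    label_map = np.zeros(gray.shape, np.int32)
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray)
    for serial, pts in enumerate(regions, start=1):
        label_map[pts[:, 1], pts[:, 0]] = serial   # pts are (x, y) columns
    return label_map

def screen_matches(kp1, kp2, matches, labels1, labels2, alpha=6.0):
    """Steps 4-5 (approximation): count matches per (region in image 1,
    region in image 2) pair and keep a match only if its pair's count
    exceeds a GMS-style threshold alpha * sqrt(mean matches per pair)."""
    pair_count, pair_of = {}, []
    for m in matches:
        x1, y1 = kp1[m.queryIdx].pt
        x2, y2 = kp2[m.trainIdx].pt
        pair = (labels1[int(y1), int(x1)], labels2[int(y2), int(x2)])
        pair_of.append(pair)
        pair_count[pair] = pair_count.get(pair, 0) + 1
    mean_per_pair = len(matches) / max(len(pair_count), 1)
    threshold = alpha * np.sqrt(mean_per_pair)
    return [m for m, pair in zip(matches, pair_of)
            if 0 not in pair and pair_count[pair] >= threshold]

# usage sketch (file names are placeholders):
# img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
# img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)
# kp1, kp2, matches = detect_and_match(img1, img2)
# kept = screen_matches(kp1, kp2, matches, label_regions(img1), label_regions(img2))
```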

Embodiment 2

[0061] This embodiment provides a feature point matching and screening method based on adaptive region motion statistics, comprising the following steps:

[0062] Step 1: According to the pixel gray value relationships in the images where the feature points are located, use the watershed method [Vincent L, Soille P. Watersheds in digital spaces: an efficient algorithm based on immersion simulations. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1991, 13(6): 583-598.] to divide each image into regions under different thresholds. The specific steps are as follows:

[0063] Step 11: Initialize the data structures, including a region stack, a history stack, and a boundary heap. The region stack stores the pixel information of image regions under different grayscale thresholds; the history stack records how the region stack evolves as the threshold is raised; the boundary heap stores the set of boundary pixels of the regions (a minimal sketch of these structures is given after this embodiment);

[0064] A mark...
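The embodiment is truncated here, so only the data structures named in Step 11 can be illustrated. Below is a minimal, assumed Python layout for the region stack, history stack, and boundary heap; all class and field names are hypothetical, and the boundary heap is keyed by grayscale value so that the darkest pending boundary pixel is flooded first, in the spirit of Vincent and Soille's immersion simulation.

```python
import heapq
from dataclasses import dataclass, field

@dataclass
class Region:
    """One extremal region being flooded at the current grayscale threshold."""
    threshold: int                                    # gray level the region was last grown at
    pixels: list = field(default_factory=list)        # (row, col) member pixels
    area_history: list = field(default_factory=list)  # (threshold, area) snapshots for the stability test

region_stack = []    # regions currently being grown; the top is the active region
history_stack = []   # records how the region stack changed as the threshold was raised
boundary_heap = []   # min-heap of (gray value, row, col): pending boundary pixels

def push_boundary(gray, row, col):
    """Queue a neighbouring pixel; the heap always yields the darkest one first."""
    heapq.heappush(boundary_heap, (int(gray[row, col]), row, col))

def pop_boundary():
    """Flood the pending boundary pixel with the lowest grayscale value next."""
    return heapq.heappop(boundary_heap)
```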



Abstract

The invention discloses a feature point matching and screening method for adaptive region motion statistics, belonging to the technical field of computer vision. The method uses a maximally stable extremal region division method based on the watershed idea, supplemented by a region merging step, to achieve adaptive division of the image into regions. It counts the number of matched feature points in each corresponding region of the two images and, building on the core idea of grid motion statistics, proposes a region motion statistics constraint scheme in which thresholds are set for regions according to their numbers of feature point matches. Comparisons on different data sets show that, relative to the grid motion statistics (GMS) method, correct feature point matches are screened out more accurately, which benefits the computations that follow feature matching.
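The abstract's "thresholds set for regions with different matching numbers" echoes the acceptance rule of GMS, where a cell pair is kept if its match count exceeds roughly alpha times the square root of the number of nearby features (alpha is about 6 in the GMS paper). A minimal per-region version of that rule is sketched below; the function name and the exact adaptive rule are assumptions, since the invention's rule is not disclosed in this excerpt.

```python
import numpy as np

def adaptive_region_thresholds(matches_per_region, alpha=6.0):
    """Per-region acceptance thresholds in the spirit of GMS: region i, with
    n_i putative matches, gets the threshold alpha * sqrt(n_i), so regions
    with more matches must show proportionally stronger support."""
    n = np.asarray(matches_per_region, dtype=float)
    return alpha * np.sqrt(n)

# example: regions with 4, 25 and 100 putative matches
print(adaptive_region_thresholds([4, 25, 100]))   # -> [12. 30. 60.]
```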

Description

Technical field
[0001] The invention relates to a feature point matching and screening method for adaptive region motion statistics, belonging to the field of computer vision.
Background technique
[0002] Adaptive region motion statistics adaptively divides an image into disjoint regions according to its gray value information; after counting the number of matched feature points between regions, the constraint that the feature points within each region move consistently is used to screen the feature point matches. Feature points are pixels that represent an image or object in an identical, or at least very similar, invariant form in other images containing the same scene or object. Feature point matching finds the correspondences between the feature points obtained in two images according to their descriptors. The initially obtained feature matches contain many false matches, so feature point matching must be screened...
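For context on the "initially obtained feature matching" mentioned above, the sketch below produces putative matches with ORB descriptors and discards the most obvious false matches with Lowe's ratio test. This is only the generic baseline the background refers to, not the invention's adaptive region motion statistics screening; the detector choice and the ratio value 0.75 are assumptions.

```python
import cv2

def putative_matches(img1, img2, ratio=0.75):
    """Generic baseline screening: ORB descriptors, 2-nearest-neighbour
    matching, and Lowe's ratio test to drop the most obvious false matches."""
    orb = cv2.ORB_create(nfeatures=3000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    pairs = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return kp1, kp2, good
```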


Application Information

IPC(8): G06V10/75, G06V10/26
CPC: G06V10/757, G06V10/267
Inventor: 王映辉, 南彬
Owner: JIANGNAN UNIV