
Robot vision positioning method based on feature point detection and mismatching screening

A feature point detection and robot vision technology, applied in the fields of instruments, image analysis, and image enhancement. It addresses the problem of slow algorithm computation and achieves faster detection, easier pose calculation, and guaranteed matching accuracy.

Inactive Publication Date: 2021-03-26
ANHUI UNIVERSITY
View PDF · 7 Cites · 5 Cited by

AI Technical Summary

Problems solved by technology

However, that scheme extracts image features with SURF feature points, which are computationally expensive and slow down the algorithm. In addition, the RANSAC algorithm treats all matched point pairs equally, drawing random samples uniformly from the entire set, so the data used to build the RANSAC model is random and uncertain.
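The uniform-sampling behaviour criticised above can be seen in a minimal sketch of plain RANSAC. This is an illustrative line-fitting example, not the patent's method; the iteration count, tolerance, and test data are assumptions:

```python
import random

def ransac_line(points, iters=200, tol=1.0, seed=0):
    """Fit a 2-D line y = a*x + b with plain RANSAC.

    Every point pair is treated equally: each iteration draws a random
    sample uniformly from the entire set, which is exactly the behaviour
    the text above criticises as random and uncertain.
    """
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)  # uniform random sample
        if x1 == x2:
            continue  # degenerate pair, cannot define a slope
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# 18 exact points on y = 2x + 1 plus two gross outliers
pts = [(x, 2 * x + 1) for x in range(18)] + [(5, 40.0), (9, -30.0)]
(a, b), inliers = ransac_line(pts)
```

Because samples are drawn from the whole set, iterations that happen to pick an outlier are wasted; weighting schemes such as PROSAC address this by sampling promising pairs first.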

Method used



Embodiment Construction

[0072] To make the purpose, technical solutions, and advantages of the present invention clearer, the technical solutions of the present invention are described clearly and completely below in conjunction with specific embodiments and with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0073] As shown in Figure 1, this embodiment provides a robot vision positioning method based on feature point detection and mismatch screening, including the following steps:

[0074] Step A: Construct a Gaussian image pyramid for the reference image and the image to be matched respectively;

[0075] The method of constructing a Gaussian pyramid is as follows: us...
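The passage above (truncated in the source) describes building a Gaussian image pyramid. As a hedged sketch, with a 3-tap binomial kernel standing in for the patent's unspecified Gaussian kernel, each level is blurred and then downsampled by a factor of two:

```python
def gaussian_blur_1d(row, kernel=(0.25, 0.5, 0.25)):
    # 3-tap binomial approximation of a Gaussian; edges are clamped
    n = len(row)
    out = []
    for i in range(n):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, n - 1)]
        out.append(kernel[0] * left + kernel[1] * row[i] + kernel[2] * right)
    return out

def gaussian_blur(img):
    # separable 2-D blur: filter rows, then columns
    rows = [gaussian_blur_1d(r) for r in img]
    cols = [gaussian_blur_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def build_pyramid(img, levels=4):
    # level 0 is the original image; each next level is blurred
    # and downsampled by 2 in both dimensions
    pyramid = [img]
    for _ in range(levels - 1):
        blurred = gaussian_blur(pyramid[-1])
        pyramid.append([r[::2] for r in blurred[::2]])
    return pyramid

img = [[float((x + y) % 7) for x in range(16)] for y in range(16)]
pyr = build_pyramid(img, levels=3)  # 16x16 -> 8x8 -> 4x4
```

Detecting FAST corners on every level of such a pyramid is what gives the otherwise scale-less FAST features a degree of scale invariance, as the abstract notes.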



Abstract

The invention provides a robot vision positioning method based on feature point detection and mismatch screening. The method comprises the following steps: A, constructing a Gaussian image pyramid; B, performing FAST feature point detection on each layer of the pyramid using a quadrilateral model; C, constructing a BRIEF descriptor for each extracted feature point based on neighborhood pixel values; D, performing brute-force matching based on the BRIEF descriptors; and E, screening out mismatched feature points and, based on the screening result, mapping the matched feature points back to the original reference image and the image to be matched. Detecting feature points with a quadrilateral model improves detection speed; constructing a Gaussian image pyramid overcomes the FAST feature points' lack of scale invariance; and after brute-force matching, mismatched feature points are screened out and correct matches are retained, which ensures matching accuracy and facilitates pose calculation.
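Steps C and D above can be sketched roughly as follows. The sampling pattern, patch size, and 32-bit descriptor length are illustrative assumptions, not the patent's parameters:

```python
import random

def brief_descriptor(img, kp, pairs):
    """BRIEF-style binary descriptor: one bit per sampled offset pair,
    set when the first neighbourhood pixel is darker than the second."""
    y, x = kp
    bits = 0
    for (dy1, dx1), (dy2, dx2) in pairs:
        bits = (bits << 1) | (1 if img[y + dy1][x + dx1] < img[y + dy2][x + dx2] else 0)
    return bits

def hamming(a, b):
    # number of differing bits between two binary descriptors
    return bin(a ^ b).count("1")

def brute_force_match(descs_a, descs_b):
    # brute-force matching: compare every descriptor in A against every
    # descriptor in B and keep the nearest neighbour by Hamming distance
    matches = []
    for i, da in enumerate(descs_a):
        j = min(range(len(descs_b)), key=lambda k: hamming(da, descs_b[k]))
        matches.append((i, j, hamming(da, descs_b[j])))
    return matches

rng = random.Random(1)
# 32 random offset pairs inside a 9x9 patch (illustrative sampling pattern)
pairs = [((rng.randint(-4, 4), rng.randint(-4, 4)),
          (rng.randint(-4, 4), rng.randint(-4, 4))) for _ in range(32)]
img = [[rng.randint(0, 255) for _ in range(20)] for _ in range(20)]
kps = [(6, 6), (10, 12), (14, 8)]
descs = [brief_descriptor(img, kp, pairs) for kp in kps]
matches = brute_force_match(descs, descs)  # self-matching yields distance 0
```

Step E, the mismatch screening that replaces plain RANSAC's uniform sampling, would then filter this raw nearest-neighbour match list before pose calculation.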

Description

technical field

[0001] The invention relates to the technical field of image detection, and in particular to a robot vision positioning method based on feature point detection and mismatch screening.

Background technique

[0002] In the field of computer vision image detection, image feature extraction and matching are key links in robot vision positioning. Robot vision positioning is often used in Simultaneous Localization and Mapping (SLAM), in which a mobile carrier equipped with specific sensors, such as depth cameras, lidars, and inertial measurement units, starts from some point in an unknown environment and, during its movement, repeatedly uses the image and depth information collected by the sensors to estimate its own trajectory and incrementally build a map of the surrounding environment. A SLAM system includes visual odometry, back-end optimization, loop closure detection, and mapping...

Claims


Application Information

IPC(8): G06T7/73, G06T7/246, G06T5/00, G06T5/20
CPC: G06T7/74, G06T7/248, G06T5/20, G06T2207/20024, G06T2207/20016, G06T2207/30244, G06T5/70
Inventor: 樊渊, 郭予超, 李腾, 董翔, 宋程
Owner ANHUI UNIVERSITY