
Matching method of feature points in robot vision navigation positioning image

A feature point matching method for robot vision navigation and positioning images, applied in image analysis, image data processing, and graphic image conversion. It addresses problems such as the lack of height information in captured two-dimensional images, the strong interference from similar small regions, and the resulting mismatches, with the effect of expanding the screening range of valid feature points.

Active Publication Date: 2017-08-29
INST OF ELECTRONICS CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

For a robot moving inside a building, feature points on the ceiling are usually the most effective navigation landmarks. Once the feature points seen by the robot are matched to their corresponding points in the map, the robot's position and heading on the map can be computed conveniently. However, two technical problems remain to be solved urgently. First, the captured image is two-dimensional and lacks height information, so ceiling feature points are difficult to distinguish from feature points at other heights. Second, most current feature point extraction and matching algorithms rely on the color distribution in the neighborhood of candidate points; indoor scenes contain many similar-looking small regions, which strongly interfere with feature point matching and easily cause mismatches.




Detailed Description of Embodiments

[0026] The invention provides a matching method for feature points in robot vision navigation and positioning images. Exploiting the affine-transformation invariance of captured image feature points between different frames, the method screens out mismatched feature points according to whether the lengths and included angles of the lines connecting feature pairs are equal in the two frames, thereby achieving accurate matching of valid feature points. A threshold test is used to judge whether the connection lengths and included angles are equal, which broadens the screening range of valid feature points and also makes the method applicable to blurred or pixelated captured scenes.
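The screening described in this paragraph can be sketched in code. The following is an illustrative reconstruction, not the patent's actual implementation: the function name, tolerances, and voting threshold are assumptions, and for simplicity the angle check compares the absolute orientation of each connecting line (i.e., it assumes the inter-frame motion is close to a pure translation; a rotating camera would compare angle differences between segment pairs instead).

```python
import numpy as np

def filter_matches(pts1, pts2, len_tol=2.0, ang_tol=np.deg2rad(3.0)):
    """Screen roughly matched feature points by geometric consistency.

    pts1, pts2: (N, 2) arrays of matched feature coordinates in frame 1
    and frame 2. A match i is kept if, for enough partner matches j, the
    segment p_i->p_j in frame 1 has nearly the same length and orientation
    as q_i->q_j in frame 2 (threshold test on length and included angle).
    Returns the indices of matches judged valid.
    """
    n = len(pts1)
    votes = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            d1 = pts1[j] - pts1[i]
            d2 = pts2[j] - pts2[i]
            # threshold test on connection-line length
            len_ok = abs(np.linalg.norm(d1) - np.linalg.norm(d2)) < len_tol
            # threshold test on connection-line angle (wrapped to [-pi, pi])
            a1 = np.arctan2(d1[1], d1[0])
            a2 = np.arctan2(d2[1], d2[0])
            ang_ok = abs((a1 - a2 + np.pi) % (2 * np.pi) - np.pi) < ang_tol
            if len_ok and ang_ok:
                votes[i] += 1
                votes[j] += 1
    # a correct match is consistent with most other correct matches;
    # a mismatch agrees with almost none of them
    return np.where(votes >= max(1, (n - 1) // 2))[0]
```

With four correctly matched points plus one mismatch, the mismatch collects no consistency votes and is screened out, while blur or pixel discretization that shifts points by less than the tolerances leaves the valid matches intact.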

[0027] To make the object, technical solution, and advantages of the present invention clearer, the invention is described in further detail below with reference to specific embodiments and the accompanying drawings.

[0028] The working principle ...



Abstract

The invention provides a method for matching feature points in robot vision navigation and positioning images. The method comprises: extracting feature points from two frames and roughly matching them; and filtering the matched feature points using the affine-transformation invariance of their spatial geometry. Because the feature points of the captured ceiling image are affine-invariant between frames, mismatched feature points can be filtered out by checking whether the lengths and corresponding angles of the lines connecting feature pairs are equal in the two frames. Feature points on the ceiling can thus be screened, accurate matching achieved, and mismatched interference points removed. Furthermore, when matching feature points across frames, equality of length and angle is judged by a threshold test, which enlarges the screening range of valid feature points beyond ceiling points alone. The method is also applicable to positioning under blurred shooting, pixel discretization, and similar conditions.
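The rough-matching stage mentioned in the abstract can be sketched as a nearest-neighbour descriptor search. This is a generic illustration, not the patent's specific algorithm: the function name and the ratio test are assumptions (the ratio test is a standard heuristic for rejecting ambiguous matches), and at least two descriptors are assumed in the second frame.

```python
import numpy as np

def rough_match(desc1, desc2, ratio=0.8):
    """Coarse feature matching by nearest-neighbour descriptor distance.

    desc1: (N, D) and desc2: (M, D) float descriptor arrays for the two
    frames (M >= 2). Returns (i, j) index pairs: feature i in frame 1
    matched to feature j in frame 2. A match is accepted only if its best
    distance is clearly smaller than the second-best (ratio test), which
    discards ambiguous candidates from similar-looking regions. The
    geometric screening stage then removes the remaining mismatches.
    """
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

The output pairs are exactly the `pts1`/`pts2` correspondences that the subsequent length-and-angle threshold test operates on.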

Description

Technical Field

[0001] The invention belongs to the field of machine vision positioning and relates to a method for matching feature points in robot vision navigation and positioning images.

Background

[0002] A mobile robot is a comprehensive system integrating environmental perception, dynamic decision-making and planning, and behavior control and execution, and is widely used in military, industrial, and civilian fields. Positioning is an important basis for a mobile robot's behavior control, execution, and environmental perception. At present, common robot positioning methods fall into five categories: ultrasonic navigation and positioning, visual navigation and positioning, GPS global positioning, light-emission navigation and positioning, and simultaneous localization and mapping (SLAM).

[0003] In robot vision navigation and positioning technology, accurate autonomous positioning is the ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T3/00; G06T7/30; G06T7/60; G06T7/73
CPC: G06T7/30; G06T7/60; G06T7/73; G06T3/02
Inventors: 曹天扬, 蔡浩原, 李彤, 方东明, 王金戈, 刘昶
Owner: INST OF ELECTRONICS CHINESE ACAD OF SCI