GPU-SIFT based real time binocular vision positioning method

A binocular vision positioning method in the field of navigation, instrumentation, and surveying. It addresses the problem that SIFT feature extraction and description are time-consuming, making real-time image matching difficult to achieve; by using the GPU to speed up the SIFT feature matching process, the method attains strong scalability and practicability, strong environmental applicability, and high positioning accuracy.

Pending Publication Date: 2017-07-07
WUHAN UNIV +1

AI Technical Summary

Problems solved by technology

However, SIFT feature extraction and description consume a great deal of time, making real-time image matching difficult to achieve. GPU-SIFT, which uses the GPU to accelerate SIFT feature extraction, description, and matching, can greatly speed up the SIFT feature matching process and realize real-time SIFT feature matching.




Embodiment Construction

[0033] The present invention is described in further detail below in conjunction with the accompanying drawings and a specific embodiment:

[0034] In this embodiment, the real-time binocular vision positioning method based on GPU-SIFT proposed by the present invention includes the following steps:

[0035] Step 1: use a parallel binocular camera to capture a stereo video of left- and right-eye images during the movement of the robot or mobile platform;

[0036] Step 2: use the GPU-SIFT feature matching algorithm to obtain corresponding matching points between consecutive frames of the video captured during motion;
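The matching stage of Step 2 can be sketched as follows. This is a minimal CPU sketch of nearest-neighbour descriptor matching with Lowe's ratio test, assuming SIFT descriptors are given as 128-dimensional vectors; the function and parameter names are illustrative, and a GPU implementation would parallelize the same distance computation.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Brute-force nearest-neighbour matching with Lowe's ratio test.

    desc_a, desc_b: (N, 128) and (M, 128) arrays of SIFT descriptors
    from two consecutive frames. Returns accepted (i, j) index pairs.
    """
    # Pairwise squared Euclidean distances between all descriptor pairs.
    d2 = (np.sum(desc_a**2, axis=1)[:, None]
          + np.sum(desc_b**2, axis=1)[None, :]
          - 2.0 * desc_a @ desc_b.T)
    d2 = np.maximum(d2, 0.0)          # guard against tiny negative round-off
    matches = []
    for i, row in enumerate(d2):
        j1, j2 = np.argsort(row)[:2]  # best and second-best candidates
        # Ratio test on squared distances: accept only clearly unambiguous matches.
        if row[j1] < (ratio**2) * row[j2]:
            matches.append((i, int(j1)))
    return matches
```

The ratio test discards ambiguous correspondences, which matters for the motion estimation in the next step since a few bad matches can dominate a least-squares solution.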

[0037] Step 3: estimate the camera displacement by solving the motion equations from the coordinate changes of the matched points in image space, or by establishing their three-dimensional coordinates;
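Step 3 can be sketched as follows, assuming a rectified parallel stereo rig with known focal length, principal point, and baseline (all parameter names here are illustrative): matched points are triangulated via the stereo disparity, and the rigid motion between two frames' 3-D point sets is recovered by the SVD-based least-squares method (Arun/Kabsch). This is one common way to realize the step, not necessarily the exact formulation of the invention.

```python
import numpy as np

def triangulate(uv_left, uv_right, f, cx, cy, baseline):
    """Triangulate 3-D points from a rectified parallel stereo pair.

    uv_left, uv_right: (N, 2) pixel coordinates of matched points;
    depth follows from Z = f * baseline / disparity.
    """
    disparity = uv_left[:, 0] - uv_right[:, 0]
    Z = f * baseline / disparity
    X = (uv_left[:, 0] - cx) * Z / f
    Y = (uv_left[:, 1] - cy) * Z / f
    return np.stack([X, Y, Z], axis=1)

def estimate_motion(P_prev, P_curr):
    """Least-squares rigid motion (R, t) with P_curr ~ R @ P_prev + t,
    solved via SVD of the cross-covariance matrix."""
    ca, cb = P_prev.mean(axis=0), P_curr.mean(axis=0)
    H = (P_prev - ca).T @ (P_curr - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```

In practice this estimation would be wrapped in an outlier-rejection loop (e.g. RANSAC), since even ratio-tested matches contain occasional mismatches.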

[0038] Step 4: after obtaining the position and rotation angle of the camera at each moment of travel, combine them with Kalman filtering to recover the camera's travel path over the whole process, realizing real-time binocular vision positioning of the robot or mobile platform.
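The Kalman filtering in Step 4 can be illustrated with a minimal constant-velocity filter over a single pose coordinate; the state model, noise variances, and function name below are illustrative assumptions, and a full implementation would filter position and orientation jointly.

```python
import numpy as np

def kalman_smooth(measurements, dt=1.0, q=1e-3, r=0.1):
    """Constant-velocity Kalman filter over one pose coordinate.

    measurements: per-frame position estimates from the visual odometry;
    q, r: process and measurement noise variances (tuning parameters).
    Returns the filtered position at every frame.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition on (position, velocity)
    H = np.array([[1.0, 0.0]])              # only position is observed
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    out = []
    for z in measurements:
        # Predict: propagate state and covariance through the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update: correct the prediction with the new measurement.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out
```

The filter smooths the frame-to-frame pose estimates, so isolated matching errors perturb the recovered travel path less than they would with raw concatenation of per-frame motions.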



Abstract

The invention relates to a GPU-SIFT based real-time binocular vision positioning method. The method comprises the following steps: 1) a parallel binocular camera acquires a stereo video of left- and right-eye images while a robot or mobile platform moves; 2) feature point matching acquires corresponding matching points between consecutive frames of the video shot during motion; 3) the camera displacement is estimated by solving motion equations from the coordinate changes of the matching points in imaging space, or by establishing their three-dimensional coordinates; 4) the position and rotation angle of the camera at each moment of travel are obtained and combined with Kalman filtering, thereby recovering the camera's travel path over the whole process and realizing real-time binocular vision positioning of the robot or mobile platform. The invention adopts GPU-SIFT to accelerate the SIFT feature matching process and binocular vision positioning to realize real-time positioning of the robot or mobile platform; it achieves high positioning accuracy together with strong extendibility, practicability, and environmental applicability.

Description

[0001] Technical field:

[0002] The invention relates to the technical field of robot vision positioning and navigation, in particular to a real-time binocular vision odometer system based on GPU-SIFT.

[0003] Background technique:

[0004] With the continuous development of robots and computer vision, cameras are increasingly used in the visual positioning and navigation of robots. Robot positioning mainly relies on code discs (wheel encoders), sonar, IMU, GPS, Beidou, laser scanners, RGBD cameras, and binocular cameras for visual positioning. A code disc deduces the distance travelled to achieve positioning, but this positioning method has large errors and is inaccurate on sand, grass, or when the wheels slip. Sonar positioning mainly relies on analyzing emitted and returned ultrasonic signals to determine obstacles for positioning and navigation, but the resolution of sonar is low and its signal is noisy, which easily interferes with positioning. Robots using IMU for positioning...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/00, G01C21/20, G06K9/46
CPC: G01C21/00, G01C21/20, G06V10/462
Inventor: 罗斌, 张云, 林国华, 刘军, 赵青, 王伟, 陈警, 张良培
Owner: WUHAN UNIV