Visual odometry implementation method based on fusion of RGB and depth information

A visual odometry implementation method, applied in computing, image data processing, instruments, etc., that achieves the effects of broadening the applicable time and space, removing the visual system's dependence on lighting conditions, and producing accurate and reliable motion estimation results.

Status: Inactive | Publication Date: 2016-09-14
CHINA UNIV OF MINING & TECH

AI Technical Summary

Problems solved by technology

[0010] The present invention is proposed in view of the above-mentioned problems, and its purpose is to provide a method for implementing visual odometry that fuses RGB and Depth information.



Examples


Example Embodiment

[0023] The present invention will be described in further detail below with reference to the drawings and embodiments:

[0024] As shown in Figure 1, a method for implementing visual odometry that fuses RGB and Depth information includes the following steps:

[0025] 1) With time T as the cycle, use a Kinect sensor to collect environmental information and output a time-ordered sequence of RGB images and Depth images;
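As a rough sketch of step 1, the loop below polls an RGB-D sensor once per cycle of length T and stores the time-ordered image pairs. The patent does not name a driver API, so grab_rgbd() is a hypothetical placeholder for whatever call (Kinect SDK, OpenNI binding, etc.) returns a registered RGB frame and Depth frame.

```python
import time

def grab_rgbd():
    """Placeholder for the actual sensor driver call (Kinect SDK, OpenNI
    binding, etc.); assumed to return a registered (rgb, depth) pair of
    image arrays. The patent does not specify this interface."""
    raise NotImplementedError

def acquire_sequence(period_s=0.1, n_frames=100):
    """Collect a time-ordered sequence of (timestamp, rgb, depth) tuples,
    one RGB/Depth pair per acquisition cycle of length T = period_s."""
    sequence = []
    for _ in range(n_frames):
        t0 = time.time()
        rgb, depth = grab_rgbd()   # e.g. H x W x 3 uint8 and H x W uint16 (mm)
        sequence.append((t0, rgb, depth))
        # sleep away whatever remains of the current cycle
        time.sleep(max(0.0, period_s - (time.time() - t0)))
    return sequence
```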

[0026] 2) In the order of the time axis, select the RGB image and the corresponding Depth image in turn, and convert the Depth image into a point cloud in the 3D PCD format;
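A minimal sketch of the Depth-to-point-cloud conversion in step 2, assuming a standard pinhole camera model. The intrinsics FX, FY, CX, CY and the depth scale below are illustrative Kinect-like values, not taken from the patent, and Open3D is used only as one convenient way to write a .pcd file.

```python
import numpy as np
import open3d as o3d

# Illustrative pinhole intrinsics (Kinect-like); the patent does not specify them.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5
DEPTH_SCALE = 1000.0  # raw depth in millimetres -> metres

def depth_to_pcd(depth, path="frame.pcd"):
    """Back-project a depth image (H x W, uint16) into a 3D point cloud
    and save it in PCD format."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64) / DEPTH_SCALE
    valid = z > 0                              # drop pixels with no depth reading
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x[valid], y[valid], z[valid]], axis=-1)

    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(points)
    o3d.io.write_point_cloud(path, cloud)
    return cloud
```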

[0027] 3) Perform brightness, color-shift and blur detection on the selected RGB image to determine the image quality β: compute the brightness parameter L, the color-shift parameter C and the blur-degree parameter F. If L = 1, C = 1 and F = 1, the RGB image quality is good and β = 1; otherwise the RGB image quality is poor and β = 0;
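Step 3 does not spell out the detectors or thresholds, so the sketch below uses common stand-ins: mean gray level for brightness, the mean chroma offset in the CIE Lab plane for color shift, and the variance of the Laplacian for blur. All threshold values are assumptions for illustration, not figures from the patent.

```python
import cv2
import numpy as np

def quality_flags(rgb,
                  bright_range=(40, 220),   # assumed acceptable mean gray level
                  cast_max=12.0,            # assumed max mean |a*|, |b*| offset
                  blur_min=100.0):          # assumed min Laplacian variance
    """Return (L, C, F, beta) for one RGB frame; each flag is 1 if acceptable."""
    gray = cv2.cvtColor(rgb, cv2.COLOR_RGB2GRAY)

    # Brightness parameter L: mean gray level must fall inside an acceptable band.
    mean_gray = float(gray.mean())
    L = int(bright_range[0] <= mean_gray <= bright_range[1])

    # Color-shift parameter C: average chroma offset from neutral in Lab space.
    lab = cv2.cvtColor(rgb, cv2.COLOR_RGB2LAB).astype(np.float32)
    da = abs(lab[..., 1].mean() - 128.0)
    db = abs(lab[..., 2].mean() - 128.0)
    C = int(max(da, db) <= cast_max)

    # Blur parameter F: variance of the Laplacian (low variance means blurry).
    F = int(cv2.Laplacian(gray, cv2.CV_64F).var() >= blur_min)

    beta = int(L == 1 and C == 1 and F == 1)
    return L, C, F, beta
```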

[0028] 4) According to the judgment ...
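Step 4 is cut off in this extract, but the abstract indicates that the judgment β steers the system between the RGB-image-matching path and the Depth/point-cloud-matching path. The sketch below shows one plausible reading of that switch; estimate_motion_rgb and estimate_motion_depth are hypothetical stand-ins for the two matching pipelines, not functions defined by the patent.

```python
def estimate_motion(beta, rgb_pair, cloud_pair):
    """Hypothetical routing: use 2D RGB feature matching when the RGB frame
    is judged usable (beta == 1), otherwise fall back to 3D point-cloud
    matching on the Depth data."""
    if beta == 1:
        return estimate_motion_rgb(*rgb_pair)      # e.g. feature matching + PnP
    return estimate_motion_depth(*cloud_pair)      # e.g. ICP between point clouds

def estimate_motion_rgb(prev_rgb, curr_rgb):
    raise NotImplementedError  # placeholder for the RGB-matching pipeline

def estimate_motion_depth(prev_cloud, curr_cloud):
    raise NotImplementedError  # placeholder for the Depth-matching pipeline
```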



Abstract

The invention discloses a visual odometry implementation method based on the fusion of RGB and depth information. Existing visual odometry methods rely on a monocular or binocular camera, which can obtain only the RGB information of the environment and cannot obtain its three-dimensional information directly, so their application environments are limited and their accuracy is constrained. A positioning method based on RGB image matching offers mature technology and fast processing, while a positioning method based on Depth image matching is highly robust to environmental changes. Combining the advantages of both, the invention uses an RGB-D sensor to obtain the RGB and Depth information of a scene simultaneously and provides a visual odometry implementation method based on the fusion of RGB and depth information. Information in both 2D and 3D modes is used rationally; the visual system's dependence on illumination conditions is removed; the accuracy, robustness and practicability of the odometry system are substantially improved; and the application time and space of the mobile robot are expanded.

Description

Technical field

[0001] The invention belongs to the field of autonomous navigation and positioning of mobile robots, and in particular relates to a method for implementing visual odometry that fuses RGB and Depth information.

Background technique

[0002] The odometer plays a vital role in robot navigation and positioning. Visual odometry is a method that relies on visual information to measure the distance and direction of a robot's movement. It avoids the cumulative error caused by driving-wheel idling or slipping and the measurement error caused by inertial-navigation drift. Because it relies on visual input, it carries a large amount of information, consumes little power, and requires no prior information about the scene or the motion, making it an effective supplement to traditional methods.

[0003] At present, visual odometry mainly relies on the image sequences obtained by monocular or binocular cameras and obtains the motion parameters o...


Application Information

IPC (8): G06T7/00, G06T7/20, G06T7/40
CPC: G06T2207/30241
Inventor: 缪燕子, 许红盛, 金慧杰, 金鑫, 卜淑萍, 李晓东, 周笛
Owner: CHINA UNIV OF MINING & TECH