Positioning and navigation method and device based on lidar and binocular camera

A technology combining laser radar and a binocular camera, applied in the fields of surveying and navigation, navigation, and measuring devices, achieving fast matching and high precision

Active Publication Date: 2019-10-29
SICHUAN UNIV
View PDF · 6 Cites · 31 Cited by

AI Technical Summary

Problems solved by technology

For special-purpose robots to adapt to a wide variety of environments, a single sensor has increasingly become a bottleneck in terms of detection range and detection accuracy.



Embodiment Construction

[0024] The present invention will be further described below in conjunction with the accompanying drawings:

[0025] As shown in Figure 1, the positioning and navigation method based on the lidar 3 and the binocular camera 2 includes the steps:

[0026] Acquire camera images through the binocular stereo camera, and process the image information from the binocular stereo camera to obtain a pose;

[0027] Acquire radar images through the lidar 3, and process the image information from the lidar 3 to obtain a pose;

[0028] The pose from the binocular stereo camera images and the pose from the lidar 3 are optimized and fused to obtain a dense point cloud model and a sparse point cloud model of the environment, used for navigation and precise positioning respectively.
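The fusion step above can be sketched as a simple weighted blend of the two pose estimates. This is an illustrative sketch only, not the patent's optimization method (which is not detailed in this excerpt); the function name `fuse_poses`, the (w, x, y, z) quaternion convention, and the weight `w_cam` are assumptions:

```python
import numpy as np

def fuse_poses(q_cam, t_cam, q_lidar, t_lidar, w_cam=0.5):
    """Fuse two pose estimates, each given as a unit quaternion
    (w, x, y, z) plus a translation vector.
    Translation: weighted average. Rotation: normalized weighted
    quaternion blend (a valid approximation when the two rotation
    estimates are close to each other)."""
    q_cam = np.asarray(q_cam, dtype=float)
    q_lidar = np.asarray(q_lidar, dtype=float)
    # q and -q encode the same rotation; align signs before blending.
    if np.dot(q_cam, q_lidar) < 0:
        q_lidar = -q_lidar
    q = w_cam * q_cam + (1.0 - w_cam) * q_lidar
    q /= np.linalg.norm(q)
    t = w_cam * np.asarray(t_cam, dtype=float) \
        + (1.0 - w_cam) * np.asarray(t_lidar, dtype=float)
    return q, t
```

In a real system the blend weights would come from the covariance of each estimator, and the fusion would typically run inside a graph optimizer rather than a one-shot average.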

[0029] The steps for acquiring the pose from the binocular stereo camera images include:

[0030] 1. Extract feature points from the images captured by the left and right cameras, and complete the matching of the feature points of the l...
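The left-right feature matching in step 1 can be illustrated with a minimal nearest-neighbor descriptor matcher using Lowe's ratio test. This is a generic NumPy sketch, not the patent's specific matcher; the name `match_features` and the `ratio` threshold are assumptions:

```python
import numpy as np

def match_features(desc_left, desc_right, ratio=0.8):
    """Match each left-image descriptor to its nearest right-image
    descriptor (Euclidean distance), keeping a match only when the
    best distance beats the second-best by the given ratio
    (Lowe's ratio test, which rejects ambiguous matches)."""
    desc_left = np.asarray(desc_left, dtype=float)
    desc_right = np.asarray(desc_right, dtype=float)
    matches = []
    for i, d in enumerate(desc_left):
        dists = np.linalg.norm(desc_right - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

Real stereo pipelines would also exploit the epipolar constraint (for rectified pairs, matches lie on the same image row), which this sketch omits for brevity.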



Abstract

The invention relates to a positioning and navigation method based on a lidar and a binocular camera, comprising the steps of: acquiring a camera image through a binocular stereo camera and processing the image information from the binocular stereo camera to acquire a pose; acquiring a radar image through a lidar and processing the image information from the lidar to acquire a pose; and optimizing and fusing the two poses to obtain a dense point cloud model and a sparse point cloud model of the environment, for navigation and precise positioning respectively. A positioning and navigation device based on the lidar and the binocular camera includes an image module, a lidar module, a pose optimization module, a binocular stereo camera, and a lidar. The device uses the pose optimization module to optimize and fuse the pose from the binocular stereo camera and the pose from the lidar, combines the binocular stereo camera with the multi-line lidar to achieve high-precision pose estimation and a high-precision dense point cloud map, and can be used directly for navigation and positioning of unmanned vehicles.
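The dense point cloud map mentioned above is typically derived from stereo disparity via the standard pinhole relation depth = focal × baseline / disparity. The sketch below illustrates that general relation only; the function name and the numeric values in the usage note are assumptions, not taken from the patent:

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a stereo disparity map (in pixels) to a depth map
    (in meters) using the pinhole relation:
        depth = focal * baseline / disparity.
    Pixels with zero or negative disparity are marked as infinite
    depth (no valid match)."""
    disparity = np.asarray(disparity, dtype=float)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```

For example, with an assumed 700 px focal length and 0.12 m baseline, a 10 px disparity corresponds to a depth of 8.4 m; the relation also shows why stereo depth accuracy degrades with distance, since depth varies as the inverse of disparity.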

Description

technical field

[0001] The invention belongs to the technical field of positioning methods for unmanned vehicles, and in particular relates to a positioning and navigation method and device based on a laser radar and binocular cameras.

Background technique

[0002] With the diversification of sensor types and the enhancement of computing power, the scope of application of various types of special-purpose robots has become wider. For these robots to adapt to a wide variety of environments, a single sensor has increasingly become a bottleneck in terms of detection range and detection accuracy. The fusion of multiple sensors can improve the detection accuracy of robots in complex scenes.

[0003] In order to solve the above problems, we have developed a positioning and navigation method and device based on lidar and binocular cameras.

Contents of the invention

[0004] The object of the present invention is to provide a positioning and navigation method and device based on las...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01S17/02, G01S17/93, G01C21/00, G01C11/08, G06T7/73
CPC: G01C21/005, G01C11/08, G06T7/73, G06T2207/10028, G01S17/86, G01S17/931
Inventor: 张轶, 张钊
Owner: SICHUAN UNIV