
Road drivable area detection method based on fusion of monocular vision and lidar

A lidar and monocular vision technology applied in the field of intelligent transportation. It addresses the problems of sparse point cloud data, vulnerability to lighting conditions, and high time consumption, and achieves strong robustness, high algorithmic efficiency, and a narrowed processing scope.

Active Publication Date: 2019-05-03
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Among them, the monocular vision method considers only the visual information of the scene and is easily affected by lighting and weather conditions; the stereo vision method consumes a great deal of time in 3D reconstruction and is not suitable for practical applications; and the lidar method suffers from the sparsity of point cloud data.




Embodiment Construction

[0042] Referring to figure 1, an existing improved linear iterative clustering method combined with edge segmentation is used to perform superpixel segmentation on the picture collected by the camera, dividing the picture into N superpixels. Each superpixel p_c = (x_c, y_c, z_c, 1)^T contains several pixels, where x_c, y_c, z_c denote the average position, in the camera coordinate system, of all pixels in the superpixel; at the same time, the RGB values of these pixels are unified to the average RGB of all pixels in the superpixel. Existing calibration techniques are then reused to obtain the rotation matrix R and the translation matrix t, and the conversion matrix T is obtained from them according to formula (1):

[0043]

    T = [ R   t ]
        [ 0   1 ]          (1)
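Before continuing with the coordinate conversion, the following Python sketch roughly illustrates the superpixel step described in [0042]. It is an assumption-laden stand-in: it uses the standard SLIC implementation from scikit-image rather than the patent's improved, edge-aware clustering, and it computes only image-plane centroids and mean RGB values (the camera-frame averages x_c, y_c, z_c would additionally require the depth supplied by the projected lidar points).

```python
# Hypothetical sketch of the superpixel step: standard SLIC from scikit-image
# stands in for the patent's improved, edge-aware linear iterative clustering.
import numpy as np
from skimage.io import imread
from skimage.segmentation import slic

def superpixel_statistics(image_path, n_superpixels=500):
    """Segment the camera image into superpixels and return, per superpixel,
    the mean RGB colour and the mean (row, col) pixel position."""
    img = imread(image_path)                      # H x W x 3, uint8
    labels = slic(img, n_segments=n_superpixels,  # SLIC superpixel labels
                  compactness=10, start_label=0)

    stats = []
    for sp_id in np.unique(labels):
        mask = labels == sp_id
        mean_rgb = img[mask].mean(axis=0)         # unified RGB of the superpixel
        rows, cols = np.nonzero(mask)
        mean_pos = (rows.mean(), cols.mean())     # image-plane centroid
        stats.append({"id": int(sp_id), "rgb": mean_rgb, "centroid": mean_pos})
    return labels, stats
```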

[0044] The conversion matrix T, built from the rotation matrix R and the translation matrix t, establishes the conversion relationship between the two coordinate systems, as in formula (2):

[0045]

    p_c = T · p_l          (2)

[0046] Each point p_l = (x_l, y_l, z_l, 1)^T obtained by the lidar is projected onto the...
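A minimal sketch of the coordinate conversion in [0043]-[0046], assuming a calibrated 3x3 rotation matrix R, a translation vector t, and a camera intrinsic matrix K (the intrinsics are not named in the excerpt and are an assumption of this sketch): the homogeneous conversion matrix of formula (1) is assembled from R and t, each lidar point p_l = (x_l, y_l, z_l, 1)^T is mapped into the camera frame as in formula (2), and points in front of the camera are then projected onto the image plane.

```python
# Hypothetical sketch: lidar-to-camera conversion and pinhole projection.
# R, t come from extrinsic calibration; K is the camera intrinsic matrix
# (an assumption of this sketch, not named in the patent excerpt).
import numpy as np

def build_conversion_matrix(R, t):
    """Formula (1): stack R (3x3) and t (3,) into a 4x4 homogeneous matrix T."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def project_lidar_to_image(points_lidar, R, t, K):
    """Formula (2) plus pinhole projection: p_c = T @ p_l, then (u, v) pixel coords."""
    T = build_conversion_matrix(R, t)
    n = points_lidar.shape[0]
    p_l = np.hstack([points_lidar, np.ones((n, 1))])   # homogeneous (x_l, y_l, z_l, 1)
    p_c = (T @ p_l.T).T                                 # camera-frame coordinates
    in_front = p_c[:, 2] > 0                            # keep points in front of the camera
    p_c = p_c[in_front]
    uv = (K @ p_c[:, :3].T).T                           # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]                         # normalise to pixel coordinates
    return uv, p_c[:, :3]
```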


Abstract

The invention discloses a road travelable region detection method based on the fusion of monocular vision and laser radar, and belongs to the field of intelligent transportation. Existing road detection methods for unmanned vehicles are mainly based on monocular vision, stereo vision, laser sensors and multi-sensor fusion, and suffer from defects such as low robustness to illumination, complex stereo matching, sparse laser point clouds and low overall fusion efficiency. Although some supervised methods have achieved better accuracy, their training processes are complex and their generalization is poor. The method provided by the present invention fuses superpixels with point cloud data; on the basis of the extracted features, road regions are obtained through the machine's own learning; and the features are fused within a Bayesian framework to obtain road information and the final travelable region. The method requires neither strong prior assumptions nor a complex training process. It offers excellent generalization, high robustness, fast speed and high precision, and can be more easily popularized in practical applications.
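The abstract mentions fusing the vision-derived and lidar-derived road cues per superpixel within a Bayesian framework, but the excerpt does not spell out the fusion rule. The sketch below only illustrates one common form of such a fusion: treating the two cues as conditionally independent likelihoods and combining them with a road prior via Bayes' rule to obtain a per-superpixel road probability.

```python
# Hypothetical Bayesian fusion of per-superpixel road cues (illustrative only;
# the patent's exact fusion rule is not given in this excerpt).
import numpy as np

def fuse_road_probability(p_vision, p_lidar, prior_road=0.5):
    """Combine vision- and lidar-based road likelihoods per superpixel.

    p_vision, p_lidar : arrays of road-likelihood proxies in [0, 1], one per superpixel.
    Assumes the two cues are conditionally independent given the road/non-road label.
    Returns the posterior P(road | cues) per superpixel.
    """
    p_vision = np.clip(p_vision, 1e-6, 1 - 1e-6)
    p_lidar = np.clip(p_lidar, 1e-6, 1 - 1e-6)
    road = prior_road * p_vision * p_lidar
    non_road = (1 - prior_road) * (1 - p_vision) * (1 - p_lidar)
    return road / (road + non_road)

# Example: a superpixel that both sensors find road-like.
posterior = fuse_road_probability(np.array([0.9]), np.array([0.8]))
print(posterior)  # ~0.973
```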

Description

Technical field

[0001] The invention belongs to method research in the field of intelligent transportation, and relates to a road drivable area detection method based on the fusion of monocular vision and laser radar.

Background technique

[0002] In recent years, road detection has been an important part of research in the field of driverless driving. The road detection methods currently in wide use are the monocular vision method, the stereo vision method, the lidar method and fusion-based methods. Among them, the monocular vision method considers only the visual information of the scene and is easily affected by lighting and weather conditions; the stereo vision method consumes a great deal of time in 3D reconstruction and is not suitable for practical applications; and the lidar method suffers from the sparsity of point cloud data. The road detection method based on the fusion of pixel information and depth information not only makes full use of the texture, color and other information ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G01S17/02
CPC: G01S17/86
Inventor: 郑南宁 (Zheng Nanning), 余思雨 (Yu Siyu), 刘子熠 (Liu Ziyi)
Owner: XI AN JIAOTONG UNIV