
Automatic driving lane information detection method based on radar point cloud and image fusion

An image-fusion and automatic-driving technology, applied in the fields of measuring devices, surveying, mapping and navigation, and road-network navigators. It addresses problems such as discontinuous radar point clouds that easily cause false and missed detections, and the difficulty of meeting the needs of unmanned driving tasks with a single method, achieving good robustness and a high detection success rate.

Pending Publication Date: 2022-02-11
荆州智达电动汽车有限公司 (Jingzhou Zhida Electric Vehicle Co., Ltd.)
Cites: 0 · Cited by: 2

AI Technical Summary

Problems solved by technology

[0003] In a traditional lane detection system, the image-based lane-line detection algorithm depends on good lighting conditions to operate normally; in complex road environments, for example when vehicles occlude the road or lane lines are missing, the accuracy of the algorithm model drops significantly, making it difficult to meet the requirements of an unmanned driving task. A lane-line detection system based on radar point clouds, because the point cloud is discontinuous, is prone to false detections and missed detections; at the same time, a model trained on point-cloud data collected with one radar model is difficult to transfer to other radars. As a result, no single lane-line detection method can fully meet the needs of lane-line detection.




Embodiment Construction

[0019] To make the above purposes, features, and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to Figures 1 to 4 and specific embodiments.

[0020] In this embodiment, the detection method for automatic-driving lane information based on radar point cloud and image fusion is implemented on the system architecture shown in Figure 1 and connected by the data lines shown in Figure 2. The system includes a sensor group composed of a lidar, a camera, and an inertial navigation unit for collecting data; a core computing unit and an embedded computing unit for processing the data; and a 5G router. The sensors within the sensor group, and the sensor group, the core computing unit, and the embedded computing unit, are connected through a HUB. In this implementation, the embedded computing unit is used to preprocess the data. The core computi...
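As a rough illustration of the data flow described above (sensor group → HUB → embedded preprocessing → core computing unit), the following Python sketch models the main components. All class names, fields, and the preprocessing/detection bodies are hypothetical placeholders, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SensorFrame:
    """One time-aligned bundle of raw data from the sensor group (hypothetical layout)."""
    timestamp: float
    lidar_points: list                     # raw point cloud rows: (x, y, z, reflectivity)
    camera_image: bytes                    # raw image buffer from the camera
    ins_pose: Tuple[float, float, float]   # inertial-navigation pose (x, y, yaw)

class EmbeddedUnit:
    """Stands in for the embedded computing unit that preprocesses raw sensor data."""
    def preprocess(self, frame: SensorFrame) -> SensorFrame:
        # e.g. time synchronization, cropping the point cloud to the road region,
        # undistorting the camera image (details omitted in this sketch)
        return frame

class CoreUnit:
    """Stands in for the core computing unit that runs point-cloud / image fusion."""
    def detect_lanes(self, frame: SensorFrame) -> dict:
        # placeholder for the fusion-based lane-detection step
        return {"lane_lines": [], "stop_line": None}

def pipeline(raw: SensorFrame, embedded: EmbeddedUnit, core: CoreUnit) -> dict:
    # Data flow: sensor group -> HUB -> embedded unit -> core computing unit
    return core.detect_lanes(embedded.preprocess(raw))
```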



Abstract

The invention discloses an automatic-driving lane information detection method based on radar point cloud and image fusion, and relates to the technical field of automatic-driving perception. From the lane-line information detected in real time, the method computes more precise driving semantics, including the lane center line, heading angle, and stop line. It also provides exception handling that deals with anomalies such as lane-line recognition errors and lane-line loss, preventing unknown errors while the vehicle is running. By deploying the method, an unmanned vehicle can perceive the road semantics of the surrounding road surface in real time to assist its driving task, with very good robustness and a high detection success rate.
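To make the "center line and heading angle" computation concrete, here is a minimal Python sketch that derives a center line and a heading angle from two detected lane-line polylines and falls back gracefully when a lane line is missing. The function names and the simple point-averaging scheme are illustrative assumptions, not the patented algorithm.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def center_line(left: List[Point], right: List[Point]) -> Optional[List[Point]]:
    """Average corresponding points of the two lane lines; None signals an exception case."""
    if not left or not right or len(left) != len(right):
        return None  # lane line lost or mismatched -> caller handles the anomaly
    return [((lx + rx) / 2.0, (ly + ry) / 2.0)
            for (lx, ly), (rx, ry) in zip(left, right)]

def heading_angle(center: Optional[List[Point]]) -> Optional[float]:
    """Heading angle (radians) of the first center-line segment."""
    if center is None or len(center) < 2:
        return None
    (x0, y0), (x1, y1) = center[0], center[1]
    return math.atan2(y1 - y0, x1 - x0)

# Example: a straight lane, 3.5 m wide, pointing along +x.
left = [(0.0, 1.75), (5.0, 1.75), (10.0, 1.75)]
right = [(0.0, -1.75), (5.0, -1.75), (10.0, -1.75)]
c = center_line(left, right)
print(c)                 # [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
print(heading_angle(c))  # 0.0
```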

Description

Technical Field

[0001] The invention relates to the technical field of automatic-driving perception, and in particular to a detection method for automatic-driving lane information based on radar point cloud and image fusion.

Background Technique

[0002] Traditional lane-line detection is mainly image-based and is divided into classical methods and deep-learning methods; its hardware system depends mainly on a camera and a corresponding computing unit. In addition, in lane-line detection systems based on lidar, the lidar sensor emits and receives laser pulses to form a radar point-cloud image. Early point-cloud-based lane-line detection methods separate lane-line points from non-lane-line points by setting a threshold on point-cloud reflectivity, and rely mainly on the lidar hardware and a corresponding computing unit.

[0003] In the traditional lane line ...
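As an illustration of the early reflectivity-threshold approach mentioned in [0002], the following Python/NumPy sketch separates candidate lane-line points from other points by thresholding the reflectivity channel. The threshold value, array layout, and sample points are assumptions for the example only.

```python
import numpy as np

def split_by_reflectivity(points: np.ndarray, threshold: float = 0.6):
    """points: (N, 4) array of x, y, z, reflectivity in [0, 1].
    Returns (lane_candidates, other_points)."""
    mask = points[:, 3] >= threshold   # lane paint reflects more strongly than asphalt
    return points[mask], points[~mask]

# Example with a few synthetic points.
pts = np.array([
    [1.0,  0.2, 0.0, 0.85],   # bright return -> likely lane marking
    [2.0,  0.1, 0.0, 0.30],   # dull asphalt return
    [3.0, -0.3, 0.0, 0.72],
])
lane_pts, other_pts = split_by_reflectivity(pts)
print(len(lane_pts), len(other_pts))   # 2 1
```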


Application Information

IPC(8): G06V20/56; G06V10/26; G01C21/32; G01C21/36; G01S13/91
CPC: G01C21/32; G01C21/3658; G01S13/91
Inventors: 张昌杰 (Zhang Changjie), 娄玉强 (Lou Yuqiang), 刘凯 (Liu Kai)
Owner: 荆州智达电动汽车有限公司 (Jingzhou Zhida Electric Vehicle Co., Ltd.)