
Multi-source lane line fusion method and device, vehicle and storage medium

A lane line fusion technology, applied in the computer field, which addresses the problem that detection stability and accuracy cannot be guaranteed, achieving the effect of improving both stability and accuracy.

Pending Publication Date: 2022-02-11
UISEE SHANGHAI AUTOMOTIVE TECH LTD

AI Technical Summary

Problems solved by technology

[0004] In the process of implementing the present invention, the inventors found that the existing technology has the following defect: when only a single visual perception module in the vehicle is used to provide lane lines, the stability and accuracy of detection cannot be guaranteed, and when that module fails temporarily, it poses a safe-driving risk to the self-driving vehicle.



Examples


Embodiment 1

[0030] Figure 1a is a flowchart of a multi-source lane line fusion method provided by Embodiment 1 of the present invention. This embodiment is applicable to the case where a vehicle equipped with multiple visual sensors fuses the detected lane lines corresponding to each visual sensor. The method of this embodiment can be executed by a multi-source lane line fusion device, which can be implemented in software and/or hardware and can generally be integrated in the in-vehicle unit of the vehicle, typically the in-vehicle unit of a vehicle with an automatic driving function.

[0031] Correspondingly, the method specifically includes the following steps:

[0032] S110. Obtain, in real time, the detected lane lines respectively corresponding to the visual sensors in the vehicle, where multiple visual sensors are installed in the vehicle.

[0033] Here, the visual sensor can have the ability to capture thousands of pixels of light from an e...
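
Since the text is truncated at this point, the following is a minimal sketch of one way the per-sensor detected lane lines from S110 could be represented and polled. The cubic-polynomial representation in the vehicle frame, and every name below (DetectedLaneLine, latest, poll_detected_lane_lines), are assumptions made for illustration, not the patent's actual data structures.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class DetectedLaneLine:
    sensor_id: str        # which visual sensor produced this line
    timestamp: float      # detection time in seconds
    coeffs: np.ndarray    # shape (4,): [c0, c1, c2, c3] of y = c0 + c1*x + c2*x^2 + c3*x^3
    confidence: float     # detector confidence in [0, 1]


def poll_detected_lane_lines(sensors):
    """Collect the latest detected lane line from every visual sensor.

    Each sensor is assumed to expose a non-blocking latest() method
    returning a DetectedLaneLine or None; this is purely illustrative
    of the real-time acquisition in step S110.
    """
    lines = []
    for sensor in sensors:
        line = sensor.latest()
        if line is not None:
            lines.append(line)
    return lines
```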

Embodiment 2

[0058] Figure 2a is a flowchart of another multi-source lane line fusion method provided by Embodiment 2 of the present invention. This embodiment is optimized on the basis of the above-mentioned embodiments. In this embodiment, Kalman filtering is used to calculate, at the current detection time, the predicted lane line of the fused lane line from the historical detection time, and the prediction is updated to obtain the fused lane line at the current detection time.

[0059] Correspondingly, the method specifically includes the following steps:

[0060] S210. Obtain, in real time, the detected lane lines corresponding to the respective visual sensors in the vehicle, where multiple visual sensors are installed in the vehicle.

[0061] S220. Judge whether the detected lane line corresponding to the target visual sensor has been updated: if yes, execute S230; otherwise, return to S210.

[0062] S230. According to the vehicle state information at the current detect...
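
The steps shown above gesture at a standard Kalman predict/update cycle. Below is a hedged sketch of how S220-S230 could look if the fused lane line state is taken to be the four coefficients of a cubic polynomial; the matrices F (motion compensation derived from vehicle state), Q and R are placeholders, and nothing here is taken verbatim from the patent.

```python
import numpy as np


class LaneLineKalmanFilter:
    def __init__(self, initial_coeffs: np.ndarray):
        self.x = initial_coeffs.astype(float)   # fused lane line state [c0..c3]
        self.P = np.eye(4) * 1.0                # state covariance
        self.Q = np.eye(4) * 0.01               # process noise (assumed value)
        self.R = np.eye(4) * 0.1                # measurement noise (assumed value)

    def predict(self, F: np.ndarray):
        """Propagate the fused lane line from the historical detection
        time to the current detection time.

        F would be built from vehicle state (speed, yaw rate, elapsed
        time); constructing it is outside the scope of this sketch.
        """
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q
        return self.x                            # the predicted lane line

    def update(self, z: np.ndarray):
        """Fuse the newly detected lane line (measurement z) into the state."""
        H = np.eye(4)                            # direct observation of the coefficients
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(4) - K @ H) @ self.P
        return self.x                            # fused lane line at the current time
```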

Embodiment 3

[0133] Figure 3a is a flowchart of another multi-source lane line fusion method provided by Embodiment 3 of the present invention. This embodiment is optimized on the basis of the above-mentioned embodiments. In this embodiment, after the fused lane line at the current detection time is obtained by updating according to the detected lane line of the target visual sensor at the current detection time and the predicted lane line, the method further includes: screening, according to the fused lane line at the current detection moment, a target lane line to serve as a driving decision reference during the automatic driving of the vehicle.

[0134] Correspondingly, the method specifically includes the following steps:

[0135] S310. Obtain, in real time, the detected lane lines corresponding to the respective visual sensors in the vehicle, where multiple visual sensors are installed in the vehicle.

[0136] S320. If it is detected that the detected lane line corresponding t...
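
Embodiment 3 adds a screening step that selects a target lane line from the fused lane lines as the driving-decision reference. The sketch below illustrates one plausible screening rule (pick the nearest left and right fused lines bounding the ego vehicle); the rule itself is an assumption, since the text is truncated before the patent specifies it.

```python
def screen_target_lane_lines(fused_lines):
    """Return the (left, right) fused lane lines closest to the ego vehicle.

    Each fused line is assumed to expose `coeffs`, whose first entry c0
    is the lateral offset at x = 0 in the vehicle frame (left positive).
    """
    left_candidates = [l for l in fused_lines if l.coeffs[0] > 0]
    right_candidates = [l for l in fused_lines if l.coeffs[0] <= 0]
    # Nearest left line has the smallest positive offset; nearest right
    # line has the largest (least negative) offset.
    left = min(left_candidates, key=lambda l: l.coeffs[0], default=None)
    right = max(right_candidates, key=lambda l: l.coeffs[0], default=None)
    return left, right
```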


Abstract

The embodiments of the invention disclose a multi-source lane line fusion method and device, an unmanned vehicle and a storage medium. The method comprises: acquiring, in real time, the detected lane lines corresponding to the visual sensors in a vehicle, wherein multiple visual sensors are installed in the vehicle; if it is detected that the detected lane line corresponding to a target visual sensor has been updated, calculating a predicted lane line, at the current detection moment, of the fused lane line from a historical detection moment; and updating, according to the detected lane line of the target visual sensor at the current detection moment and the predicted lane line, to obtain the fused lane line at the current detection moment. The technical scheme of the embodiments provides a new mode of sequentially fusing the non-homologous lane lines corresponding respectively to the plurality of visual sensors, improving the stability and precision of lane line detection.
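
Tying the abstract's three steps together, the sketch below shows what the sequential fusion loop over non-homologous sensors could look like. It reuses the hypothetical helpers from the earlier sketches, and make_motion_matrix (deriving the transition matrix from vehicle state) is likewise an assumed helper, not part of the patent.

```python
def run_sequential_fusion(sensors, kf, get_vehicle_state, make_motion_matrix):
    """Sequentially fuse lane lines from multiple (non-homologous) sensors.

    Whenever any one sensor reports an updated detected lane line, the
    shared fused state is predicted forward to the current detection
    moment and then updated with that measurement.
    """
    last_seen = {}  # sensor_id -> timestamp of the last fused detection
    while True:
        for line in poll_detected_lane_lines(sensors):
            if last_seen.get(line.sensor_id) == line.timestamp:
                continue  # this sensor has no new detection; keep polling
            last_seen[line.sensor_id] = line.timestamp
            F = make_motion_matrix(get_vehicle_state(), line.timestamp)
            kf.predict(F)                    # predicted lane line at current time
            fused = kf.update(line.coeffs)   # fused lane line at current time
            yield fused
```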

Description

Technical Field

[0001] Embodiments of the present invention relate to computer technology, specifically to unmanned driving and artificial intelligence technology, and in particular to a multi-source lane line fusion method, device, vehicle and storage medium.

Background Technique

[0002] With the advancement of science and technology, the installed volume of L2-level assisted driving systems, as defined by SAE (the Society of Automotive Engineers), has grown rapidly in recent years. Most such systems include the two functions of lane center keeping and full-speed adaptive cruise, i.e., lateral and longitudinal control.

[0003] The lane keeping system relies mainly on lane lines for control. As system reliability requirements grow higher, systems with redundant architectures will become more and more popular. In a single-camera system, the camera is used directly to sense the lane lines for lateral control. At present, most lateral control on t...


Application Information

IPC(8): G06V20/56; G06V10/80
CPC: G06F18/25
Inventors: 王子涵, 李国政
Owner: UISEE SHANGHAI AUTOMOTIVE TECH LTD