
Multi-sensor fusion vehicle-road collaborative sensing method for automatic driving

A multi-sensor fusion technology for automatic driving, applied in the field of artificial intelligence, addressing the problems of real-time communication delay in vehicle-road interconnection, low positioning accuracy, and low perception accuracy.

Pending Publication Date: 2022-07-29
CHINA UNIV OF GEOSCIENCES (BEIJING)
Cites: 0 · Cited by: 4
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0012] The purpose of the embodiments of the present invention is to provide a multi-sensor fusion vehicle-road collaborative sensing method for automatic driving, which solves the prior-art problems of low environmental perception accuracy on urban roads, real-time communication delay in vehicle-road interconnection, and low positioning accuracy.




Embodiment Construction

[0035] In order to make the above objects, features and advantages of the present invention more clearly understood, the present invention will be described in further detail below with reference to the accompanying drawings.

[0036] The invention provides a multi-sensor fusion vehicle-road collaborative sensing method for automatic driving, which solves the prior-art problems of low environmental perception accuracy, real-time communication delay in vehicle-road interconnection, and low positioning accuracy on urban roads. As shown in Figure 1, the method comprises a data enhancement module (1), a point cloud perception module (2), an image perception module (3), a multi-sensor fusion module (4), a V2X real-time communication module (5), a selective compensation module (6), and a positioning module based on the fusion of SLAM and GPS/INS (7).
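As a reading aid, the data flow through the seven numbered modules can be sketched as below. The patent publishes no API, so every function name, data shape, and value here is an illustrative assumption, not the actual implementation.

```python
# Hypothetical wiring of modules (1)-(7); all names and values are
# illustrative stubs, not the patent's implementation.

def data_enhancement(samples):
    # module (1): e.g. augment a public data set (here: string reversal as a stand-in)
    return samples + [s[::-1] for s in samples]

def point_cloud_perception(lidar_frame):
    # module (2): 3-D detections as (x, y, z, label) tuples (stub)
    return [(10.0, 2.0, 0.5, "car")]

def image_perception(camera_frame):
    # module (3): 2-D detections as (u, v, label) tuples (stub)
    return [(320, 240, "car")]

def multi_sensor_fusion(dets_3d, dets_2d):
    # module (4): naive label-level fusion — keep 3-D detections whose
    # class also appears among the image detections
    image_labels = {d[-1] for d in dets_2d}
    return [d for d in dets_3d if d[-1] in image_labels]

def v2x_exchange(local_dets):
    # module (5): pretend a roadside unit contributes an occluded pedestrian
    return local_dets + [(15.0, -1.0, 0.0, "pedestrian")]

def selective_compensation(local_dets, shared_dets):
    # module (6): add only objects the ego vehicle did not already see
    seen = set(local_dets)
    return local_dets + [d for d in shared_dets if d not in seen]

def localize():
    # module (7): fused SLAM + GPS/INS pose, here a fixed (x, y, yaw) stub
    return (100.0, 50.0, 0.1)

def collaborative_perception_step(samples, lidar_frame, camera_frame):
    _ = data_enhancement(samples)  # training-time step in the described method
    dets_3d = point_cloud_perception(lidar_frame)
    dets_2d = image_perception(camera_frame)
    fused = multi_sensor_fusion(dets_3d, dets_2d)
    shared = v2x_exchange(fused)
    final = selective_compensation(fused, shared)
    return final, localize()
```

The sketch only shows the ordering the description implies: on-board fusion first, then V2X sharing and selective compensation of occluded objects, with localization running alongside.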

[0037] Figure 1 is an overall conceptual diagram of the multi-sensor fusion vehicle-road collaborative sensing method for automa...


PUM

No PUM

Abstract

The invention discloses a multi-sensor fusion vehicle-road collaborative sensing method for automatic driving. The method comprises a data enhancement module, a point cloud sensing module, an image sensing module, a multi-sensor fusion module, a V2X real-time communication module, a selective compensation module, and a positioning module based on SLAM and GPS/INS fusion. First, a public data set is processed by the data enhancement module. The three-dimensional information obtained by the point cloud sensing module and the two-dimensional information obtained by the image sensing module are then fused by the multi-sensor fusion module. The position of the vehicle is obtained through the positioning module based on SLAM and GPS/INS fusion, helping the automatic driving vehicle make accurate judgments in complex environments. Meanwhile, perception information is shared with vehicles and roadside units in the surrounding environment through the V2X real-time communication module, information missing due to occlusion is effectively compensated by the selective compensation module, and real-time communication efficiency is improved. The method has high accuracy and reliability, and can effectively solve the problems of information loss and occlusion on complex roads.
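The abstract's fusion of 3-D point-cloud detections with 2-D image detections can be illustrated with a generic late-fusion step: project each LiDAR detection into the image with a pinhole camera model and match it to the nearest same-class 2-D detection. The intrinsics, distance threshold, and coordinate convention below are assumptions for illustration; the patent does not disclose its fusion formula.

```python
# Generic 3-D/2-D late fusion by projection and nearest-neighbor matching.
# FX, FY, CX, CY and max_px are assumed values, not from the patent.
import math

FX, FY, CX, CY = 700.0, 700.0, 640.0, 360.0  # assumed camera intrinsics

def project(point_3d):
    """Pinhole projection of a camera-frame point (x right, y down, z forward)."""
    x, y, z = point_3d
    return (FX * x / z + CX, FY * y / z + CY)

def late_fuse(dets_3d, dets_2d, max_px=50.0):
    """Keep 3-D detections confirmed by a nearby same-class 2-D detection."""
    fused = []
    for point_3d, cls_3d in dets_3d:
        u, v = project(point_3d)
        best_dist = None
        for (u2, v2), cls_2d in dets_2d:
            if cls_2d != cls_3d:
                continue  # only match detections of the same class
            dist = math.hypot(u - u2, v - v2)
            if dist <= max_px and (best_dist is None or dist < best_dist):
                best_dist = dist
        if best_dist is not None:
            fused.append((point_3d, cls_3d))
    return fused
```

For example, a 3-D car at (2, 0, 10) m projects to pixel (780, 360) under the assumed intrinsics, so a 2-D car detection near that pixel confirms it, while a 3-D detection with no 2-D counterpart is dropped.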

Description

Technical field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a vehicle-road collaborative perception method based on multimodal fusion of point clouds and images.

Background technique

[0002] With the continuous development of deep learning theory in recent years, many research fields and industries have undergone a revolution in artificial intelligence. Autonomous driving technology in particular has been deeply influenced by advances in deep learning and computer vision; its theory has become increasingly mature and is moving toward industrialization. The core of an autonomous driving system can be summarized as three parts: perception, planning, and control. It is thus a layered structure in which the perception, planning, and control modules play different roles and influence one another. In the entire automatic driving system, the perception part is the forerunner of the whole system, equivalent to the eyes of the a...
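The layered perception → planning → control structure described in the background can be sketched as a minimal closed loop. The interfaces and the 5 m braking rule are illustrative assumptions, not part of the patent.

```python
# Minimal sketch of the layered perception/planning/control pipeline;
# all interfaces and thresholds are illustrative assumptions.

def perceive(world_state):
    # perception layer: the "eyes" of the system (stub sensor reading)
    return {"obstacle_ahead_m": world_state["obstacle_m"]}

def plan(perception):
    # planning layer: keep a 5 m safety gap (assumed rule)
    return "brake" if perception["obstacle_ahead_m"] < 5.0 else "cruise"

def control(action):
    # control layer: map the planned action to an actuator command
    return {"brake": -1.0, "cruise": 0.3}[action]

def drive_step(world_state):
    return control(plan(perceive(world_state)))
```

The point of the sketch is the dependency direction: planning consumes only what perception produces, and control consumes only what planning produces, which is why errors in the perception layer propagate through the whole stack.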

Claims


Application Information

Patent Timeline
No application timeline data
Patent Type & Authority: Application (China)
IPC(8): G06V20/56; G06V10/26; G06V10/80; G06V10/764; G06V10/82; G06K9/62; G06N3/04; G06N3/08; G06T7/10; G06T7/70; G01C21/16; G01S17/931; G01S19/49
CPC: G06N3/08; G06T7/10; G06T7/70; G01S19/49; G01C21/165; G01S17/931; G06T2207/10028; G06T2207/20081; G06T2207/20084; G06T2207/30252; G06N3/045; G06F18/241; G06F18/25
Inventor: 王涛, 李梅, 郭林燕
Owner CHINA UNIV OF GEOSCIENCES (BEIJING)