
Vehicle-road collaborative awareness and data fusion method, medium and automatic driving system

A vehicle-road cooperative sensing and data fusion technology, applied in the field of road network navigators, navigation and instruments. It addresses the problems that cooperative perception and data fusion applications at the vehicle and road ends are difficult to bring to an industrial level, that communication delay between the road end and the vehicle end remains an issue, and that the data fusion algorithms at both ends are not mature enough to meet the needs of industrial applications. The effects achieved include resistance to bad weather and road conditions, a stable application effect, and fast sensing speed.

Pending Publication Date: 2022-04-29
上海智能网联汽车技术中心有限公司

AI Technical Summary

Problems solved by technology

[0003] Existing technology struggles to bring vehicle-road collaborative sensing and data fusion to an industrial application level: communication delay between the road end and the vehicle end remains a problem, and the data fusion algorithms at both ends are not mature enough.
When an autonomous vehicle is driving, the surrounding environment changes rapidly, so the requirements on the accuracy and timeliness of environmental information are very high. If data processing at the vehicle and road ends is not fast enough, it affects the decision-making of the automatic driving system and increases the risk of autonomous driving.
[0004] At present, existing vehicle-road collaborative perception and fusion technology struggles to meet the needs of industrial applications, and further research and development is necessary.


Figure: Vehicle-road collaborative awareness and data fusion method, medium and automatic driving system

Examples


Embodiment 1

[0032] This embodiment provides a vehicle-road cooperative sensing and data fusion method, which is applied to a vehicle-mounted processor and includes the following steps:

[0033] 1) Obtain data from multiple sensors and perform data fusion through a neural network algorithm to obtain the first summary data, which serves as the vehicle-side summary data. The sensors include a vehicle-side millimeter-wave radar, a vehicle-side lidar and a vehicle-side camera, and the neural network algorithm is a fusion algorithm built on a deep learning model (see the sketch after step 3);

[0034] 2) Obtain the real-time information collected by the roadside unit (RSU), including position, speed and heading angle;

[0035] 3) Fuse the first summary data with the real-time information through the Kalman-filter-based neural network algorithm to obtain the second summary data, which can be used as the basic data for decision-making and execution by the automatic driving system.
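
As a companion to the steps above, the following is a minimal Python sketch, not the patented implementation: the deep-learning sensor fusion is reduced to a placeholder that averages per-sensor estimates, and the fusion with the RSU information is a single standard Kalman measurement update. The function names, the state layout [x, y, speed, heading] and all covariance values are illustrative assumptions.

```python
"""Hedged sketch of Embodiment 1, not the patented implementation: a placeholder
stands in for the deep-learning fusion, and a standard Kalman measurement update
fuses the result with the RSU's position/speed/heading information."""
import numpy as np


def fuse_onboard_sensors(radar_est, lidar_est, camera_est):
    """Placeholder for the deep-learning fusion of millimeter-wave radar, lidar
    and camera data: here it simply averages per-sensor estimates of
    [x, y, speed, heading]. The patent's actual fusion is a trained neural
    network; the covariance below is an assumed estimate uncertainty."""
    first_summary = np.stack([radar_est, lidar_est, camera_est]).mean(axis=0)
    P = np.diag([1.0, 1.0, 0.5, 0.05])       # vehicle-side estimate covariance
    return first_summary, P


def kalman_fuse_with_rsu(first_summary, P, rsu_measurement, R=None):
    """One Kalman measurement update fusing the vehicle-side summary with the
    RSU's [x, y, speed, heading] measurement into the second summary data."""
    if R is None:
        R = np.diag([0.5, 0.5, 0.2, 0.02])    # assumed RSU measurement noise
    H = np.eye(4)                             # the RSU observes the full state
    innovation = rsu_measurement - H @ first_summary
    innovation[3] = (innovation[3] + np.pi) % (2 * np.pi) - np.pi  # wrap heading residual
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    second_summary = first_summary + K @ innovation
    P_updated = (np.eye(4) - K @ H) @ P
    return second_summary, P_updated


if __name__ == "__main__":
    # per-sensor estimates of one tracked object: [x (m), y (m), speed (m/s), heading (rad)]
    radar = np.array([10.2, 4.9, 8.1, 0.31])
    lidar = np.array([10.0, 5.1, 8.3, 0.29])
    camera = np.array([10.4, 5.0, 7.9, 0.30])

    first, P = fuse_onboard_sensors(radar, lidar, camera)
    rsu = np.array([10.1, 5.2, 8.0, 0.28])    # RSU real-time information
    second, _ = kalman_fuse_with_rsu(first, P, rsu)
    print("first summary data :", first)
    print("second summary data:", second)
```

The same update can be run per tracked object in each sensing cycle; the updated covariance indicates how much the RSU measurement tightened the vehicle-side estimate.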

[0036] In a specific implementation manner, the deep learning...

Embodiment 2

[0040] This embodiment provides an automatic driving system, including a roadside unit, a vehicle-mounted unit and a vehicle-mounted processor connected in sequence. The vehicle-mounted unit and the roadside unit are connected through 5G wireless communication. The vehicle-mounted processor is connected to various sensors and stores a computer program that, when invoked, performs the following operations:

[0041] Obtain data from a variety of sensors and perform data fusion through a neural network algorithm to obtain the first summary data, where the sensors include a vehicle-side millimeter-wave radar, a vehicle-side lidar and a vehicle-side camera;

[0042] Obtain the real-time information collected by the roadside unit, including position, speed and heading angle;

[0043] Fuse the first summary data with the real-time information to obtain the second summary data;

[0044] Generate an automatic driving strategy based on the second summary data.
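
As a companion outline for this embodiment, the sketch below shows how the vehicle-mounted processor's program might sequence operations [0041] to [0044]. The `sensors`, `obu` and `plan` interfaces, the `RsuMessage` layout and the fusion callables are hypothetical stand-ins; the embodiment itself only specifies that the vehicle-mounted unit and the roadside unit are linked over 5G and that the processor runs a program performing these four operations.

```python
"""Hedged outline of one perception-fusion-planning cycle on the vehicle-mounted
processor (Embodiment 2). All interfaces are illustrative assumptions."""
from dataclasses import dataclass

import numpy as np


@dataclass
class RsuMessage:
    # real-time information collected by the roadside unit (assumed local frame)
    x: float        # position, metres
    y: float
    speed: float    # metres per second
    heading: float  # radians


def run_cycle(sensors, obu, fuse_sensors, fuse_with_rsu, plan):
    """One cycle mirroring operations [0041]-[0044]; the callables are injected."""
    radar, lidar, camera = sensors.read()        # [0041] read multi-sensor data
    first_summary, P = fuse_sensors(radar, lidar, camera)

    rsu: RsuMessage = obu.receive()              # [0042] RSU info arrives via the 5G-linked OBU
    measurement = np.array([rsu.x, rsu.y, rsu.speed, rsu.heading])
    second_summary, _ = fuse_with_rsu(first_summary, P, measurement)  # [0043]

    return plan(second_summary)                  # [0044] generate the driving strategy
```

Here `fuse_sensors` and `fuse_with_rsu` could be the functions from the Embodiment 1 sketch, and `plan` stands in for whatever decision module consumes the second summary data.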


Abstract

The invention relates to a vehicle-infrastructure collaborative awareness and data fusion method, a medium and an automatic driving system. The method is applied to a vehicle-mounted processor and comprises the following steps: obtaining data from various sensors and carrying out data fusion through a neural network algorithm to obtain first summarized data; acquiring real-time information collected by the roadside unit; and fusing the first summarized data with the real-time information to obtain second summarized data. Compared with the prior art, the method can sense intersection traffic images and millimeter-wave radar point cloud data at the same time, enables the vehicle end and the road end to rapidly sense surrounding environment information in real-time cooperation, and has the advantages of improving the reliability of data collection, improving the recognition effect of subsequent applications, and being capable of resisting bad weather and road conditions.

Description

Technical field

[0001] The invention belongs to the technical field of automatic driving, and in particular relates to a vehicle-road cooperative perception and data fusion method, a medium and an automatic driving system.

Background technique

[0002] Existing automatic driving typically detects the surrounding environment at the vehicle end, the road end or both, and then provides the data required for automatic driving. Perception devices usually use one or more of visual perception (camera), lidar perception or millimeter-wave radar perception technologies. The information obtained by the multiple sensors at both ends of the vehicle and road therefore needs to be fused and processed before it can be handed over to the computer system to make decisions.

[0003] It is difficult for the existing technology to bring the application effect of collaborative sensing and data fusion at both ends of the vehicle and road to the indus...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G01C21/00; G01C21/34
CPC: G01C21/005; G01C21/3407
Inventor: 张宇超, 秦超, 林新雨, 王凤军
Owner: 上海智能网联汽车技术中心有限公司