
An unmanned driving path selection method based on multi-sensor cooperation

A technology for unmanned driving with multi-sensor cooperation, applied in the fields of neural learning methods, instruments, biological neural network models, etc. It addresses the problems of low road-information recognition accuracy and difficult resource allocation caused by excessive multi-sensor computation, with the effect of improving recognition accuracy and achieving high-precision path planning.

Pending Publication Date: 2019-06-14
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0005] The present invention provides an unmanned driving path selection method based on multi-sensor cooperation. The invention effectively overcomes the low recognition accuracy of traditional road-information methods and reduces the resource-allocation difficulties caused by excessive multi-sensor computation, so as to achieve high-precision road recognition; see the following description for details:



Examples


Embodiment 1

[0036] A method for unmanned driving path selection based on multi-sensor cooperation, see Figures 1 and 2; the method includes the following steps:

[0037] 101: Arrange multiple sensors around the vehicle body to achieve full information coverage and obtain all basic information around the vehicle body;

[0038] Existing unmanned driving systems generally use perception sensors and high-sensitivity cameras. The embodiment of the present invention uses multiple cooperating sensors together with high-sensitivity cameras, and adjusts or replaces the sensor distribution according to how important each position of the vehicle body is for driving recognition. The specific sensor layout is as follows: the sensors must guarantee complete 360-degree coverage, and the detection distances to the front, rear, left, and right are set according to the importance of each part. Usually, the detection distance of the front is g...
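The layout rule above can be expressed as a small configuration plus a coverage check. This is a minimal sketch only: the detection distances are placeholders (the paragraph is truncated before giving the actual front range), and the four wide-angle sectors are an assumed example, not values from the patent.

SENSOR_RANGES_M = {           # placeholder detection distances per direction
    "front": 150.0,           # assumed longest range: most important for recognition
    "rear":   80.0,
    "left":   60.0,
    "right":  60.0,
}

def covers_360_degrees(sectors):
    """Check that the arranged sensors together span the full 360-degree field."""
    # Each sector is a (start_deg, end_deg) interval covered by one sensor.
    covered = [False] * 360
    for start, end in sectors:
        for deg in range(int(start), int(end)):
            covered[deg % 360] = True
    return all(covered)

# Example: four wide-angle sensors, one per side, overlapping at the corners.
print(covers_360_degrees([(315, 405), (45, 135), (135, 225), (225, 315)]))  # True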

Embodiment 2

[0050] With reference to Figures 1 to 5, a specific example further describes the scheme of Embodiment 1; see the following description for details:

[0051] 201: Specific arrangement of the sensors;

[0052] In implementation, coverage and redundancy usually need to be balanced. As shown in Figure 3, laser radars are arranged at the front, the rear, the roof, and on both sides of the headlights of the car; millimeter-wave radar systems are arranged near the front and rear headlights and the fuel tank cap; and a high-sensitivity high-speed camera that captures 360-degree visual information in real time, together with sound collection aids, is installed on the roof of the car. Considering the inaccuracy of any single sensor, each part must be covered by more than two kinds of sensors.
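A minimal sketch of that redundancy rule, interpreted as requiring at least two sensor types per zone. The zone-to-sensor mapping below is an illustrative assumption based on the arrangement described for Figure 3, not data taken from the patent.

ZONE_SENSORS = {                                  # assumed mapping derived from Figure 3
    "front":      {"lidar", "mmwave_radar", "camera"},
    "rear":       {"lidar", "mmwave_radar", "camera"},
    "left_side":  {"lidar", "camera"},
    "right_side": {"lidar", "camera"},
    "roof":       {"lidar", "camera", "microphone"},
}

def under_covered_zones(zone_sensors, minimum_types=2):
    """Return zones whose coverage falls below the required number of sensor types."""
    return [zone for zone, types in zone_sensors.items() if len(types) < minimum_types]

assert under_covered_zones(ZONE_SENSORS) == []    # every zone has at least two sensor types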

[0053] 202: Classify and partition the road information around the car body obtained by the sensors, the distances to the targets around the car body, and the driving of the car itself...
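Since the step above is truncated, the following is only an illustrative sketch of how detected targets might be partitioned by distance; the zone names and boundary values are assumptions, not values from the patent.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "vehicle", "pedestrian", "lane_marking"
    distance_m: float   # range to the target reported by the sensors

def classify_by_distance(detections, near=10.0, mid=40.0):
    """Partition surrounding targets into distance zones for later processing."""
    zones = {"near": [], "mid": [], "far": []}
    for d in detections:
        if d.distance_m <= near:
            zones["near"].append(d)
        elif d.distance_m <= mid:
            zones["mid"].append(d)
        else:
            zones["far"].append(d)
    return zones

# Example with made-up readings:
zones = classify_by_distance([Detection("vehicle", 8.2), Detection("pedestrian", 25.0)])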

Embodiment 3

[0068] The feasibility of the schemes in Embodiments 1 and 2 is verified below with a specific example; see the following description for details:

[0069] This experiment uses an FCN (fully convolutional network) model and compares it with a CNN (convolutional neural network) model on urban interior scenes, urban roads, field roads, and rural roads. The experimental results show that this method has certain advantages: not only is the recognition accuracy improved, but the processing time is also improved to a certain extent.
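The patent does not disclose the exact network configuration, so the following is only a minimal illustrative FCN for road segmentation in PyTorch, with assumed layer sizes and encoder depth.

import torch
import torch.nn as nn

class SimpleFCN(nn.Module):
    """Minimal fully convolutional network for road / non-road segmentation."""
    def __init__(self, num_classes=2):
        super().__init__()
        # Encoding stage: strided convolutions perform the nonlinear mapping.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Prediction channel: a 1x1 convolution yields per-pixel class scores.
        self.classifier = nn.Conv2d(128, num_classes, kernel_size=1)

    def forward(self, x):
        h, w = x.shape[-2:]
        scores = self.classifier(self.encoder(x))
        # Decoding stage: upsample the coarse score map back to the input size.
        return nn.functional.interpolate(scores, size=(h, w),
                                         mode="bilinear", align_corners=False)

# A camera frame produces a per-pixel road map (rough segmentation).
logits = SimpleFCN()(torch.randn(1, 3, 256, 512))   # shape (1, 2, 256, 512)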

[0070] At the same time, this method uses the Adam algorithm to optimize the learning rate: the entire network is updated simultaneously while the learning rate is progressively increased during training, the loss curve is plotted, and the optimal learning rate is obtained. In the back-propagation parameter-update phase, the Adam algorithm is used with β1 and β2 set to 0.9 and 0.999, respectively. Under this algo...
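A sketch of the learning-rate search and Adam settings described above, assuming a PyTorch model, data loader, and loss function supplied by the caller; these placeholders and the default rate range are not from the patent.

from itertools import cycle
import torch

def lr_range_test(model, loader, criterion, lr_min=1e-5, lr_max=1e-1, steps=100):
    """Briefly train while raising the learning rate and record the loss curve."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr_min,
                                 betas=(0.9, 0.999))   # beta1 = 0.9, beta2 = 0.999
    gamma = (lr_max / lr_min) ** (1.0 / steps)         # multiplicative LR increase per step
    history = []
    batches = cycle(loader)
    for _ in range(steps):
        images, labels = next(batches)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()                               # the whole network is updated at once
        history.append((optimizer.param_groups[0]["lr"], loss.item()))
        for group in optimizer.param_groups:           # raise the learning rate
            group["lr"] *= gamma
    # The optimal learning rate is read off the loss-vs-learning-rate curve.
    return history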



Abstract

The invention discloses an unmanned driving path selection method based on multi-sensor cooperation, which comprises the following steps: arranging multiple cooperating sensors and a high-sensitivity camera around the vehicle body, adjusting and replacing the sensor distribution according to the importance of different positions of the vehicle body in driving recognition, and obtaining all basic information around the vehicle body; analyzing the coverage and redundancy of the sensors to obtain the driving condition of the automobile, and taking the images captured by the high-sensitivity camera as the input of a fully convolutional neural network; performing nonlinear mapping on the data in the encoding stage with the fully convolutional network; in the decoding stage, applying nonlinear excitation, achieving rough segmentation of the image through an image prediction channel, using the Adam algorithm to optimize the hyper-parameters, inferring the road, and finally obtaining a road segmentation model; and, on the basis of the segmented roads, improving the recognition precision in combination with the comprehensive perception of the driving route segmentation model in the convolutional neural network, thereby realizing high-precision route planning.

Description

Technical field

[0001] The invention relates to the technical field of road segmentation and target recognition, in particular to an unmanned driving path selection method based on multi-sensor cooperation.

Background technique

[0002] A single sensor cannot meet the needs of unmanned driving. GPS is relatively accurate, but its update frequency is low and cannot meet the requirements of real-time road recognition; the error of an inertial sensor grows with running time. It is therefore necessary to find a way to integrate the advantages of multiple sensors and to reasonably process and fuse the collected data so as to obtain more real-time and accurate positioning.

[0003] In image segmentation, convolutional neural networks are usually used for road segmentation. Although they provide a good outline of an object, some details are lost, so it is difficult to perform accurate segmentation with data from the various sensors, and it is urgently ne...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/34; G06N3/04; G06N3/08; G06Q10/04; G06Q50/30
Inventors: 王建荣, 万里, 于健, 徐天一, 高洁, 喻梅, 于瑞国, 宛奥深
Owner: TIANJIN UNIV