Multi-sensor-fusion-based unstructured environment understanding method

A multi-sensor-fusion technology, applied in the field of unstructured environment understanding, addressing problems such as large variation in road width (in places allowing only one car to pass) and uneven road surfaces, and achieving a stable fusion result.

Inactive Publication Date: 2012-07-11
NANJING UNIV OF SCI & TECH
Cites: 0 · Cited by: 28

AI Technical Summary

Problems solved by technology

[0007] Rural roads have the following characteristics: (1) road surface coverings vary and may be soil, gravel, asphalt, or cement; (2) road width changes greatly, and in places the road may allow only one car to pass; (3) road boundaries are varied, most being formed where the artificially paved surface meets natural scenery (plants), so they cannot be modeled as simple straight lines and change with the seasons; (4) the road surface is rough and may be uneven, which limits the vehicle's driving speed; (5) various random static or dynamic obstacles appear on the road surface.
Therefore, existing multi-sensor-fusion-based environment understanding methods cannot solve the problem of understanding unstructured rural road environments.

Method used




Embodiment Construction

[0024] With reference to figure 1, a method for understanding unstructured environments based on multi-sensor fusion includes the following steps:

[0025] Step 1. Install two or more visual sensors on top of the vehicle to obtain visual image information of the road ahead. Install a 3D lidar sensor on top of the vehicle, and install two single-line radars at the front of the vehicle to obtain lidar information around the vehicle body. Because the 3D lidar is mounted on top of the vehicle, there is a blind zone around the vehicle, especially directly in front of it; the two single-line radars mounted at the front eliminate this blind zone and provide obstacle information around the vehicle body.
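To be usable alongside the top-mounted 3D lidar, each front-mounted single-line radar scan must be expressed in the vehicle coordinate frame. A minimal sketch of that projection follows; the mounting offsets, yaw angles, and scan values are illustrative assumptions, not values from the patent:

```python
import numpy as np

def to_vehicle_frame(ranges, angles, mount_xy, mount_yaw):
    """Project a single-line radar scan (polar, sensor frame) into the
    vehicle coordinate frame, given the sensor's mounting pose.
    mount_xy / mount_yaw are hypothetical calibration values."""
    # polar -> Cartesian in the sensor frame
    xs = ranges * np.cos(angles)
    ys = ranges * np.sin(angles)
    # rotate by the mounting yaw, then translate by the mounting offset
    c, s = np.cos(mount_yaw), np.sin(mount_yaw)
    xv = c * xs - s * ys + mount_xy[0]
    yv = s * xs + c * ys + mount_xy[1]
    return np.stack([xv, yv], axis=1)

# two front-mounted single-line radars covering the 3D lidar's blind zone
front_left  = to_vehicle_frame(np.array([2.0, 2.5]), np.array([0.0, 0.3]),
                               mount_xy=(3.5, 0.4), mount_yaw=0.2)
front_right = to_vehicle_frame(np.array([2.0, 2.5]), np.array([0.0, -0.3]),
                               mount_xy=(3.5, -0.4), mount_yaw=-0.2)
blind_zone_points = np.vstack([front_left, front_right])
```

Once both scans share the vehicle frame, their points can be merged directly with the 3D lidar data in later steps.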

[0026] Step 2. Extract the road edge feature information Rs[i] (where i is the sensor serial number) from the road image obtained by each visual sensor, and stamp the ...
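The per-sensor, time-stamped edge features Rs[i] from Step 2 could be represented as follows. Everything beyond the Rs[i] naming and the time stamp (field names, the placeholder extractor) is a hypothetical illustration, not the patent's method:

```python
import time
from dataclasses import dataclass, field

@dataclass
class EdgeFeature:
    """Road-edge feature Rs[i] from visual sensor i, tagged with its
    acquisition time so features can later be registered and aligned."""
    sensor_id: int    # i, the sensor serial number
    points: list      # edge points extracted from that sensor's image
    stamp: float = field(default_factory=time.time)

def extract_edges(images):
    # placeholder extractor: wraps each sensor's edge points as one Rs[i]
    return [EdgeFeature(sensor_id=i, points=img) for i, img in enumerate(images)]

Rs = extract_edges([[(0, 1), (0, 2)],   # sensor 0
                    [(1, 1), (1, 2)]])  # sensor 1
```

The time stamp is what later allows features from different sensors to be registered to a common moment before fusion.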



Abstract

The invention discloses a multi-sensor-fusion-based unstructured environment understanding method, which comprises the following steps: firstly, registering and aligning the characteristic information of each vision sensor and projecting it into the current vehicle coordinate system; secondly, extracting fused road edges with a confidence-weighting-based road edge characteristic extraction method; thirdly, performing inter-frame data comparison and judgment on the fused road edges to obtain more stable road edges; fourthly, extracting a passable area from the three-dimensional radar data and fusing it with the stable road edges obtained from the vision sensors to obtain optimal road edge information; and finally, performing inter-frame fusion on the road edge results to reduce inter-frame data variation and realize stable and reliable understanding of an unstructured environment. Because a confidence-weighting-based road edge fusion algorithm is adopted, road edge characteristics can be extracted effectively even where a single sensor or a single frame of data would fail.
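The confidence-weighted road edge fusion described above can be sketched as a weighted average of per-sensor edge estimates. The sampling scheme (lateral offsets at common longitudinal distances) and the specific weights are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def fuse_edges(edges, confidences):
    """Confidence-weighted fusion of road-edge estimates.
    edges: one row per sensor, each row the edge's lateral offsets
    sampled at the same longitudinal distances ahead of the vehicle.
    confidences: per-sensor weights (hypothetical weighting scheme)."""
    edges = np.asarray(edges, dtype=float)   # shape (n_sensors, n_samples)
    w = np.asarray(confidences, dtype=float)
    w = w / w.sum()                          # normalise the weights
    return w @ edges                         # weighted-average fused edge

# two sensors disagree on the left edge; the more confident one dominates
fused = fuse_edges([[1.0, 1.1, 1.2],
                    [1.4, 1.5, 1.6]],
                   confidences=[0.8, 0.2])
# fused[0] == 0.8 * 1.0 + 0.2 * 1.4 == 1.08
```

A single low-confidence sensor thus perturbs the fused edge only slightly, which is what makes the result stable when one sensor or one frame is unreliable.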

Description

Technical field

[0001] The invention belongs to the technical field of intelligent information processing, and in particular to an unstructured environment understanding method based on multi-sensor fusion.

Background technique

[0002] An intelligent mobile platform operating in an unknown environment must navigate autonomously. The agent must know what kind of environment it is in, its specific position and orientation, which surrounding areas are safe, and exactly where dangers lie. All of this must rest on effective and reliable environment perception.

[0003] The working environment of the intelligent mobile platform is not a relatively constant indoor environment, but an outdoor environment in which light, scenery, weather, seasons, and geographical location are complex and changeable. Moreover, a single sensor has its own limitations, and it is d...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/28; G06T11/00
Inventors: 唐振民, 陆建峰, 刘家银, 诸葛程晨, 赵春霞, 杨静宇
Owner: NANJING UNIV OF SCI & TECH