Road area detection method

A road area detection technology, applied in the fields of instruments, character and pattern recognition, computer components, etc., which addresses problems such as limited segmentation accuracy, an unbalanced number of training samples, and the lack of distinction between road-edge and road-center pixels.

Active Publication Date: 2019-06-18
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

Although these DCNN-based methods have better segmentation performance than traditional methods, there are still problems such as rough road edge segmentation, sensitivity to road types, and limited segmentation accuracy.
[0004] Generally speaking, road area detection for intelligent vehicles at this stage has the following problems: 1) it cannot be applied generally to complex urban road types, including urban roads without lane markings, urban single-lane roads, and urban multi-lane roads; 2) the characteristics of pixels in different road areas are not fully considered, and road-edge pixels are not distinguished from road-center pixels, resulting in poor segmentation results at road boundaries; 3) in common public datasets, the number of training samples with different road geometries is unbalanced, so the methods are easily disturbed by road types with differing geometries, resulting in insufficient robustness of road segmentation.




Detailed Description of the Embodiments

[0031] In the drawings, the same or similar reference numerals are used to denote the same or similar elements or elements having the same or similar functions. Embodiments of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0032] In the description of the present invention, the orientations or positional relationships indicated by the terms "central", "longitudinal", "transverse", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. are based on the orientations or positional relationships shown in the drawings. They are used only for convenience in describing the present invention and for simplifying the description, and do not indicate or imply that the device or element referred to must have a particular orientation or must be constructed and operated in a particular orientation; therefore, they should not be construed as limiting the scope of the invention.

[0033] The road area detection metho...



Abstract

The invention discloses a road area detection method comprising the following steps: S1, designing a road data enhancement method and generating a road data enhancement function; S2, applying the road data enhancement function to input training samples and outputting enhanced road data; S3, designing and training a feature encoding network model, which takes the enhanced road data and outputs an encoded feature map; and S4, designing and training a road segmentation decoding module and a road type classification decoding module, which take the encoded feature map as input and output a road segmentation result and a road type classification result. The method provides pixel-level road area segmentation and multi-type road classification results, which can be used to detect passable areas for intelligent vehicles and provide a basis for obstacle avoidance and path planning of intelligent vehicles.
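Steps S1 to S4 describe an encode/decode pipeline with a shared feature encoder and two output heads. The sketch below is a minimal illustration of that structure in PyTorch; the backbone layers, channel widths, input resolution, and number of road type classes are assumptions chosen for illustration, not the architecture actually disclosed in the patent.

```python
# Minimal sketch of the shared-encoder, dual-decoder structure in steps S3-S4.
# Backbone, channel widths, and class counts are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RoadDetectionNet(nn.Module):
    def __init__(self, num_road_types=3):
        super().__init__()
        # S3: feature encoding network (a small convolutional encoder stands in here)
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # S4: road segmentation decoding module -> per-pixel road / non-road scores
        self.seg_head = nn.Conv2d(128, 2, kernel_size=1)
        # S4: road type classification decoding module -> one road type per image
        self.cls_head = nn.Linear(128, num_road_types)

    def forward(self, x):
        feat = self.encoder(x)                                     # encoded feature map
        seg = F.interpolate(self.seg_head(feat), size=x.shape[2:],
                            mode="bilinear", align_corners=False)  # back to input size
        cls = self.cls_head(feat.mean(dim=(2, 3)))                 # global pool, then classify
        return seg, cls

# Example usage with an illustrative 384x1280 RGB frame.
model = RoadDetectionNet()
seg, cls = model(torch.randn(1, 3, 384, 1280))
print(seg.shape, cls.shape)  # torch.Size([1, 2, 384, 1280]) torch.Size([1, 3])
```

In this arrangement the segmentation head produces a per-pixel road/non-road map at input resolution, while the classification head pools the same encoded feature map to predict a single road type per image, mirroring the shared-encoder description of steps S3 and S4.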

Description

Technical Field

[0001] The invention relates to the field of automatic driving, and in particular to a road area detection method based on geometric edge segmentation optimization and deep-learning feature encoding and decoding.

Background Technique

[0002] With the rapid development of deep learning and intelligent vehicle technology, deep learning has been widely applied to environment perception and decision planning for intelligent vehicles. As the core component of an intelligent vehicle's environment perception system, road area detection is the basis for obstacle avoidance and path planning. When camera sensor data are used as input, the goal of road area detection is to determine, for each pixel in the image, whether it belongs to the road area or to a non-road (obstacle) area. Compared with other sensors, monocular cameras have received extensive attention and in-depth research due to their rich semantic featu...
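The per-pixel formulation above (each pixel is either road or non-road/obstacle) reduces to a binary decision over a per-pixel score map. A minimal sketch, assuming a two-channel score map of placeholder size:

```python
# Illustrative only: the score map size and values are placeholders, not the
# patent's actual network output.
import numpy as np

scores = np.random.randn(2, 384, 1280)    # channel 0: non-road, channel 1: road
road_mask = scores.argmax(axis=0) == 1    # True where a pixel is classified as road
print(road_mask.shape, road_mask.mean())  # mask size and fraction of road pixels
```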


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/34
Inventors: 李克强 (Li Keqiang), 熊辉 (Xiong Hui), 余大蒙 (Yu Dameng), 王建强 (Wang Jianqiang), 许庆 (Xu Qing)
Owner: TSINGHUA UNIV