
Method and device for lane line recognition

A lane line recognition technology, applied in the field of image processing. It addresses problems such as registration error between image and point cloud data, image recognition accuracy reaching only the pixel level, and insufficiently accurate three-dimensional lane line positions, thereby improving the position recognition accuracy of lane lines as well as the production speed and accuracy of high-precision maps.

Active Publication Date: 2019-04-23
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

However, due to the limited registration accuracy of the spatial position relationship between 2D image data and 3D point cloud data, errors arise when converting 2D position information into 3D position information, and image recognition accuracy reaches only the pixel level. Therefore, the three-dimensional spatial position of the lane line obtained by existing lane line recognition technology is not accurate enough.

Method used

Figure 1 is a schematic flowchart of the lane line recognition method of Embodiment 1; Figure 2 is a schematic flowchart of the method of Embodiment 2; Figure 3 is a schematic flowchart of the method of Embodiment 3.


Examples


Embodiment 1

[0024] Figure 1 is a schematic flowchart of a lane line recognition method provided by Embodiment 1 of the present invention. This embodiment is applicable to large-scale production of high-precision maps. The method can be executed by a lane line recognition device, which can be implemented in hardware and/or software.

[0025] The method specifically includes the following:

[0026] S110. Identify lane line pixel points in the lane line image.

[0027] In the above operation, the lane line picture can be obtained by photographing the lane lines on the road with a vehicle-mounted camera. The recognition accuracy of lane lines is therefore limited by the camera's resolution, and also depends on the distance from the camera: lane lines closer to the camera are photographed more clearly.

[0028] The lane line pixels in the lane line picture can be identified by many methods; preferably, a trained classifier can be used to identify the lane l...
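As a minimal sketch of this step, pixel identification can be framed as per-pixel classification. The brightness-threshold "classifier" below is only an illustrative stand-in for the trained classifier the text prefers; the patent excerpt does not specify the features or model actually used.

```python
import numpy as np

def identify_lane_pixels(image, classifier):
    """Label each pixel of a grayscale lane picture as lane (1) or background (0).

    `classifier` is any callable mapping an (N, 1) feature matrix to N labels;
    in the patent this would be a trained classifier.
    """
    h, w = image.shape
    features = image.reshape(-1, 1).astype(np.float32)  # one feature per pixel
    labels = np.asarray(classifier(features)).reshape(h, w)
    # Return (row, col) coordinates of pixels classified as lane line
    return np.argwhere(labels == 1)

# Toy stand-in "classifier": bright pixels are lane markings (illustrative only)
toy_classifier = lambda X: (X[:, 0] > 200).astype(np.int8)

img = np.zeros((4, 4), dtype=np.uint8)
img[1, :] = 255                       # a horizontal painted stripe
pixels = identify_lane_pixels(img, toy_classifier)
```

In practice the callable would wrap a model trained on labeled road imagery rather than a fixed threshold.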

Embodiment 2

[0037] Figure 2 is a schematic flowchart of a lane line recognition method provided in Embodiment 2 of the present invention. The technical solution of this embodiment builds on the foregoing embodiment and further optimizes the operation of calculating the initial position and direction of the lane line from the lane line pixel points. Referring to Figure 2, the method specifically includes:

[0038] S210. Identify lane line pixels in the lane line image.

[0039] S220. Convert the lane line pixel points in the lane line image into corresponding lane line point cloud points according to the spatial position relationship between the two-dimensional image data and the three-dimensional laser point cloud data.

[0040] The lane line point cloud points obtained in the above operation are limited to those corresponding to the lane line pixels in the lane line picture. If a certain point on the lane line is not captured by the ...
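Step S220 can be sketched as projecting every laser point into the image through the calibrated 2D-3D registration and keeping, for each lane pixel, the nearest projected point. All names, the pinhole model, and the `max_px_dist` rejection threshold below are assumptions for illustration; the patent only states that a registered spatial position relationship between the 2D image and the 3D point cloud is used.

```python
import numpy as np

def pixels_to_cloud_points(lane_pixels, cloud_xyz, K, R, t, max_px_dist=2.0):
    """Map 2D lane-line pixels to the laser point-cloud points that project
    nearest to them, given a camera registration (intrinsics K, rotation R,
    translation t). Illustrative sketch, not the patent's exact procedure.
    """
    # Project every 3D point into the image: p ~ K (R x + t)
    cam = R @ cloud_xyz.T + t.reshape(3, 1)      # 3 x N, camera frame
    uvw = K @ cam
    uv = (uvw[:2] / uvw[2]).T                    # N x 2 pixel coordinates

    matched = []
    for px in lane_pixels:                       # px = (u, v)
        d = np.linalg.norm(uv - px, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_px_dist:                  # reject far-away matches
            matched.append(cloud_xyz[j])
    return np.array(matched)

# Tiny worked example: identity pose, one point projecting onto the lane pixel
K = np.array([[100.0, 0.0, 50.0], [0.0, 100.0, 50.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
cloud = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]])
pts = pixels_to_cloud_points(np.array([[50.0, 50.0]]), cloud, K, R, t)
```

A k-d tree over the projected coordinates would replace the linear scan for realistic point-cloud sizes.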

Embodiment 3

[0048] Figure 3 is a schematic flowchart of a lane line recognition method provided in Embodiment 3 of the present invention. The technical solution of this embodiment builds on the above embodiments and further optimizes both the operation of determining lane line point cloud points from road laser point cloud points containing reflectivity intensity information, and the operation of correcting the initial position and direction of the lane line based on the lane line point cloud points. Referring to Figure 3, the method specifically includes:

[0049] S310. Identify lane line pixels in the lane line picture.

[0050] S320. Calculate the initial position and direction of the lane line according to the lane line pixel points.
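One plausible reading of S320 is a principal-axis fit: take the centroid of the lane pixels as the initial position and the direction of largest spread as the initial direction. The PCA-via-SVD approach below is an assumption for illustration; the patent does not commit to a specific fitting method.

```python
import numpy as np

def initial_lane_fit(lane_pixels):
    """Estimate an initial lane position (centroid) and direction (principal
    axis) from lane pixel coordinates. Illustrative sketch only.
    """
    pts = np.asarray(lane_pixels, dtype=np.float64)
    position = pts.mean(axis=0)              # centroid of the lane pixels
    centered = pts - position
    # Direction = right singular vector of the largest singular value (PCA)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]                        # unit vector, sign-ambiguous
    return position, direction

# Collinear pixels along (1, 2): the fit recovers that axis exactly
pos, dirn = initial_lane_fit([(0, 0), (1, 2), (2, 4), (3, 6)])
```

Real lane markings are curved, so a production fit would likely use a polynomial or spline per segment; the straight-line fit only serves as the initial estimate this step asks for.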

[0051] S330. Perform adaptive binary segmentation on the road laser point cloud points containing reflectivity intensity information whose distance from the initial ...
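The adaptive binary segmentation in S330 exploits the fact that painted lane markings reflect laser light much more strongly than bare asphalt. The sketch below uses Otsu's between-class-variance criterion as one plausible adaptive threshold over the reflectivity values of the nearby road points; the patent excerpt does not fix a specific segmentation method.

```python
import numpy as np

def segment_by_reflectivity(intensities):
    """Adaptively binarize point reflectivity intensities: True marks points
    likely to lie on lane paint. Otsu-style threshold; illustrative only.
    Assumes at least two distinct intensity values.
    """
    vals = np.asarray(intensities, dtype=np.float64)
    candidates = np.unique(vals)[:-1]        # thresholds between sample values
    best_t, best_var = candidates[0], -1.0
    for t in candidates:
        lo, hi = vals[vals <= t], vals[vals > t]
        w0, w1 = lo.size / vals.size, hi.size / vals.size
        var = w0 * w1 * (lo.mean() - hi.mean()) ** 2  # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return vals > best_t

# Asphalt returns (~5-7) vs. paint returns (~90-95) separate cleanly
mask = segment_by_reflectivity([5, 6, 7, 6, 90, 95, 92])
```

The segmentation is "adaptive" in that the threshold is recomputed from the local intensity distribution near the initial lane estimate, rather than fixed globally.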



Abstract

The invention discloses a lane line recognition method and device. The method includes: identifying lane line pixel points in a lane line picture; calculating the initial position and direction of the lane line according to the lane line pixel points; determining lane line point cloud points from road laser point cloud points containing reflectivity intensity information; and correcting the initial position and direction of the lane line based on the lane line point cloud points. By correcting the image-derived initial position and direction with the lane line point cloud, the method solves the problem that the three-dimensional position of the recognized lane line is not accurate enough in current lane line recognition technology, improves the position recognition accuracy of lane lines, and greatly improves the production speed and accuracy of high-precision maps.

Description

Technical field

[0001] Embodiments of the present invention relate to image processing technologies, and in particular to a lane line recognition method and device.

Background technique

[0002] The 3D high-precision map is recognized by industry and academia as the main development direction of the next generation of digital maps. It is a prerequisite for realizing automatic driving and assisted driving, and provides the main basis for accurate positioning and correct decision-making of autonomous vehicles. High-precision maps are also an important strategic platform resource for analyzing road utilization and realizing smart transportation. The core issue of 3D high-precision map production is the detection and generation of road lane information, that is, the accurate reconstruction of real-world road network lane information in a 3D digital map.

[0003] The current lane line recognition technology is as follows: first, the pos...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): G06K9/00
CPC: G06V20/588
Inventor: 蒋斌, 蒋昭炎, 晏涛, 何贝
Owner: BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD