Front-vehicle ranging method based on monocular vision and image segmentation under vehicle-borne camera

A distance measurement method based on monocular vision, applied in the field of computer vision. It addresses problems such as relative-depth maps failing to provide real safety guarantees, reduced detection efficiency, and easily introduced errors, and achieves the effects of saving depth-calculation time, preserving the driving view, and enabling reasonable driving judgments.

Inactive Publication Date: 2018-11-06
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

Methods of this kind use color to represent relative depth. Where complete depth information must be obtained, they have certain advantages in confined spaces such as indoor distance measurement. In actual traffic scenes, however, much of the recovered depth is redundant and only weakly relevant to the driving process, so computing depth over the whole open scene brings little benefit and may even reduce detection efficiency; in addition, when color is used to indicate distance, small changes in the contrasting colors make it difficult to judge the specific distance of a vehicle.
When the measured object is relatively close to the camera, its depth can be represented fairly clearly by color, but for distant objects the color varies over a wide range, so the naked eye can hardly identify their specific depth information; moreover, vehicle information cannot be given completely when there are many vehicles or when vehicles overlap.
In real traffic scenarios, using color differences to express relative depth information therefore cannot provide real...


Detailed Description of the Embodiments

[0046] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0047] As shown in Figure 1, a front-vehicle ranging method based on monocular vision and image segmentation under a vehicle-mounted camera includes the following steps (an illustrative sketch of steps S1 to S4 is given after the list):

[0048] Step S1: read images frame by frame from the video stream captured by the vehicle-mounted camera;

[0049] Step S2: perform object detection on the vehicles and extract their position information in the image, including two-dimensional bounding-box information and three-dimensional bounding-box information;

[0050] Step S3: match the CAD model of the corresponding vehicle in a preset 3D vehicle CAD model library according to the three-dimensional bounding-box information;

[0051] Step S4: determine the distance of overlapping vehicles according to the two-dimensional bounding-box information of the vehicles, and determine the occlusion information of the veh...
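
A minimal Python sketch of how steps S1 to S4 could be wired together is given below. The detector, the video file name, the CAD model library entries, and the overlap threshold are hypothetical placeholders for illustration; the patent does not specify them at this level of detail.

```python
# Hypothetical wiring of steps S1-S4; the detector, the video file name, and the
# CAD model library below are illustrative placeholders, not the patent's own values.
import cv2
import numpy as np

# Assumed preset 3D CAD model library: model name -> (length, width, height) in meters.
CAD_LIBRARY = {
    "sedan": (4.6, 1.8, 1.5),
    "suv":   (4.8, 1.9, 1.7),
    "bus":  (11.0, 2.5, 3.2),
}

def detect_vehicles(frame):
    """Step S2 placeholder: a deep-learning detector would return, per vehicle,
    a 2D box (x1, y1, x2, y2) and the 3D box dimensions (length, width, height)."""
    return []  # real detector output goes here

def match_cad_model(lwh):
    """Step S3: pick the CAD model whose (length, width, height) is nearest to the 3D box."""
    dims = np.asarray(lwh, dtype=float)
    return min(CAD_LIBRARY, key=lambda name: np.linalg.norm(dims - np.asarray(CAD_LIBRARY[name])))

def iou_2d(a, b):
    """Intersection-over-union of two 2D boxes (x1, y1, x2, y2), used to flag overlapping vehicles."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

cap = cv2.VideoCapture("dashcam.mp4")            # Step S1: read the on-board video frame by frame
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    detections = detect_vehicles(frame)          # Step S2: 2D + 3D bounding boxes per vehicle
    for i, det in enumerate(detections):
        model = match_cad_model(det["box3d_lwh"])                      # Step S3
        overlapping = [j for j, other in enumerate(detections)         # Step S4: occlusion check
                       if j != i and iou_2d(det["box2d"], other["box2d"]) > 0.1]
        # ...the distance/occlusion of overlapping vehicles is then resolved from the 2D boxes
cap.release()
```

The CAD matching shown here is a simple nearest-neighbour comparison over (length, width, height), which is one plausible reading of "matching the CAD model according to the three-dimensional bounding-box information"; the patent may use a different matching criterion.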



Abstract

The invention discloses a front-vehicle ranging method based on monocular vision and image segmentation under a vehicle-borne camera. The method comprises: extracting a two-dimensional bounding box and a three-dimensional bounding box of a target vehicle with a deep-learning algorithm to obtain the corresponding position information; matching a 3D CAD vehicle model based on the length, width, and height of the three-dimensional bounding box to obtain a similar 3D model of the corresponding vehicle; extracting vehicle-model classification information for the vehicle in the picture based on the two-dimensional bounding box; sending the 3D information and vehicle-model information corresponding to the vehicle to an instance segmentation network; and calculating the absolute depth value of the vehicle in the image from the size information of the different vehicle models according to the camera imaging principle. The method saves depth-calculation time and preserves the driving view, so that a driver can intuitively observe the specific distance value of a front vehicle and make a reasonable driving judgment.
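
The final step of the abstract, calculating the absolute depth value from the size of the matched vehicle model "according to the camera imaging principle", amounts to the pinhole projection relation. A brief sketch follows; the focal length, model height, and pixel span are assumed values for illustration only.

```python
def absolute_depth(focal_length_px, real_size_m, projected_size_px):
    """Pinhole imaging relation: an object of real size H at depth Z projects to
    h = f * H / Z pixels, so Z = f * H / h."""
    return focal_length_px * real_size_m / projected_size_px

# Illustration: a matched vehicle model 1.5 m tall spanning 60 px in the image,
# viewed by a camera with an assumed focal length of 1200 px:
z = absolute_depth(1200.0, 1.5, 60.0)   # = 30.0 m to the preceding vehicle
print(f"estimated distance: {z:.1f} m")
```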

Description

technical field
[0001] The invention relates to the technical field of computer vision, in particular to a distance-measuring method for a preceding vehicle based on monocular vision and image segmentation under a vehicle-mounted camera.
Background technique
[0002] The popularity of automobiles has increased the world's demand for intelligent vehicles, and the development of machine vision has enabled machines to obtain image information in much the same way as human eyes.
[0003] Depth estimation based on video images is a prerequisite for realizing automatic driving and safe driving of vehicles. So-called image depth estimation is a method of obtaining the actual distance of corresponding objects in the real world from the information in the two-dimensional image stream captured by the video equipment. Most traditional ranging methods need to calibrate the internal and external parameters of the camera and the camera height. The current ranging method based on...
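
For contrast, the traditional ranging methods mentioned above, which require calibrating the camera's internal and external parameters and its mounting height, commonly rely on a flat-road ground-plane model. The sketch below illustrates that classical formulation; the calibration numbers are assumptions, not values from the patent.

```python
def ground_plane_distance(focal_px, camera_height_m, contact_row, horizon_row):
    """Flat-road, calibrated ranging: a ground point imaged (contact_row - horizon_row)
    pixels below the horizon lies at distance Z = f * H_camera / (contact_row - horizon_row)."""
    dv = contact_row - horizon_row
    if dv <= 0:
        raise ValueError("the contact point must lie below the horizon line")
    return focal_px * camera_height_m / dv

# Assumed calibration: focal length 1200 px, camera 1.3 m above the road, horizon at row 360.
# A vehicle whose tires are imaged at row 412 would then be 1200 * 1.3 / 52 = 30.0 m ahead.
print(ground_plane_distance(1200.0, 1.3, 412, 360))
```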


Application Information

IPC(8): G01B11/00; G01B11/02; G01B11/24; G06T7/11; G06T7/30; G06T7/50; G06T7/62; G06T7/90; G06T17/00
CPC: G01B11/00; G01B11/02; G01B11/24; G06T7/11; G06T7/30; G06T7/50; G06T7/62; G06T7/90; G06T17/00
Inventor: 黄立勤, 陈雅楠
Owner: FUZHOU UNIV