
Monocular vision-based dense point cloud reconstruction method and system for triangulation measurement depth

A monocular-vision and triangulation technology, applied in the field of unmanned-driving environment perception, addressing the problems of high sensor cost, insufficient image processing, and poor handling of edges and image areas with large depth differences.

Pending Publication Date: 2020-10-20
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0002] At present, point cloud maps are mostly created by combining a laser sensor with a monocular camera to directly measure a depth map and a color map, and then applying the camera model with the sensor pose to obtain the point cloud map, or by methods such as multi-sensor fusion. The algorithm architectures these methods adopt are comparatively simple and direct, so image processing is insufficient, edge information and image regions with large depth differences are handled poorly, and multi-sensor fusion is expensive.




Embodiment Construction

[0056] To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0057] It should be noted that the terms "first" and "second" in the description, claims, and drawings of the present invention are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It is to be understood that the data so used are interchangeable under appropriate ...



Abstract

The invention provides a monocular vision-based dense point cloud reconstruction method and system for triangulation depth measurement. The method comprises the steps of: reading an image from a data set; matching pixel points between consecutive frames using an epipolar search and block matching algorithm; calculating the depth value of each matched pixel point by the triangulation principle; filtering the depth values with a Gaussian-distribution depth filter and keeping the depth value with minimum uncertainty; segmenting the image into 4 * 4 pixel blocks; if the depth difference among the pixels in a block does not exceed a set threshold, treating the whole block as one point cloud point, and otherwise treating the pixels with the maximum and minimum depth values in the block as two point cloud points; and converting pixel coordinates into world coordinates with the camera model and generating point cloud data in combination with the depth information. This solves the problem that conventional point cloud map creation needs multiple or expensive sensors.
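The last two steps of the abstract, reducing each 4 * 4 pixel block to one or two cloud points and back-projecting pixels into world coordinates, can be sketched as below. This is a minimal illustration only: the intrinsics, the depth-difference threshold, and the function names are assumptions, not values from the patent.

```python
import numpy as np

# Assumed pinhole intrinsics; real values come from camera calibration.
FX, FY, CX, CY = 718.856, 718.856, 607.193, 185.216
DEPTH_DIFF_THRESHOLD = 0.1  # meters; assumed, not stated in the patent text

def pixel_to_world(u, v, depth, R=np.eye(3), t=np.zeros(3)):
    """Back-project a pixel with known depth through the pinhole model,
    then move it from camera to world coordinates with pose (R, t)."""
    p_cam = np.array([(u - CX) * depth / FX,
                      (v - CY) * depth / FY,
                      depth])
    return R @ p_cam + t

def block_to_points(depth_map, u0, v0):
    """Reduce one 4x4 pixel block to one or two cloud points, as described:
    a near-planar block becomes a single point; a block spanning a depth
    discontinuity keeps its nearest and farthest pixels."""
    block = depth_map[v0:v0 + 4, u0:u0 + 4]
    if block.max() - block.min() <= DEPTH_DIFF_THRESHOLD:
        # Whole block collapses to its center pixel with the mean depth.
        return [(u0 + 2, v0 + 2, float(block.mean()))]
    # Depth edge inside the block: keep the min- and max-depth pixels.
    vmin, umin = np.unravel_index(block.argmin(), block.shape)
    vmax, umax = np.unravel_index(block.argmax(), block.shape)
    return [(u0 + umin, v0 + vmin, float(block.min())),
            (u0 + umax, v0 + vmax, float(block.max()))]
```

This block-wise reduction is what makes the cloud "dense yet compact": flat regions contribute one point each, while depth edges keep both of their extreme samples.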

Description

technical field
[0001] The present invention relates to the technical field of unmanned-driving environment perception, and in particular to a dense point cloud reconstruction method and system based on monocular-vision triangulation depth measurement.
background technique
[0002] At present, point cloud maps are mostly created by combining a laser sensor with a monocular camera to directly measure a depth map and a color map, and then applying the camera model with the sensor pose to obtain the point cloud map, or by methods such as multi-sensor fusion. The algorithm architectures these methods adopt are comparatively simple and direct, which leads to insufficient image processing and poor handling of edge information and of image regions with large depth differences, while multi-sensor fusion is expensive. The dense point cloud reconstruction method based on monocular-vision triangulation depth measurement can so...
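The triangulation and Gaussian depth-filter steps this method relies on can be sketched as follows. This is an assumed textbook formulation of two-view triangulation and Gaussian fusion, not the patent's exact implementation; in particular, the pose convention (R, t mapping reference-frame points into the current frame) is an assumption.

```python
import numpy as np

def triangulate_depth(x_ref, x_cur, R, t):
    """Least-squares depth of one matched point from two views.
    x_ref, x_cur: normalized image coordinates [x, y, 1] of the match in
    the reference and current frames. Solves
        d_cur * x_cur = d_ref * (R @ x_ref) + t
    for the two depths along each viewing ray."""
    f_ref = R @ x_ref
    A = np.column_stack((-f_ref, x_cur))   # 3x2 system in [d_ref, d_cur]
    d, *_ = np.linalg.lstsq(A, t, rcond=None)
    return float(d[0]), float(d[1])

def fuse_gaussian(mu1, var1, mu2, var2):
    """Fuse two Gaussian depth estimates of the same pixel. Repeated
    fusion over frames shrinks the variance, so the surviving estimate
    is the one with minimum uncertainty."""
    var = var1 * var2 / (var1 + var2)
    mu = (var2 * mu1 + var1 * mu2) / (var1 + var2)
    return mu, var
```

In a full pipeline, each new frame yields one triangulated depth per matched pixel, which is fused into that pixel's running Gaussian estimate until its variance falls below a convergence threshold.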


Application Information

IPC(8): G06T7/55, G06T17/05
CPC: G06T7/55, G06T17/05
Inventors: 赵一兵, 马振强, 郭烈, 杨宇, 周一飞, 吕彦卿, 韩治中, 刘昌华
Owner: DALIAN UNIV OF TECH