Infrared image and radar data-based night unmanned parking lot scene depth estimation method

A radar-data and scene-depth technology, applied in image data processing, image enhancement, and image analysis. It addresses the problems that existing methods require manual feature selection and cannot mine deep image feature information, achieving the effect of ensuring estimation accuracy.

Status: Inactive
Publication Date: 2017-07-25
DONGHUA UNIV

AI Technical Summary

Problems solved by technology

This method first obtains, through superpixel segmentation, a series of small regions of the infrared image with similar texture and brightness, i.e., superpixels; it then trains a PP-MRF model to establish the nonlinear relationship between the superpixel plane parameters and their corresponding depths, so that the depth of a given superpixel can be estimated. The disadvantage of this method is that features must be selected manually and the deep feature information of the image cannot be mined.
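The paragraph above describes the prior-art superpixel pipeline only in words. Below is a minimal sketch, assuming scikit-image's SLIC for the superpixel step and simple hand-crafted brightness statistics as the manually selected features; the function names, parameters, and feature choices are illustrative and are not taken from the patent.

```python
# Hypothetical sketch of the prior-art pipeline described above: superpixel
# segmentation of an infrared image, followed by per-superpixel hand-crafted
# features that a regression model (PP-MRF in the cited work) would map to depth.
import numpy as np
from skimage.segmentation import slic  # requires scikit-image >= 0.19

def superpixel_features(ir_image: np.ndarray, n_segments: int = 200):
    """Segment a single-channel infrared image into superpixels and return
    simple (mean brightness, brightness variance) features per segment."""
    labels = slic(ir_image, n_segments=n_segments, compactness=0.1,
                  channel_axis=None, start_label=0)
    features = []
    for seg_id in np.unique(labels):
        mask = labels == seg_id
        features.append([ir_image[mask].mean(), ir_image[mask].var()])
    return labels, np.asarray(features)

# Example: a random stand-in for a normalized infrared frame.
ir = np.random.rand(240, 320).astype(np.float32)
labels, feats = superpixel_features(ir)
print(labels.shape, feats.shape)  # (240, 320) (~200, 2)
```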




Embodiment Construction

[0020] The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these examples are intended only to illustrate the present invention and not to limit its scope. In addition, it should be understood that, after reading the teachings of the present invention, those skilled in the art may make various changes or modifications to it, and such equivalent forms likewise fall within the scope defined by the appended claims of the present application.

[0021] Figure 1 shows the flow chart of the method for estimating the scene depth of an unmanned vehicle at night based on infrared images and radar data. The method first fills in missing (default) values in the radar data and then performs a classification operation to obtain the depth categories corresponding to the night vision image. Then const...
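A minimal sketch of how the pre-processing in [0021] could look, assuming that handling the default values means filling in missing radar returns and that the classification operation means discretizing depth into log-spaced categories (consistent with the inverse-log transform described in the abstract). The class count, depth range, and missing-value encoding are assumptions, not values from the patent.

```python
import numpy as np

NUM_CLASSES = 32          # assumed number of depth categories
D_MIN, D_MAX = 1.0, 80.0  # assumed radar depth range, in metres

def fill_missing(depth: np.ndarray) -> np.ndarray:
    """Replace missing radar returns (encoded here as 0) with the mean of valid ones."""
    valid = depth > 0
    filled = depth.copy()
    filled[~valid] = depth[valid].mean()
    return filled

def depth_to_class(depth: np.ndarray) -> np.ndarray:
    """Map each depth value to a log-spaced class index in [0, NUM_CLASSES - 1]."""
    d = np.clip(fill_missing(depth), D_MIN, D_MAX)
    frac = (np.log(d) - np.log(D_MIN)) / (np.log(D_MAX) - np.log(D_MIN))
    return np.minimum((frac * NUM_CLASSES).astype(int), NUM_CLASSES - 1)

def class_to_depth(cls: np.ndarray) -> np.ndarray:
    """Inverse (anti-log) transform: recover a representative depth per class."""
    frac = (cls + 0.5) / NUM_CLASSES
    return np.exp(np.log(D_MIN) + frac * (np.log(D_MAX) - np.log(D_MIN)))

radar = np.array([0.0, 2.5, 10.0, 75.0])  # 0.0 stands for a missing return
print(depth_to_class(radar), class_to_depth(depth_to_class(radar)))
```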



Abstract

The present invention provides an infrared image and radar data-based night unmanned parking lot scene depth estimation method. According to the method, firstly, a night vision image data set is established; it comprises original sample images and radar data obtained through pre-classifying the original sample images, and the original sample images and radar data are written into corresponding text files. Secondly, a deep convolutional/deconvolutional neural network is constructed and trained using the night vision image data set. Thirdly, a to-be-processed image is acquired in real time and input into the network: the convolutional part produces a feature map, the feature map is passed to the deconvolutional part to obtain the category of each pixel, and a probability map is output. Finally, the probability map is subjected to an inverse-log (anti-log) transformation to obtain the estimated depth of each pixel. Tests show that the method can effectively estimate the depth of a night scene while ensuring both estimation accuracy and real-time performance.
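The abstract describes the network only at a high level. The following is a minimal sketch, assuming a PyTorch implementation of a convolutional/deconvolutional (encoder-decoder) network that maps an infrared frame to a per-pixel probability map over depth classes, followed by the inverse-log transform from predicted class to depth. The framework, layer sizes, class count, and depth range are all assumptions for illustration, not details taken from the patent.

```python
import math
import torch
import torch.nn as nn

NUM_CLASSES = 32  # assumed number of depth categories

class ConvDeconvDepthNet(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        # Convolutional (encoder) part: extract a downsampled feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Deconvolutional (decoder) part: upsample back to input resolution
        # and predict one score per depth class at every pixel.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, num_classes, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))  # logits: (N, NUM_CLASSES, H, W)

net = ConvDeconvDepthNet()
ir_frame = torch.rand(1, 1, 240, 320)         # stand-in infrared image
probs = net(ir_frame).softmax(dim=1)          # per-pixel probability map
pred_class = probs.argmax(dim=1).float()      # most likely depth class per pixel

# Inverse-log (anti-log) transform from class index back to depth in metres;
# the depth range is the same assumption used in the earlier sketch.
d_min, d_max = 1.0, 80.0
frac = (pred_class + 0.5) / NUM_CLASSES
depth = torch.exp(math.log(d_min) + frac * (math.log(d_max) - math.log(d_min)))
print(depth.shape)  # (1, 240, 320)
```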

Description

Technical field

[0001] The invention relates to a method for estimating the scene depth of an unmanned vehicle at night based on infrared images and radar data. The method can estimate the spatial position information of the scene in an infrared image.

Background technique

[0002] Image depth estimation refers to obtaining depth and distance information from images, which is essentially a depth perception problem. Restoring the 3D depth information of a scene from one or more images of that scene is a basic research topic in the field of machine vision and has important applications in robot motion control, scene understanding, and scene reconstruction.

[0003] Depth estimation techniques mainly include binocular depth cues and image-sequence-based depth estimation methods, both of which rely on feature differences between images. For monocular image depth estimation, the classic method in early research is "shape from shading". This algorithm is based on s...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/50
CPC: G06T2207/10044; G06T2207/10048; G06T2207/20084
Inventors: 姚广顺, 孙韶媛, 叶国林, 高凯珺
Owner: DONGHUA UNIV