Depth image enhancement method fused with RGB image information

A technology relating to RGB images and depth images, applied in the field of depth image enhancement that fuses RGB image information. It addresses the problems of limited computing speed and poor real-time performance in existing algorithms, and achieves the effect of improving accuracy while ensuring it.

Publication date: 2018-08-14 (status: Inactive)
Owner: SHANGHAI INST OF TECH
Cites: 12 · Cited by: 34

Problems solved by technology

However, this method is mainly limited by its computation speed, so the real-time performance of the algorithm is poor.




Embodiment Construction

[0050] A depth image enhancement method fusing RGB image information proposed by the present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments. The advantages and features of the present invention will become apparent from the following description and claims. It should be noted that the drawings are presented in a greatly simplified form and are not drawn to scale; they are only intended to conveniently and clearly assist in illustrating the embodiments of the present invention.

[0051] Referring to Figure 1, a depth image enhancement method fusing RGB image information comprises the following steps:

[0052] S1: Obtain the mapping relationship between the coordinates of the depth image and the RGB image;
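The excerpt does not reproduce the mapping itself. A minimal sketch of a depth-to-RGB coordinate mapping, assuming pinhole intrinsics K_depth and K_rgb and a depth-to-RGB rigid transform (R, t), all hypothetical placeholders not taken from the patent, could look like this:

```python
# Hypothetical sketch of step S1: map a depth pixel into RGB image coordinates.
# K_depth, K_rgb, R and t are placeholder calibration parameters; the patent
# excerpt does not state how the mapping relationship is actually obtained.
import numpy as np

def map_depth_to_rgb(u, v, z, K_depth, K_rgb, R, t):
    """Project depth pixel (u, v) with depth z into the RGB image plane."""
    # Back-project the depth pixel into the depth camera's 3-D frame.
    xyz_depth = z * (np.linalg.inv(K_depth) @ np.array([u, v, 1.0]))
    # Rigidly transform into the RGB camera frame.
    xyz_rgb = R @ xyz_depth + t
    # Project onto the RGB image plane.
    uvw = K_rgb @ xyz_rgb
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```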

[0053] S2: Preprocess the depth image and the RGB image respectively; extract the invalid area in the depth image and mark the pixel position of the invalid area;
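As an illustration, a minimal sketch of the invalid-area extraction, assuming the common sensor convention that a depth value of 0 marks a missing measurement (the patent's actual criterion is not given in the excerpt):

```python
# Hypothetical sketch of the invalid-area extraction in step S2.
# Assumption: depth == 0 denotes a missing (invalid) measurement.
import numpy as np

def extract_invalid_area(depth):
    """Return a boolean mask of invalid pixels and their (row, col) positions."""
    invalid_mask = (depth == 0)                 # True where depth is missing
    invalid_coords = np.argwhere(invalid_mask)  # pixel positions to be repaired
    return invalid_mask, invalid_coords
```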

[0054] Perform edge detection on the RGB image, extract the edge information, and compare it with the preprocessed depth image to determine the effective supporting edges used for repairing the depth image; ...
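A possible realisation of this part of S2 is sketched below: Canny edge detection stands in for the unspecified edge detector, and an edge pixel is kept as an "effective supporting edge" only if it lies near an invalid region; the thresholds and the adjacency rule are illustrative assumptions, not the patent's exact procedure.

```python
# Hypothetical sketch: extract RGB edges and keep those adjacent to the
# invalid (depth-missing) regions as candidate effective supporting edges.
import cv2
import numpy as np

def effective_supporting_edges(rgb, invalid_mask, dilate_px=3):
    gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)            # RGB edge information (assumed thresholds)
    # Only edges passing near the invalid regions can guide the depth repair.
    kernel = np.ones((dilate_px, dilate_px), np.uint8)
    near_invalid = cv2.dilate(invalid_mask.astype(np.uint8), kernel) > 0
    return (edges > 0) & near_invalid
```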



Abstract

The invention discloses a depth image enhancement method that fuses RGB image information. The method comprises the steps of: S1, obtaining the mapping relationship between the coordinates of the depth image and the RGB image; S2, preprocessing the depth image and the RGB image separately, extracting the invalid areas in the depth image and marking their pixel positions, then performing edge detection on the RGB image, extracting the edge information, and comparing it with the preprocessed depth image to determine the effective supporting edges used for repairing the depth image; S3, estimating the depth information of the invalid areas by using the effective supporting edges to compute depth layer by layer, from outside to inside, along the boundary of each invalid area in the depth image; S4, applying filtering and noise-reduction optimization to the tiny isolated areas of the depth image from which no effective supporting edge can be extracted, so as to improve the precision of the depth image. With this method, the invalid areas in the depth image can be repaired while keeping the edges clear, and the sharpness of the edges of all depth-missing areas can be improved.
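Steps S3 and S4 are not detailed in the excerpt. Purely as an illustration, the sketch below fills each invalid region layer by layer from the outside in, averaging already-valid neighbours that do not lie on a supporting edge, and finally median-filters any tiny isolated holes for which no supporting edge was found; the neighbourhood rule, the averaging, and the filter size are assumptions rather than the patent's procedure.

```python
# Hypothetical sketch of steps S3 and S4: onion-peel filling of invalid areas
# guided by supporting edges, followed by median filtering of leftover holes.
import numpy as np
from scipy.ndimage import binary_erosion, median_filter

def repair_depth(depth, invalid_mask, support_edges, max_iters=500):
    depth = depth.astype(np.float32).copy()
    invalid = invalid_mask.copy()
    for _ in range(max_iters):
        if not invalid.any():
            break
        # Outermost layer of the invalid region (its boundary pixels).
        boundary = invalid & ~binary_erosion(invalid)
        progressed = False
        for r, c in np.argwhere(boundary):
            vals = []
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < depth.shape[0] and 0 <= cc < depth.shape[1]
                        and not invalid[rr, cc] and not support_edges[rr, cc]):
                    vals.append(depth[rr, cc])
            if vals:
                depth[r, c] = np.mean(vals)   # estimate depth from valid neighbours
                invalid[r, c] = False
                progressed = True
        if not progressed:
            break                             # remaining holes have no usable support
    # S4: smooth the tiny isolated areas that could not be filled.
    if invalid.any():
        depth[invalid] = median_filter(depth, size=5)[invalid]
    return depth
```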

Description

Technical field

[0001] The invention belongs to the field of image processing, and in particular relates to a depth image enhancement method that fuses RGB image information.

Background technique

[0002] In recent years, the application scenarios of mobile robots have become increasingly broad, reaching into many fields such as industry and agriculture, home services, medical care, and aerospace. Simultaneous Localization and Mapping (SLAM) refers to mounting specific sensors on a mobile platform such as a robot and, without any prior information about the environment, collecting sensor data at each moment during motion, estimating the platform's own trajectory, and building an environment map suited to the application scenario. SLAM technology provides accurate positioning and maps so that a robot can move autonomously indoors. It is used not only in robotics but is also a core technology in related fields such as virtual reality and augmented reality....


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/00, G06T7/13, G06T7/32
CPC: G06T5/003, G06T5/008, G06T2207/10024, G06T2207/10028, G06T7/13, G06T7/32
Inventors: 李文举, 胡文康, 韦丽华, 沈子豪
Owner: SHANGHAI INST OF TECH