Method for fusing a color depth image and a gray scale depth image

A depth-image and color-image technology, applied in the fields of image processing, computer vision and human-computer interaction. It addresses the problems that a grayscale depth image is difficult to make fully reflect the depth information of the physical space, that a color image does not contain depth information of the physical space, and that the scene information conveyed is limited. The resulting fused depth map is rich in information, makes scene details stand out, and is easy for the human eye to observe.

Active Publication Date: 2019-04-12
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

The human eye's ability to distinguish color information is much greater than its ability to distinguish grayscale information. A grayscale depth image is a monochrome image; the scene information that it (or a pseudo-color depth image) provides is limited, and it is difficult for such images to fully reflect the depth information of the physical space.
A color image conforms to the way the human eye observes, but a color image does not contain the depth information of the physical space.




Embodiment Construction

[0036] The technical solution of the present disclosure will be described in detail below with reference to the drawings and embodiments.

[0037] Referring to figure 1, a method for fusing a color depth image and a grayscale depth image comprises the following steps:

[0038] S100: Calibrate the internal and external parameters of the depth sensor and the RGB sensor of the 3D depth perception device;
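The calibrated intrinsic and extrinsic parameters are what later allow a depth pixel to be located in the RGB sensor's image plane for registration. A minimal pinhole-model sketch of that mapping, assuming matrices obtained from step S100 (all names and values here are hypothetical, not from the patent):

```python
import numpy as np

def project_depth_to_rgb(u, v, z, K_d, K_rgb, R, t):
    """Map a depth-sensor pixel (u, v) with depth z into the RGB sensor's
    image plane using a pinhole model.
    K_d, K_rgb: 3x3 intrinsic matrices; R, t: extrinsics (depth -> RGB)."""
    # Back-project the depth pixel to a 3-D point in the depth camera frame.
    p_d = z * np.linalg.inv(K_d) @ np.array([u, v, 1.0])
    # Transform into the RGB camera frame, then project onto its image plane.
    p_rgb = R @ p_d + t
    uv = K_rgb @ p_rgb
    return uv[0] / uv[2], uv[1] / uv[2]
```

With identical intrinsics and identity extrinsics the pixel maps to itself, which is a convenient sanity check for a calibration pipeline.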

[0039] S200: Acquire a grayscale depth image through the depth sensor, and map the depth value of each pixel in the grayscale depth image to the YU or YV channels of the YUV image format to generate a corresponding YUV depth image; at the same time, acquire an RGB color image through the RGB sensor;
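The patent does not spell out how a depth value spans the YU or YV channels. One plausible reading, sketched below under the assumption of a 16-bit depth map split byte-wise (high byte to Y, low byte to U or V, the unused chroma plane held at a neutral 128):

```python
import numpy as np

def depth_to_yuv(depth16, use_u=True):
    """Pack a 16-bit depth map into two 8-bit YUV planes (Y + U, or Y + V).
    The unused chroma plane is set to a neutral 128."""
    y = (depth16 >> 8).astype(np.uint8)    # high byte -> luma
    c = (depth16 & 0xFF).astype(np.uint8)  # low byte  -> chroma
    neutral = np.full_like(y, 128)
    return (y, c, neutral) if use_u else (y, neutral, c)

def yuv_to_depth(y, u, v, use_u=True):
    """Inverse mapping: recover the 16-bit depth from the two planes."""
    c = u if use_u else v
    return (y.astype(np.uint16) << 8) | c.astype(np.uint16)
```

The inverse function corresponds to the recovery performed after decompression in step S400.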

[0040] S300: Convert the YUV depth image from step S200 into an RGB depth image, compress and encode it together with the RGB color image, and output the result;
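The YUV-to-RGB conversion itself is standard; the patent does not say which matrix is used, so the sketch below assumes the common full-range BT.601 convention. Note that if the pipeline's compression is lossy, the depth bits packed into these channels may be perturbed, which is a practical constraint on the codec settings:

```python
import numpy as np

def yuv_to_rgb(y, u, v):
    """Full-range BT.601 YUV -> RGB conversion (an assumed convention;
    the patent does not specify the matrix)."""
    y = y.astype(np.float32)
    u = u.astype(np.float32) - 128.0
    v = v.astype(np.float32) - 128.0
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)
```

For neutral chroma (u = v = 128) the output is a gray pixel equal to the luma value, so a depth map packed into Y alone survives this transform unchanged.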

[0041] S400: Decompress the compression-encoded RGB depth image and RGB color image from step S300, convert the decompressed RGB depth image into a YUV depth image, and recover the depth value of each pixel that was mapped to the YU or YV channels; register the recovered depth image with the RGB color image; and fuse the depth value of each pixel in the recovered grayscale depth image with the texture information of the registered RGB color image to generate a color-and-grayscale fused depth image.
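The patent does not give a formula for the final fusion. One plausible rule, sketched here purely as an illustration, is to normalize the recovered depth to 8 bits and alpha-blend it into the registered color texture, so near/far structure modulates the color image (function name, alpha weight, and normalization are all assumptions):

```python
import numpy as np

def fuse_depth_color(depth16, rgb, alpha=0.5, d_max=65535):
    """Alpha-blend a normalized 16-bit depth map into a registered RGB image.
    alpha controls how strongly depth structure shows through the texture."""
    d8 = (depth16.astype(np.float32) / d_max * 255.0)[..., None]
    fused = alpha * d8 + (1.0 - alpha) * rgb.astype(np.float32)
    return np.clip(fused, 0, 255).astype(np.uint8)
```

A blend like this keeps both cues visible at once, which matches the stated goal of a depth map whose scene details are easy for the human eye to observe.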



Abstract

The invention discloses a method for fusing a color depth image and a grayscale depth image. The method comprises the following steps: calibrating internal and external parameters of a depth sensor and an RGB sensor of a three-dimensional depth sensing device; acquiring a grayscale depth image through the depth sensor, generating a corresponding YUV depth image, and meanwhile acquiring an RGB color image through the RGB sensor; converting the YUV depth image into an RGB depth image, compression-encoding the RGB depth image and the RGB color image, and outputting them; decompressing the compression-encoded RGB depth image and RGB color image, converting the decompressed RGB depth image into a YUV depth image, and recovering the depth value of each pixel in the grayscale depth image that was mapped to the YU or YV channel; registering the YUV depth image with the RGB color image; and fusing the depth value of each pixel in the restored grayscale depth image with the texture information of the registered RGB color image to generate a color-and-grayscale fused depth image.

Description

Technical Field

[0001] The disclosure belongs to the technical fields of image processing, computer vision and human-computer interaction, and in particular relates to a method for fusing color and grayscale depth images.

Background

[0002] Vision is the most direct and main way for human beings to observe and perceive the world. We live in a three-dimensional world. Human vision can not only perceive the brightness, color, texture and movement of an object's surface, but also judge its shape, space and spatial position (depth, distance). Real-time acquisition of depth information is conducive to interaction between the real physical world and the virtual network world, strengthens communication and learning between people, between people and machines, and between machines, and enhances the intelligence level of machines.

[0003] The high-precision depth information (distance) in the projected space can be obtained in real time through the ...

Claims


Application Information

IPC(8): G06T5/50; G06T7/80
CPC: G06T5/50; G06T2207/10024; G06T2207/10028; G06T2207/20221; G06T7/80
Inventors: 姚慧敏, 葛晨阳, 王佳宁, 邓作为, 刘欣
Owner XI AN JIAOTONG UNIV