
An illumination estimation method based on monocular camera

A monocular-camera illumination estimation technology applied in the fields of computer graphics and artificial intelligence. It addresses the problems that existing methods cannot estimate light source information in all directions, are limited by additional hardware, and are not applicable to devices with only a monocular camera, thereby avoiding insufficient precision and improving the effectiveness of illumination estimation.

Active Publication Date: 2019-03-26
SUN YAT SEN UNIV


Problems solved by technology

Current mainstream augmented reality technology achieves only a rough estimate of the ambient light intensity in a given camera image (for example, Google's ARCore platform) and cannot estimate the light source information in all directions of the real environment. Some studies can estimate light source information in all directions, but they must capture the light information with the help of external hardware devices or light probes placed into the scene. Existing illumination estimation techniques therefore depend on additional hardware or invasive light probes and are not suitable for mobile devices equipped with only a monocular camera.




Embodiment

[0020] In this embodiment, the overall flow of the illumination estimation method based on a monocular camera is shown in Figure 1 and includes the following steps:

[0021] In the first step, the monocular camera captures an RGB image as the input for depth estimation.

[0022] In the second step, a convolutional neural network for monocular depth estimation is constructed and trained on a public data set. The RGB image is input to the trained network, which produces depth prediction values and outputs a depth prediction map.
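The patent text does not disclose the network architecture. As an illustration only, the basic building block of any such network is the 2D convolution; a minimal single-channel, "valid"-padding version in plain Python is sketched below (the function name and padding choice are assumptions, not part of the patent):

```python
def conv2d(image, kernel):
    """Single-channel 'valid' 2D convolution (cross-correlation),
    the elementary operation a depth-estimation CNN is built from.
    image and kernel are lists of lists of floats."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out
```

A real network stacks many such convolutions with learned kernels, nonlinearities, and downsampling/upsampling stages; this sketch only shows the core arithmetic.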

[0023] In the depth estimation process of this step, a convolutional neural network is first constructed to predict the depth information in the monocular RGB image. The training data come from the KITTI public data set, a collection of RGB images with depth annotations. After training is complete, inputting an RGB image yields an RGBD image...
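According to the abstract, the depth prediction map is then upsampled so that it matches the resolution of the RGB image. The patent does not state the interpolation scheme; a minimal bilinear-upsampling sketch in plain Python, assuming an align-corners coordinate mapping, might look like this:

```python
def bilinear_upsample(depth, out_h, out_w):
    """Resize a 2D depth map (list of lists of floats) to
    out_h x out_w with bilinear interpolation, align-corners style."""
    in_h, in_w = len(depth), len(depth[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # Map the output pixel back into input coordinates.
            y = i * (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
            x = j * (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
            y0, x0 = int(y), int(x)
            y1, x1 = min(y0 + 1, in_h - 1), min(x0 + 1, in_w - 1)
            dy, dx = y - y0, x - x0
            # Interpolate horizontally on both rows, then vertically.
            top = depth[y0][x0] * (1 - dx) + depth[y0][x1] * dx
            bot = depth[y1][x0] * (1 - dx) + depth[y1][x1] * dx
            out[i][j] = top * (1 - dy) + bot * dy
    return out
```

For example, upsampling the 2x2 map [[1, 2], [3, 4]] to 3x3 keeps the four corner values and places 2.5 at the center.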



Abstract

The invention relates to computer graphics and provides an illumination estimation method based on a monocular camera. The method comprises the following steps. A monocular camera acquires an RGB image as the input for depth estimation. A convolutional neural network for monocular depth estimation is constructed and trained; the RGB image is input to the trained network to obtain depth prediction values and output a depth prediction map. The depth prediction values are upsampled so that the depth prediction map matches the size of the RGB image, and the upsampled values serve as one input for illumination estimation. The RGB image is transformed into the CIELab color space, and the luminance channel information serves as the other input. Using the luminance channel information and the depth prediction values, the method estimates the spherical harmonic coefficients of the light source information in all directions of the real scene. The method thus obtains light source information in every direction of the real scene with only a monocular camera and effectively improves the fidelity of rendering virtual objects in AR technology.
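The final step above represents the directional light source information as spherical harmonic coefficients. The abstract does not say how many bands are used; the sketch below assumes the common 9-coefficient basis (bands l ≤ 2) from real-time rendering and a simple Monte Carlo projection of directional luminance samples:

```python
import math

def sh_basis(x, y, z):
    """Real spherical harmonic basis for bands l = 0..2,
    evaluated at a unit direction (x, y, z)."""
    return [
        0.282095,                    # Y_0,0
        0.488603 * y,                # Y_1,-1
        0.488603 * z,                # Y_1,0
        0.488603 * x,                # Y_1,1
        1.092548 * x * y,            # Y_2,-2
        1.092548 * y * z,            # Y_2,-1
        0.315392 * (3 * z * z - 1),  # Y_2,0
        1.092548 * x * z,            # Y_2,1
        0.546274 * (x * x - y * y),  # Y_2,2
    ]

def project_sh(samples):
    """Project luminance samples onto 9 SH coefficients.
    samples: list of ((x, y, z), luminance) pairs with unit
    directions spread evenly over the sphere; the solid angle
    4*pi is divided equally among the samples."""
    coeffs = [0.0] * 9
    weight = 4.0 * math.pi / len(samples)
    for (x, y, z), lum in samples:
        for i, basis in enumerate(sh_basis(x, y, z)):
            coeffs[i] += lum * basis * weight
    return coeffs
```

For a constant luminance of 1 over the sphere, only the first coefficient is nonzero (0.282095 · 4π ≈ 3.545), which is a quick sanity check for such a projection.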

Description

Technical field

[0001] The invention relates to the fields of computer graphics and artificial intelligence, and in particular to an illumination estimation method based on a monocular camera.

Background technique

[0002] Existing illumination estimation techniques fall into two main categories. The first obtains additional scene information with the help of external hardware devices such as fisheye cameras and depth cameras in order to estimate the lighting conditions of the real scene. This approach requires external hardware, which increases the cost of illumination estimation; moreover, mobile devices such as phones must remain portable, making it almost impossible to configure such hardware. The second places regular objects with known reflection characteristics in the scene, photographs them, and estimates the lighting information from the lighting effect of those objects in the im...


Application Information

IPC(8): G06T15/00, G06T7/593, G06N3/04
CPC: G06T7/593, G06T15/005, G06T2207/10012, G06N3/045
Inventor: 纪庆革, 林辉, 钟圳伟
Owner: SUN YAT SEN UNIV