Image based outdoor illumination environment reconstruction method

A technology applied to 3D image processing, image enhancement, and image analysis. It addresses the problems of differences in per-frame illumination estimates, discontinuity in the light and shadow effects of virtual objects, and neglect of the correlation between video frames, achieving smooth light and shadow effects in the result.

Publication status: Inactive
Publication date: 2016-01-06
SICHUAN UNIV

AI Technical Summary

Problems solved by technology

Furthermore, from the perspective of video processing, most existing techniques ignore the correlation between video frames. When illumination is estimated frame by frame, the per-frame estimates inevitably differ, so the light and shadow effects of virtual objects rendered into the video scene become discontinuous; that is, the consistency of virtual and real lighting and of virtual and real shadows varies between adjacent frames of the video.



Examples

Embodiment Construction

[0055] The present invention will be further described below in conjunction with specific embodiments.

[0056] A video-based outdoor lighting environment reconstruction method comprises the following steps:

[0057] 1. Extract key frames from the video at equal time intervals. Using the sky, the ground, and vertical surfaces in each key-frame image as separate cues, estimate the probability distribution map of the sun position inferred from each cue. Combine the sun-position probabilities obtained from the sky, the ground, and the vertical surfaces to derive the probability distribution map of the sun position in the key-frame scene, and generate a sparse radiance map of the video scene key frame.
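As a rough illustration of the cue-combination idea in step 1 (not the patent's actual implementation), the Python sketch below fuses per-cue sun-position probability maps into a single distribution. The helper name combine_sun_position_cues, the zenith-by-azimuth bin layout, and the independence assumption behind the element-wise product are all illustrative assumptions.

```python
import numpy as np

def combine_sun_position_cues(p_sky, p_ground, p_vertical, eps=1e-12):
    """Fuse per-cue sun-position probability maps (hypothetical helper).

    Each input is a 2-D array over discretized sun directions
    (zenith bins x azimuth bins). Treating the cues as independent,
    the joint map is their normalized element-wise product.
    """
    joint = p_sky * p_ground * p_vertical
    total = joint.sum()
    if total < eps:                      # cues disagree almost everywhere
        joint = np.ones_like(joint)      # fall back to a uniform prior
        total = joint.sum()
    return joint / total

# Toy usage: 9 zenith bins x 36 azimuth bins per cue.
rng = np.random.default_rng(0)
cues = [rng.random((9, 36)) for _ in range(3)]
cues = [c / c.sum() for c in cues]       # normalize each cue map
p_sun = combine_sun_position_cues(*cues)
zen, azi = np.unravel_index(p_sun.argmax(), p_sun.shape)
print(f"most likely sun direction: zenith bin {zen}, azimuth bin {azi}")
```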

[0058] 2. Through a lighting parameter filtering algorithm, use the illumination estimation results of the video key frames to correct the illumination estimation results of the non-key frames, thereby realizing the fusion of virtual and real lighting in the video scene.
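The patent does not spell out the lighting parameter filtering algorithm, so the sketch below is only a hedged stand-in: it assumes the key-frame estimates are trusted, linearly interpolates them across intermediate frames, and blends each non-key-frame estimate toward that interpolation. The helper name filter_lighting_params and the blend weight are invented for illustration.

```python
import numpy as np

def filter_lighting_params(raw, key_idx, blend=0.7):
    """Hypothetical key-frame-guided smoothing of lighting parameters.

    raw     : (n_frames, n_params) per-frame lighting estimates
              (e.g. sun azimuth, zenith, intensity).
    key_idx : indices of key frames, assumed reliable.
    blend   : assumed weight given to the key-frame interpolation
              when correcting non-key frames.
    """
    raw = np.asarray(raw, dtype=float)
    n_frames, n_params = raw.shape
    frames = np.arange(n_frames)
    # Interpolate each parameter between the key-frame estimates.
    interp = np.column_stack([
        np.interp(frames, key_idx, raw[key_idx, j]) for j in range(n_params)
    ])
    out = blend * interp + (1.0 - blend) * raw   # correct non-key frames
    out[key_idx] = raw[key_idx]                  # keep key frames unchanged
    return out

# Toy usage: 10 frames, 2 lighting parameters, key frames every 3 frames.
rng = np.random.default_rng(1)
estimates = np.cumsum(rng.normal(0, 0.1, size=(10, 2)), axis=0)
smoothed = filter_lighting_params(estimates, key_idx=[0, 3, 6, 9])
print(smoothed.round(3))
```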



Abstract

The invention discloses an image-based outdoor illumination environment reconstruction method. The method extracts key frames from a video at equal time intervals; uses the sky, the ground, and vertical surfaces in the key-frame images as separate cues to estimate the probability distribution map of the sun position inferred from each cue; combines the sun-position probabilities obtained from the sky, the ground, and the vertical surfaces to derive the probability distribution map of the sun position in the key-frame scene; and generates a sparse radiance map of the video scene key frames. An illumination parameter filtering algorithm then uses the illumination estimation results of the key frames to correct the illumination estimation results of the non-key frames, realizing the fusion of virtual and real lighting in the video scene. The method effectively smooths the lighting effects of the resulting virtual-real fusion video.

Description

Technical field

[0001] The invention relates to a video-based outdoor lighting environment reconstruction method used for the virtual-real fusion of video scenes, and belongs to the technical field of image processing and enhancement.

Background

[0002] Virtual reality is a research field that has developed continuously in recent years. Using high technology centered on computer science, it generates a realistic virtual environment that is highly similar to the real environment in sight, hearing, and touch, giving users an immersive experience of that environment. Traditional virtual reality technology mainly emphasizes virtual scene modeling and virtual scene rendering, and seldom integrates the virtual environment directly into the objective real world, which limits the development and application of virtual reality technology to a certain extent. Augmented realit...

Claims


Application Information

IPC(8): G06T5/00, G06T7/00, G06T7/11, G06T15/60
Inventor: 刘艳丽
Owner: SICHUAN UNIV