
Multi-exposure image fusion method based on depth perception enhancement

An image fusion and depth-perception enhancement technology, applied in the field of HDR image generation and image fusion, achieving rich detail information, good global structural consistency, and good computational efficiency and scalability

Active Publication Date: 2021-12-21
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0006] In addition, existing multi-exposure fusion schemes often first transfer the image to a color space in which luminance and chrominance are separated, and design the fusion strategy only in the luminance channel.
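A minimal NumPy sketch of the conventional luminance-only pipeline described above (the conversion, weighting, and chroma handling are illustrative assumptions, not this patent's method):

```python
import numpy as np

def rgb_to_ycbcr(img):
    """BT.601 full-range RGB -> YCbCr conversion (img values in [0, 1])."""
    m = np.array([[ 0.299,     0.587,     0.114   ],
                  [-0.168736, -0.331264,  0.5     ],
                  [ 0.5,      -0.418688, -0.081312]])
    ycbcr = img @ m.T
    ycbcr[..., 1:] += 0.5  # shift chroma channels to [0, 1]
    return ycbcr

def fuse_luma_only(exposures):
    """Fuse a list of RGB exposures by weighting only the luma channel;
    chroma is copied from a single reference image (a common shortcut
    that motivates the color-enhancement approach in this patent)."""
    ycbcr = [rgb_to_ycbcr(e) for e in exposures]
    lumas = np.stack([y[..., 0] for y in ycbcr])           # (N, H, W)
    # Well-exposedness weight: prefer mid-tone luma values.
    w = np.exp(-((lumas - 0.5) ** 2) / (2 * 0.2 ** 2))
    w /= w.sum(axis=0, keepdims=True) + 1e-8
    fused_y = (w * lumas).sum(axis=0)
    chroma = ycbcr[len(ycbcr) // 2][..., 1:]               # reference chroma
    return np.concatenate([fused_y[..., None], chroma], axis=-1)
```

Because chroma never participates in the fusion, color in badly exposed regions is simply inherited from one input, which is the shortcoming this invention targets.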



Examples


Embodiment Construction

[0076] The present invention is further described below with reference to the accompanying drawings and specific embodiments; the following embodiments do not limit the present invention in any way.

[0077] The multi-exposure image fusion method based on depth perception enhancement proposed by the present invention, as shown in Figure 1, includes the following steps:

[0078] Step 1. For the two source images I1 and I2, design a bidirectional detail-enhancement method and apply it to each image, obtaining multiple detail-enhanced reference images that together form a detail-enhanced reference image sequence, as shown in Figure 2. The process is as follows:

[0079] 1.1 Forward enhancement: decompose the source image I as I = R ∘ E, where R and E denote the scene-detail component and the exposure component, respectively, and the operator ∘ denotes element-wise multiplication. A simple transformation yields R = I ∘ E⁻¹, where E⁻¹ is obtained by taking the element-wise reciprocal of E....
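A minimal sketch of the decomposition above, I = R ∘ E with R = I ∘ E⁻¹. The patent does not specify here how E is estimated; a box-blurred copy of the image is used below as an illustrative stand-in illumination estimator, not the patent's estimator:

```python
import numpy as np

def box_blur(I, k=9):
    """Crude separable box blur (edge-padded); illustrative stand-in
    for the exposure-component estimator."""
    pad = k // 2
    P = np.pad(I, pad, mode="edge")
    kernel = np.ones(k) / k
    # Blur rows, then columns.
    P = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, P)
    P = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, P)
    return P

def forward_enhance(I, eps=1e-4):
    """Split grayscale image I (float array in [0, 1]) into an exposure
    component E and a scene-detail component R = I ∘ E⁻¹."""
    E = np.clip(box_blur(I), eps, 1.0)  # clip to avoid division by zero
    R = I * (1.0 / E)                   # element-wise: R = I ∘ E⁻¹
    return R, E
```

Note that for a uniformly exposed region E ≈ I, so R ≈ 1 and the detail component carries only local structure.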



Abstract

The invention discloses a multi-exposure image fusion method based on depth perception enhancement. First, a bidirectional contrast-enhancement strategy is proposed: each source image is enhanced to obtain a series of detail-enhanced reference images. A detail-enhancement network (DEM) is then constructed, with a loss function that weights each reference image according to its information richness to form the optimization target; training this network yields a fused luminance containing rich detail. A color-enhancement network (CEM) is also constructed, with a loss function that lets it learn the mapping from luminance to color in different scenes, thereby rendering color for the fused luminance. The two networks generate the luminance and color components, respectively, and together complete the generation of the fused image. Compared with existing methods, the multi-exposure fusion image obtained in this way not only has rich detail information and good global structural consistency, but also more realistic and vivid color, markedly improving visual quality.
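A hedged sketch of the abstract's "weight each reference image by information richness" idea. Here richness is approximated by mean gradient magnitude and the weights are softmax-normalized; both choices are illustrative assumptions, since the abstract does not define the richness measure:

```python
import numpy as np

def richness(img):
    """Information-richness score: mean gradient magnitude (illustrative)."""
    gy, gx = np.gradient(img)
    return np.sqrt(gx ** 2 + gy ** 2).mean()

def weighted_target(refs):
    """Blend a sequence of detail-enhanced reference images into a single
    optimization target, weighting each by its richness score."""
    scores = np.array([richness(r) for r in refs])
    w = np.exp(scores) / np.exp(scores).sum()  # softmax normalization
    return sum(wi * r for wi, r in zip(w, refs))
```

Under this scheme, a reference image with more local structure contributes more to the target the detail-enhancement network is trained toward.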

Description

technical field [0001] The invention relates to a computer image processing method, and in particular to an image fusion and HDR image generation method. Background technique [0002] When taking pictures with a mobile phone or camera, we often encounter a situation in which, no matter how exposure time, aperture size, and other parameters are adjusted, it is difficult to capture an optimally exposed image. Even if the overall image is well exposed, some areas will always be over- or under-exposed. This is because the dynamic range of natural scenes often far exceeds the dynamic range that consumer-grade cameras can record, so a single low dynamic range (LDR) image records only a limited dynamic range and cannot fully restore the scene's information. To fully record the scene and obtain high-quality images, high dynamic range (HDR) imaging technology came into being. The realization of high dynamic range imaging technol...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50; G06T5/00; G06N3/04; G06N3/08
CPC: G06T5/50; G06N3/08; G06N3/088; G06T2207/10004; G06T2207/20208; G06T2207/20221; G06N3/045; G06T5/90; Y02T10/40
Inventor: 郭晓杰, 韩东
Owner TIANJIN UNIV