
Two-channel exposure fusion network model and method for low-light image enhancement

An image enhancement and fusion network technology, applied in image enhancement, biological neural network models, image analysis, etc.; it addresses the problems of unnatural enhancement results, limited generalization ability, and oversaturation, and achieves a good noise-suppression effect.

Active Publication Date: 2022-07-19
SHANXI UNIV

AI Technical Summary

Problems solved by technology

For example, a method trained on a particular dataset is likely to favor a certain range of luminances and scenes, and thus have limited generalization ability.
Although GANs can be trained on unpaired datasets, they sometimes generate inappropriate colors, making the enhancement results unnatural or oversaturated.
Furthermore, direct image-to-image translation is considerably harder to implement than enhancement strategies that incorporate physical priors, especially when noise under low-light conditions must also be taken into account.

Method used



Examples


Example 1

[0046] In order to solve the problems of existing methods, this embodiment proposes a dual-exposure fusion network model for low-light image enhancement based on a convolutional neural network. The model is inspired by the human creative process: good work draws empirical guidance (fusion) from multiple attempts (generation). This embodiment holds that the same generate-and-fuse strategy can be adopted for the enhancement of low-illumination images. Since brightness is one of the most important and most easily changed factors in the imaging process, the enhancement strategy of this embodiment focuses on how to perform the desired enhancement for images with various unknown illumination levels.

[0047] To this end, in the image generation stage, the method proposed in this embodiment uses a two-branch structure, applying different enhancement strategies to handle the extremely-low and low illumination con...
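As a rough illustration only (the patent's actual branches are convolutional networks whose operators are not reproduced in this text), the two generation branches can be thought of as applying two different brightening curves; the gamma values below are hypothetical stand-ins for the "different enhancement strategies":

```python
import numpy as np

def branch_enhance(img: np.ndarray, gamma: float) -> np.ndarray:
    """One generation branch: a simple gamma brightening curve.

    Hypothetical stand-in for a learned enhancement branch; a smaller
    gamma brightens dark pixels more aggressively.
    """
    img = np.clip(img, 0.0, 1.0)
    return img ** gamma

# A uniformly dark test image with intensities in [0, 1].
low_light = np.full((4, 4), 0.1)

# Two "attempts": an aggressive branch for extremely low illumination
# and a mild branch for moderately low illumination.
aggressive = branch_enhance(low_light, gamma=0.25)
mild = branch_enhance(low_light, gamma=0.6)
```

Both branches brighten the input, but to different degrees; the fusion stage then decides, per pixel, how much to trust each attempt.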

Example 2

[0177] This embodiment provides a dual-exposure fusion method for low-illumination image enhancement, which can be implemented by an electronic device. The execution flow of the method, shown in Figure 4, includes the following steps:

[0178] S101, processing the low-illumination image to be enhanced with different preset enhancement strategies, so as to obtain two enhancement results corresponding to the low-illumination image to be enhanced;

[0179] S102, performing a weighted fusion of the two enhancement results obtained with the different enhancement strategies, so as to obtain the enhanced image corresponding to the low-illumination image to be enhanced;

[0180] Wherein, in the above S101, the image to be enhanced is enhanced using the following formula:

[0181]

[0182] where represents the enhanced image, represents the input image to be enhanced, represents element-wise dot ...
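The formula itself and its symbols are not reproduced in this text, but the general form alluded to in [0180]–[0182] — an enhancement obtained as an element-wise (Hadamard) product of a per-pixel weight map with the input — can be sketched as follows. The weight map `w` here is a hypothetical illumination-dependent gain, not the patent's learned quantity:

```python
import numpy as np

def elementwise_enhance(img: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Enhanced image as an element-wise product of a per-pixel weight
    map with the input, clipped back to the valid intensity range."""
    assert img.shape == w.shape
    return np.clip(img * w, 0.0, 1.0)

img = np.array([[0.1, 0.2],
                [0.4, 0.8]])

# Hypothetical weight map: boost dark pixels more than bright ones
# (here, w = 1/sqrt(I), so I * w = sqrt(I)).
w = 1.0 / np.sqrt(np.clip(img, 1e-3, 1.0))

enhanced = elementwise_enhance(img, w)
```

Any per-pixel gain of this shape preserves image structure while redistributing brightness, which is why the element-wise form lends itself to the physical interpretation the patent claims.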

Example 3

[0191] This embodiment provides an electronic device, which includes a processor and a memory; wherein, at least one instruction is stored in the memory, and the instruction is loaded and executed by the processor to implement the method of the second embodiment.

[0192] The electronic device may vary greatly with configuration or performance, and may include one or more processors (central processing units, CPUs) and one or more memories, wherein the memory stores at least one instruction, and the instruction is loaded by the processor to perform the following steps:

[0193] S101, processing the low-illumination image to be enhanced with different preset enhancement strategies, so as to obtain two enhancement results corresponding to the low-illumination image to be enhanced;

[0194] S102, performing a weighted fusion of the two enhancement results corresponding to the low-illumination image to be enhanced, obtained by adopting differ...



Abstract

The invention discloses a dual-channel exposure fusion network model and method for low-illumination image enhancement. The network model includes a first enhancement branch, a second enhancement branch, and a fusion module. The first and second enhancement branches process the low-illumination image to be enhanced with different preset enhancement strategies; the fusion module, which includes an attention unit and a fine-tuning unit, performs a weighted fusion of the enhancement results of the two branches to obtain the enhanced image. In addition, the invention provides a two-step denoising strategy to efficiently and adaptively suppress noise during enhancement. The proposed model has a clear physical interpretation and is lightweight and efficient. It can be used for low-light enhancement tasks in various low-light environments, especially where the imaging environment changes and real-time processing requirements are high.
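A minimal sketch of the fusion step described above — an attention unit producing per-pixel weights over the two branch outputs, followed by a fine-tuning correction — under placeholder assumptions: the attention here is a per-pixel softmax over a hand-crafted "well-exposedness" score, and the fine-tuning unit is reduced to a range clamp; the patent's actual units are learned convolutional modules not specified in this text:

```python
import numpy as np

def softmax2(a: np.ndarray, b: np.ndarray):
    """Per-pixel softmax over two score maps (stand-in attention unit)."""
    m = np.maximum(a, b)                  # subtract max for stability
    ea, eb = np.exp(a - m), np.exp(b - m)
    return ea / (ea + eb), eb / (ea + eb)

def fuse(y1: np.ndarray, y2: np.ndarray) -> np.ndarray:
    """Weighted fusion of two branch outputs.

    Scores favour the branch whose output is closer to mid-gray (0.5),
    a hypothetical well-exposedness cue; the real attention unit learns
    its scores from data.
    """
    s1 = -np.abs(y1 - 0.5)
    s2 = -np.abs(y2 - 0.5)
    w1, w2 = softmax2(s1, s2)
    fused = w1 * y1 + w2 * y2
    # Fine-tuning unit stand-in: clamp to the valid intensity range.
    return np.clip(fused, 0.0, 1.0)

y1 = np.array([[0.9, 0.6]])   # output of the aggressive branch
y2 = np.array([[0.4, 0.3]])   # output of the mild branch
fused = fuse(y1, y2)
```

Because the weights are a convex combination per pixel, the fused result always lies between the two branch outputs, which is what keeps the fusion from introducing colors or brightness neither branch produced.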

Description

Technical field

[0001] The invention relates to the technical field of low-illumination image enhancement, and in particular to a dual-channel exposure fusion network model and method for low-illumination image enhancement.

Background technique

[0002] Compared to well-lit imaging conditions, images captured in low light tend to have lower contrast and dynamic range, with darker areas, unpredictable noise, and blurred details. Low-light photography typically occurs when the environment is relatively dark (e.g., at night or with limited lighting), or may be caused by the photographer failing to adjust the imaging equipment properly (e.g., an inappropriate sensitivity, aperture size, or exposure-time setting). Since the enhancement of low-light images is of great significance in low-level and high-level semantic image processing tasks such as nighttime imaging, surveillance and security equipment, and driverless technology, how to effectively enhance low-light images has in recent years ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T5/00; G06N3/04; G06N3/08
CPC: G06N3/08; G06T2207/10016; G06T2207/10024; G06T2207/20081; G06T2207/20084; G06N3/045; G06T5/70
Inventors: 卢锟, 张丽红
Owner: SHANXI UNIV