
Two-way exposure fusion network model and method for low-illumination image enhancement

A technology integrating neural networks and image enhancement, applied in image enhancement, biological neural network models, image analysis, etc. It addresses the problems of unnatural enhancement results, inappropriate colors, and difficulty of implementation, and achieves good noise suppression.

Active Publication Date: 2020-11-13
SHANXI UNIV

AI Technical Summary

Problems solved by technology

For example, a method trained on a particular dataset is likely to favor a certain range of luminances and scenes, and is therefore limited in its ability to generalize.
Although GANs can be trained with unpaired datasets, they sometimes generate inappropriate colors, making the enhancement results unnatural or oversaturated.
Furthermore, direct image-to-image translation is harder to implement than enhancement strategies that incorporate physical priors, especially when noise under low-light conditions is also taken into account.


Examples


Example 1

[0046] To address the problems of existing methods, this embodiment proposes a convolutional-neural-network-based two-way exposure fusion model for low-light image enhancement. The network model takes inspiration from the human creative process: a good result draws empirical guidance (fusion) from multiple attempts (generation). This embodiment applies the same generate-then-fuse strategy to the enhancement of low-illuminance images. Since brightness is one of the most important and most variable factors in the imaging process, the enhancement strategy of this embodiment focuses on how to produce the desired enhancement for images with various unknown illuminance levels.

[0047] For this reason, in the image generation stage, the method proposed in this embodiment adopts a two-branch structure and applies different enhancement strategies to handle the two cases of extremely low and low illuminance...
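The following is a minimal PyTorch sketch of the two-branch, generate-then-fuse topology described above, assuming a small convolutional branch per illuminance case, an attention-style weight map for the fusion, and a single refinement layer. The class names (EnhanceBranch, TwoWayExposureFusion), layer sizes and depths are illustrative placeholders, not the patent's actual configuration.

```python
# Minimal sketch of a generate-then-fuse, two-branch enhancement model.
# Layer sizes, branch depths, and module names are illustrative assumptions,
# not the patent's actual configuration.
import torch
import torch.nn as nn

class EnhanceBranch(nn.Module):
    """One enhancement branch: predicts a per-pixel enhancement result."""
    def __init__(self, channels=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.body(x)

class TwoWayExposureFusion(nn.Module):
    """Two branches handle different illuminance ranges; a fusion module
    predicts a per-pixel weight map and blends the two intermediate results."""
    def __init__(self):
        super().__init__()
        self.branch_extreme = EnhanceBranch()   # for extremely low illuminance
        self.branch_low = EnhanceBranch()       # for ordinary low illuminance
        self.attention = nn.Sequential(         # weight map w in [0, 1]
            nn.Conv2d(6, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
        )
        self.refine = nn.Conv2d(3, 3, 3, padding=1)  # fine-adjustment unit

    def forward(self, x):
        y1 = self.branch_extreme(x)
        y2 = self.branch_low(x)
        w = self.attention(torch.cat([y1, y2], dim=1))
        fused = w * y1 + (1.0 - w) * y2              # weighted fusion
        return torch.clamp(fused + self.refine(fused), 0.0, 1.0)

# Usage: out = TwoWayExposureFusion()(torch.rand(1, 3, 64, 64))
```

A branch could just as well estimate a per-pixel transfer map instead of the enhanced image directly; the sketch only fixes the overall two-branch-plus-fusion topology.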

Example 2

[0177] This embodiment provides a two-way exposure fusion method for low-illuminance image enhancement, which can be implemented by an electronic device. The execution flow of this method, shown in Figure 4, includes the following steps:

[0178] S101. Process the low-illuminance image to be enhanced using different preset enhancement strategies, so as to obtain two-way enhancement results corresponding to the low-illuminance image to be enhanced;

[0179] S102. Perform weighted fusion of the two-way enhancement results obtained with the different enhancement strategies, so as to obtain an enhanced image corresponding to the low-illuminance image to be enhanced;

[0180] Wherein, in the above S101, the image to be enhanced is enhanced by using the following formula:

[0181]

[0182] where the respective symbols represent the enhanced image, the input image to be enhanced, the element-wise (Hadamard) product, and ...
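The formula of [0181] is not reproduced in this extract. As a rough, assumed illustration of an element-wise (Hadamard-product) enhancement of the kind the symbol definitions suggest, followed by the weighted fusion of S102, the sketch below uses a placeholder transfer map t and weight map w standing in for quantities the network would estimate; it is not the patent's actual formula.

```python
# Illustrative sketch only: an assumed per-pixel enhancement by element-wise
# (Hadamard) product, followed by the weighted fusion of step S102. The actual
# formula and the estimation of the maps are defined in the patent, not here.
import numpy as np

def enhance(x: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply an estimated per-pixel transfer map t to input image x (values in [0, 1])."""
    return np.clip(x * t, 0.0, 1.0)           # element-wise product

def fuse(y1: np.ndarray, y2: np.ndarray, w: np.ndarray) -> np.ndarray:
    """S102: weighted fusion of the two enhancement results with weight map w in [0, 1]."""
    return np.clip(w * y1 + (1.0 - w) * y2, 0.0, 1.0)

# Usage with random placeholders standing in for network outputs:
x = np.random.rand(64, 64, 3)
y1, y2 = enhance(x, np.full_like(x, 1.8)), enhance(x, np.full_like(x, 3.0))
out = fuse(y1, y2, np.random.rand(64, 64, 1))
```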

Example 3

[0191] This embodiment provides an electronic device, which includes a processor and a memory; at least one instruction is stored in the memory, and the instruction is loaded and executed by the processor, so as to implement the method of the second embodiment.

[0192] The electronic device may vary considerably in configuration and performance, and may include one or more processors (central processing units, CPUs) and one or more memories, wherein at least one instruction is stored in the memory; the instruction is loaded by the processor and performs the following steps:

[0193] S101. Process the low-illuminance image to be enhanced using different preset enhancement strategies, so as to obtain two-way enhancement results corresponding to the low-illuminance image to be enhanced;

[0194] S102. Perform weighted fusion of the two-way enhancement results obtained with the different enhancement strategies, so as to obtain an enhanced image corresponding to the low-illuminance image to be enhanced.
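As a rough sketch of how such an electronic device might load stored weights and execute steps S101-S102, the snippet below reuses the illustrative TwoWayExposureFusion class from the sketch after Example 1; the weights file name is a hypothetical placeholder.

```python
# Minimal inference sketch for the electronic-device embodiment. Reuses the
# illustrative TwoWayExposureFusion class defined in the earlier sketch; the
# weights path is a hypothetical placeholder, not from the patent.
import os
import torch

model = TwoWayExposureFusion()
if os.path.exists("weights.pth"):               # placeholder path for stored weights
    model.load_state_dict(torch.load("weights.pth", map_location="cpu"))
model.eval()

with torch.no_grad():
    frame = torch.rand(1, 3, 256, 256)          # stand-in for a captured low-light frame
    enhanced = model(frame)                     # S101 (two-way enhancement) + S102 (fusion)
```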



Abstract

The invention discloses a two-way (dual-branch) exposure fusion network model and method for low-illumination image enhancement. The network model comprises a first enhancement branch, a second enhancement branch and a fusion module. The first enhancement branch and the second enhancement branch process a to-be-enhanced low-illumination image using different preset enhancement strategies; the fusion module comprises an attention unit and a fine-adjustment unit, and performs weighted fusion of the enhancement results obtained by the two branches to obtain the enhanced image. In addition, the invention provides a two-step denoising strategy for efficient, adaptive suppression of noise during enhancement. The proposed model has a clear physical interpretation and is lightweight and effective. The network model can be used for low-illumination enhancement tasks in various low-light environments, especially when imaging conditions vary and real-time processing is required.

Description

Technical Field

[0001] The invention relates to the technical field of low-illuminance image enhancement, and in particular to a two-way exposure fusion network model and method for low-illuminance image enhancement.

Background Technique

[0002] Compared with imaging under sufficient light, images captured in low light are often accompanied by dark areas, unpredictable noise and blurred details, with lower contrast and dynamic range. Low-light shots typically occur when the environment is relatively dark (e.g., at night or when lighting is limited), or when the photographer fails to properly adjust the imaging device (e.g., inappropriate ISO, aperture size, or exposure time). Since the enhancement of low-light images is of great significance in low-level and high-level semantic image processing tasks such as night imaging, surveillance and security equipment, and autonomous driving, how to effectively enhance low-light images has become...


Application Information

IPC(8): G06T5/00; G06N3/04; G06N3/08
CPC: G06N3/08; G06T2207/10016; G06T2207/10024; G06T2207/20081; G06T2207/20084; G06N3/045; G06T5/70
Inventors: 卢锟 (Kun Lu), 张丽红 (Lihong Zhang)
Owner: SHANXI UNIV