
Image defogging method based on global and local feature fusion

A technique involving local features and feature maps, applied in the fields of image processing and deep learning. It solves the problems of artifacts and color distortion at object edges, achieving the effects of avoiding color distortion, producing artifact-free results, and restoring lost image detail.

Active Publication Date: 2019-12-06
TIANJIN UNIV
Cites: 5 · Cited by: 15

AI Technical Summary

Problems solved by technology

Most existing single-image dehazing methods rely on simplified atmospheric light scattering models, but hazy images captured in real scenes usually do not strictly follow the physical model or hand-crafted prior information, so effects such as color distortion are easily introduced in the process. For example, the defogged image may show an obvious halo in sky regions, and artifacts appear at object edges under dense fog.
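For context, the "simplified atmospheric light scattering model" that most prior single-image methods rely on is, in its standard formulation (not stated explicitly in this document, but the usual assumption in the dehazing literature), the Koschmieder model:

```latex
I(x) = J(x)\,t(x) + A\bigl(1 - t(x)\bigr), \qquad t(x) = e^{-\beta d(x)}
```

where $I$ is the observed hazy image, $J$ the haze-free scene radiance, $A$ the global atmospheric light, $t$ the transmission, $\beta$ the scattering coefficient and $d$ the scene depth. Prior-based methods must estimate $t$ and $A$ from a single image, which is where halos and edge artifacts are typically introduced; the method described below avoids this estimation entirely.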



Examples


Embodiment 1

[0039] In order to achieve a realistic image defogging effect, an embodiment of the present invention proposes an image defogging method based on the fusion of global and local features; see Figure 1 and the description below:

[0040] 101: Preprocessing the images in the training set;

[0041] 102: Construct a dehazing network based on the encoder-decoder architecture, set multiple densely connected units between the encoder and the decoder, and use the densely connected units to realize local and global fusion of feature maps;

[0042] 103: The feature map output by the encoder-decoder architecture passes through the subsequent convolutional neural network to obtain a dehazing map;

[0043] 104: Use the linear combination of L1 norm loss function, perceptual loss function and gradient loss function to train the dehazing network;

[0044] 105: After training, input a haze image to get a dehazed image.
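Step 102 places densely connected units between the encoder and decoder so that feature maps are fused both locally and globally. The sketch below illustrates only the dense-connectivity pattern (each unit receives the concatenation of the block input and all previous unit outputs, as in DenseNet); the actual unit count, kernel sizes and growth rate are not disclosed in this excerpt, and the 1×1 channel mixing stands in for full convolutions:

```python
import numpy as np

rng = np.random.default_rng(0)

def unit(x, out_ch):
    """Stand-in for one densely connected unit: a 1x1 convolution
    (channel mixing via einsum) followed by ReLU. Weights are random
    here; in the real network they would be learned."""
    w = rng.standard_normal((out_ch, x.shape[0])) * 0.1
    y = np.einsum('oc,chw->ohw', w, x)   # mix channels at every pixel
    return np.maximum(y, 0.0)

def dense_block(x, n_units=3, growth=8):
    """Each unit sees the concatenation of the block input and all
    earlier unit outputs, fusing shallow (local) and deep (global)
    features along the channel axis."""
    feats = [x]
    for _ in range(n_units):
        y = unit(np.concatenate(feats, axis=0), growth)
        feats.append(y)
    return np.concatenate(feats, axis=0)

x = np.zeros((16, 8, 8))                 # (channels, H, W) feature map
out = dense_block(x, n_units=3, growth=8)
print(out.shape)                         # → (40, 8, 8): 16 + 3*8 channels
```

The key property is that the output channel count grows by `growth` per unit while every earlier feature map remains directly accessible, which is what enables the local/global fusion the patent describes.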

[0045] Wherein, the specific steps of preprocessing the training set images ...

Embodiment 2

[0076] The scheme of Embodiment 1 is described in detail below in conjunction with specific drawings and calculation formulas:

[0077] 201: Preprocessing the images in the training set;

[0078] 202: Construct a dehazing network based on the encoder-decoder architecture, set multiple densely connected units between the encoder and the decoder, and realize the local and global fusion of the feature map by the densely connected units;

[0079] 203: The feature map output by the encoder-decoder passes through the subsequent convolutional neural network to obtain a dehazing map;

[0080] 204: Use the linear combination of L1 norm loss function, perceptual loss function and gradient loss function to train the dehazing network;

[0081] 205: After training, input a haze image to obtain a dehazed image.
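The composite loss of step 204 can be sketched as follows. The combination weights and the feature extractor `phi` used by the perceptual loss (typically activations of a pretrained CNN such as VGG) are assumptions for illustration; the excerpt does not disclose their values:

```python
import numpy as np

def l1_loss(pred, target):
    """Mean absolute pixel error (the L1 norm term)."""
    return np.mean(np.abs(pred - target))

def gradient_loss(pred, target):
    """L1 distance between horizontal and vertical image gradients;
    this term encourages sharp, artifact-free edges."""
    dx = lambda im: im[:, 1:] - im[:, :-1]
    dy = lambda im: im[1:, :] - im[:-1, :]
    return (np.mean(np.abs(dx(pred) - dx(target)))
            + np.mean(np.abs(dy(pred) - dy(target))))

def perceptual_loss(pred, target, phi):
    """Distance in a feature space; phi would be a pretrained CNN's
    feature extractor in practice (hypothetical here)."""
    return np.mean((phi(pred) - phi(target)) ** 2)

def total_loss(pred, target, phi, w=(1.0, 0.5, 0.5)):
    # w holds hypothetical weights; the patent excerpt does not give them
    return (w[0] * l1_loss(pred, target)
            + w[1] * perceptual_loss(pred, target, phi)
            + w[2] * gradient_loss(pred, target))

pred, target = np.ones((4, 4)), np.zeros((4, 4))
identity_phi = lambda im: im             # trivial stand-in for a CNN
print(total_loss(pred, target, identity_phi))   # → 1.5
```

For constant images the gradient term vanishes, so the printed value is 1.0·1 (L1) + 0.5·1 (perceptual with identity features) + 0.5·0 = 1.5.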

[0082] Wherein, the specific steps of preprocessing the training set images in step 201 are:

[0083] 1) The size of the pictures in ...

Embodiment 3

[0098] The feasibility of the schemes in Embodiments 1 and 2 is verified with experimental data below; see the following description for details:

[0099] Three outdoor foggy images are selected and defogged with the method of the present invention; Figures 6, 7 and 8 show the foggy images and the corresponding dehazed images, respectively.

[0100] The dehazing results show that details covered by fog in the original images are effectively restored; for example, the windows of distant tall buildings become clearer after dehazing (see Figure 6). In addition, the brightness of the sky region in the defogged image varies naturally, with no halos or imbalances of brightness and contrast.



Abstract

The invention discloses an image defogging method based on global and local feature fusion. The method comprises the following steps: constructing a defogging network based on an encoder-decoder architecture, setting a plurality of dense connection units between the encoder and the decoder, and achieving local and global fusion of feature maps through the dense connection units; passing the feature map output by the encoder-decoder architecture through a subsequent convolutional neural network to obtain a defogged map; training the defogging network with a linear combination of the L1 norm loss function, the perceptual loss function and the gradient loss function; and, after training, inputting a haze image to obtain a defogged image. With this method, the defogged image can be obtained directly from a single haze image, without prior information about the image or estimation of the transmission rate.

Description

technical field

[0001] The invention relates to the fields of image processing and deep learning, and in particular to an image defogging method based on the fusion of global and local features.

Background technique

[0002] In foggy weather, a large number of particles are suspended in the air. Under their influence, atmospheric ambient light and the light reflected by the actual scene are scattered and attenuated during propagation, which degrades image quality and causes color distortion, reduced contrast and other problems. By eliminating the influence of haze, a dehazing algorithm can not only improve the subjective visual quality of an image, but also serve as a preprocessing step for many computer vision tasks, such as autonomous driving, object detection and image classification, thereby improving the performance of computer vision systems. Therefore, the image defoggi...

Claims


Application Information

IPC(8): G06T5/00
CPC: G06T2207/20081; G06T2207/20084; G06T2207/20221; G06T5/77; Y02T10/40
Inventors: 李岳楠, 吴帅
Owner TIANJIN UNIV