Underwater image enhancement method based on multi-residual joint learning

An underwater image enhancement and residual learning technology, applied in the field of deep learning, which solves the problem that existing methods rely on assumptions and priors not fully adapted to the underwater environment, and achieves the effects of correcting severe color cast and improving image quality.

Pending Publication Date: 2021-01-29
HANGZHOU NORMAL UNIVERSITY

AI Technical Summary

Problems solved by technology

Existing methods rely on assumptions and prior knowledge that are not fully adapted to the underwater environment.

Examples

Embodiment 1

[0033] In this embodiment, the underwater image enhancement method based on multi-residual joint learning includes the following steps:

[0034] S100, randomly cropping the pictures of different resolutions in the underwater image data set, which contains degraded images and their corresponding reference images, into images of the same resolution, and establishing a training set for the underwater image enhancement model (see the cropping sketch after step S400);

[0035] S200, processing the cropped degraded images in the training set with multiple preprocessing methods, each preprocessing method producing a corresponding preprocessed image (see the preprocessing sketch after step S400);

[0036] S300, using the reference image as the label of the degraded image, and inputting the original degraded image together with the preprocessed degraded images into a multi-branch convolutional neural network with multi-residual joint learning for training, to obtain an image enhancement model (see the network and training sketch after step S400);

[0037] S400, inputting the image to be enhanced into the image enhancement model to obtain a processed enhanced image.
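The following is a minimal sketch of the paired random cropping in step S100. The 256×256 crop size follows Embodiment 2; the NumPy arrays, helper name, and interface are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of step S100: crop a degraded image and its reference image
# at the same random location so every training sample has the same resolution.
import random
import numpy as np

def random_paired_crop(degraded: np.ndarray, reference: np.ndarray, crop_size: int = 256):
    """Return a 256x256 degraded/reference patch pair cropped at one random location."""
    h, w = degraded.shape[:2]
    assert reference.shape[:2] == (h, w), "degraded and reference images must be aligned"
    top = random.randint(0, h - crop_size)
    left = random.randint(0, w - crop_size)
    deg_patch = degraded[top:top + crop_size, left:left + crop_size]
    ref_patch = reference[top:top + crop_size, left:left + crop_size]
    return deg_patch, ref_patch
```

Repeating this crop over every image pair in the data set yields the training set required by step S100.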
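Step S200 does not name the preprocessing methods in this excerpt, so the sketch below uses three common underwater preprocessing operations (gray-world white balance, CLAHE, and gamma correction) purely as illustrative stand-ins; each method produces its own preprocessed image, as the step requires.

```python
# Hedged sketch of step S200: the specific methods, parameters, and OpenCV
# usage are assumptions for illustration, not the patent's prescribed set.
import cv2
import numpy as np

def gray_world_white_balance(img: np.ndarray) -> np.ndarray:
    """Simple gray-world white balance on a BGR uint8 image."""
    out = img.astype(np.float32)
    means = out.reshape(-1, 3).mean(axis=0)
    out *= means.mean() / (means + 1e-6)
    return np.clip(out, 0, 255).astype(np.uint8)

def clahe_equalize(img: np.ndarray) -> np.ndarray:
    """Contrast-limited adaptive histogram equalization on the L channel."""
    lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
    lab[..., 0] = cv2.createCLAHE(clipLimit=2.0).apply(lab[..., 0])
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

def gamma_correct(img: np.ndarray, gamma: float = 0.7) -> np.ndarray:
    """Brighten dark regions with a fixed gamma curve."""
    table = ((np.arange(256) / 255.0) ** gamma * 255).astype(np.uint8)
    return cv2.LUT(img, table)

def preprocess_all(degraded: np.ndarray) -> list:
    """One preprocessed image per method, as step S200 requires."""
    return [gray_world_white_balance(degraded),
            clahe_equalize(degraded),
            gamma_correct(degraded)]
```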
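For steps S300 and S400, the sketch below shows one possible multi-branch network in PyTorch: each branch processes one view of the degraded image (the original plus its preprocessed versions), the branch features are fused, and a residual is added back to the original input. The branch widths, fusion layout, number of views, plain L1 loss, and the `enhance` helper are all assumptions, since the excerpt only specifies the multi-branch, multi-residual structure and the reference image as the label.

```python
# Hedged sketch of steps S300/S400: an assumed multi-branch residual network
# and training step, not the patent's exact architecture or loss.
import torch
import torch.nn as nn

class Branch(nn.Module):
    """One branch: a few conv layers that extract features from one input view."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1))

    def forward(self, x):
        return self.body(x)

class MultiBranchEnhancer(nn.Module):
    """Fuse features from the raw degraded image and each preprocessed view,
    then predict a residual that is added back to the raw input."""
    def __init__(self, num_inputs: int, channels: int = 32):
        super().__init__()
        self.branches = nn.ModuleList(Branch(channels) for _ in range(num_inputs))
        self.fuse = nn.Sequential(
            nn.Conv2d(channels * num_inputs, channels, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1))

    def forward(self, views):                     # views: list of (B, 3, H, W) tensors
        feats = [branch(v) for branch, v in zip(self.branches, views)]
        residual = self.fuse(torch.cat(feats, dim=1))
        return torch.clamp(views[0] + residual, 0.0, 1.0)

# Step S300 training step with the reference image as label (assumed L1 loss).
model = MultiBranchEnhancer(num_inputs=4)         # raw image + 3 preprocessed views
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.L1Loss()

def train_step(views, reference):
    optimizer.zero_grad()
    loss = criterion(model(views), reference)
    loss.backward()
    optimizer.step()
    return loss.item()

# Step S400: run the trained model on one image to obtain the enhanced result.
@torch.no_grad()
def enhance(model, views_np):
    """`views_np` is the raw degraded image plus its preprocessed versions,
    each an HWC float32 array in [0, 1]."""
    tensors = [torch.from_numpy(v).permute(2, 0, 1).unsqueeze(0) for v in views_np]
    return model(tensors).squeeze(0).permute(1, 2, 0).numpy()
```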

Embodiment 2

[0073] This embodiment is the same as Embodiment 1 except for the structure of the convolutional neural network, which is described below; the identical parts are not repeated here.

[0074] In step S300 of this embodiment, a generative adversarial network model with multi-residual joint learning is first designed to enhance the underwater image and remove its bluish-green color cast. The generator of the multi-residual joint learning generative adversarial network includes a convolutional network unit, a residual network unit, and a channel attention module.
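The exact design of the channel attention module is not given in this excerpt; the sketch below shows a common squeeze-and-excitation style realization (global average pooling followed by two 1×1 convolutions and a sigmoid gate) as an assumed example, with the reduction ratio chosen arbitrarily.

```python
# Hedged sketch of a channel attention module of the kind named in [0074];
# the squeeze-and-excitation layout here is an assumption, not the patent's design.
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Reweight feature channels using globally pooled statistics."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                                    # (B, C, 1, 1)
            nn.Conv2d(channels, channels // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.gate(x)                                         # channel-wise scaling
```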

[0075] The first branch: the input is a cropped original image with a size of 256×256. In the convolution stage, a first convolution unit and a second convolution unit perform downsampling and are used to learn the low-frequency information of the image; the first convolution unit and the second convolution unit are followed by several residual gro...
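As an illustration of paragraph [0075], the sketch below implements the described first branch: two strided convolution units that downsample the 256×256 input and capture low-frequency content, followed by residual groups. The number of groups, blocks per group, and channel width are assumptions, since the paragraph is truncated here.

```python
# Hedged sketch of the first branch in [0075]; group/block counts and widths
# are assumed, as they are not given in the excerpt.
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1))

    def forward(self, x):
        return x + self.body(x)              # local residual connection

class FirstBranch(nn.Module):
    def __init__(self, channels: int = 64, num_groups: int = 3, blocks_per_group: int = 4):
        super().__init__()
        # Two downsampling conv units (256x256 -> 64x64) for low-frequency content.
        self.down1 = nn.Sequential(nn.Conv2d(3, channels, 3, stride=2, padding=1),
                                   nn.ReLU(inplace=True))
        self.down2 = nn.Sequential(nn.Conv2d(channels, channels, 3, stride=2, padding=1),
                                   nn.ReLU(inplace=True))
        # Residual groups: stacks of residual blocks with a group-level skip connection.
        self.groups = nn.ModuleList(
            nn.Sequential(*[ResidualBlock(channels) for _ in range(blocks_per_group)])
            for _ in range(num_groups))

    def forward(self, x):                    # x: (B, 3, 256, 256)
        feat = self.down2(self.down1(x))
        for group in self.groups:
            feat = feat + group(feat)        # group-level residual connection
        return feat
```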

Abstract

The invention relates to an underwater image enhancement method based on multi-residual joint learning, and belongs to the technical field of deep learning. The method comprises the following steps: 1) randomly cropping pictures with different resolutions in an underwater image data set containing degraded images and corresponding reference images into images with the same resolution, and establishing a training set for an underwater image enhancement model; 2) processing the cropped degraded images in the training set with a plurality of preprocessing methods, each preprocessing method correspondingly producing a preprocessed image; 3) taking the reference image as the label of the degraded image, and inputting the original degraded image and the preprocessed degraded images into a multi-branch convolutional neural network with multi-residual joint learning for training, to obtain an image enhancement model; and 4) inputting the image to be enhanced into the image enhancement model to obtain a processed enhanced image.

Description

Technical field

[0001] The invention relates to the technical field of deep learning, and in particular to an underwater image enhancement method based on multi-residual joint learning.

Background technique

[0002] Underwater image enhancement technology has attracted much attention because of its significance in ocean engineering and aquatic robotics. Owing to the complex underwater environment, images captured by a camera underwater suffer from problems such as low contrast, color deviation, and blurred details.

[0003] Because of the many factors involved in the imaging process, such as the poor collimation of the light emitted by the auxiliary lighting source and the uneven distribution of light intensity in the shooting scene, the bright and dark areas of captured underwater images differ greatly in distribution. In addition, owing to the absorption and scattering of light by water, light is strongly attenuated when it is transmitte...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/00, G06T7/10, G06N3/04
CPC: G06T5/008, G06T7/10, G06T2207/20081, G06T2207/20021, G06N3/045
Inventor: 丁丹丹, 陈龙
Owner: HANGZHOU NORMAL UNIVERSITY