
End-to-end image defogging method based on multi-feature fusion

A multi-feature fusion image technology, applied to image enhancement, image data processing, neural learning methods, etc.; it addresses the limited versatility and effectiveness of existing models and improves defogging performance.

Pending Publication Date: 2022-07-12
NORTHWEST UNIV
Cites: 0 | Cited by: 4

AI Technical Summary

Problems solved by technology

[0005] In view of the defects and deficiencies of the above-mentioned prior art, the purpose of the present invention is to provide an image defogging method based on the fusion of prior features and deep features, which addresses the limited versatility and effectiveness of existing models and improves the defogging effect of deep learning models in real scenes; the model is also relatively lightweight, enabling fast defogging.



Examples


Embodiment Construction

[0088] This embodiment provides an end-to-end image defogging method based on multi-feature fusion, comprising the following steps:

[0089] Step 1, get the sample data set:

[0090] (1) Synthetic datasets

[0091] Obtain the dataset used by MSBDN, which is derived from the RESIDE dataset with data augmentation. MSBDN removes redundant images of the same scene from the RESIDE training set and selects 9000 outdoor foggy/clear image pairs and 7000 indoor foggy/clear image pairs as the training set. To further augment the training data, each image pair is resized by three random scales in the range [0.5, 1.0], 256×256 patches are randomly cropped from the foggy images, and the patches are flipped horizontally and vertically before being fed to the model, as sketched below. The OTS sub-dataset of the RESIDE dataset, containing 500 pairs of outdoor synthetic images, is used as the test set.
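The augmentation just described is standard paired-image preprocessing. A minimal sketch is given below, assuming Pillow for image handling; the function name `augment_pair` and the bicubic resampling choice are illustrative assumptions, not details specified by the patent.

```python
# Illustrative sketch of the augmentation described above (not the patent's code):
# resize the hazy/clear pair by a random scale in [0.5, 1.0], crop the same
# 256x256 window from both images, and apply identical horizontal/vertical flips.
import random
from PIL import Image

def augment_pair(hazy: Image.Image, clear: Image.Image, patch: int = 256):
    # Resize both images with the same random scale factor (bicubic is an assumption).
    scale = random.uniform(0.5, 1.0)
    w, h = hazy.size
    w, h = max(patch, int(w * scale)), max(patch, int(h * scale))
    hazy, clear = hazy.resize((w, h), Image.BICUBIC), clear.resize((w, h), Image.BICUBIC)

    # Crop the same 256x256 window from the hazy image and its clear counterpart.
    x, y = random.randint(0, w - patch), random.randint(0, h - patch)
    box = (x, y, x + patch, y + patch)
    hazy, clear = hazy.crop(box), clear.crop(box)

    # Apply identical horizontal / vertical flips to both patches.
    if random.random() < 0.5:
        hazy, clear = hazy.transpose(Image.FLIP_LEFT_RIGHT), clear.transpose(Image.FLIP_LEFT_RIGHT)
    if random.random() < 0.5:
        hazy, clear = hazy.transpose(Image.FLIP_TOP_BOTTOM), clear.transpose(Image.FLIP_TOP_BOTTOM)
    return hazy, clear
```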

[0092] (2) Real-world datasets

[0093] Obtain the O-HAZE dataset from the NTIRE2018 Dehazing Challenge and the NH-HAZE datas...



Abstract

The invention discloses an end-to-end image defogging method based on multi-feature fusion, comprising the following steps: 1, acquiring a sample data set; 2, building an end-to-end image defogging network model based on multi-feature fusion, the model comprising a basic network whose core is a global feature fusion attention module, a prior feature extraction module that supports back propagation, and a prior feature adaptive fusion module, in which the dark channel prior features and the color attenuation prior features are first fused by the prior feature adaptive fusion module and then fused with the deep features produced by the basic network; 3, constructing a loss function; 4, training the end-to-end image defogging network model based on multi-feature fusion; and 5, defogging the image to be processed with the trained model to obtain the defogged image. Experimental results on synthetic and real data sets show that the model's defogging ability and transfer ability in real scenes are improved, its parameter count is small, and fast defogging can be achieved.
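The abstract names two hand-crafted priors, the dark channel prior and the color attenuation prior, as the inputs to the prior feature extraction and adaptive fusion modules. Below is a minimal NumPy sketch of how such prior maps are commonly computed in the dehazing literature; the 15×15 minimum-filter window and the value-minus-saturation formulation are conventional defaults, not values taken from the patent.

```python
# Conventional computations of the two priors named in the abstract (illustrative only;
# the patent's exact formulation and window size are not reproduced here).
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img: np.ndarray, window: int = 15) -> np.ndarray:
    """img: HxWx3 RGB in [0, 1]. Returns the per-pixel dark channel map."""
    per_pixel_min = img.min(axis=2)                     # minimum over the color channels
    return minimum_filter(per_pixel_min, size=window)   # local minimum over a window x window patch

def color_attenuation(img: np.ndarray) -> np.ndarray:
    """Rough haze-density cue: HSV brightness minus saturation, per pixel."""
    value = img.max(axis=2)                              # HSV value = max(R, G, B)
    saturation = np.where(value > 0,
                          (value - img.min(axis=2)) / np.maximum(value, 1e-6),
                          0.0)
    return value - saturation                            # tends to be larger where haze is denser
```

In practice, maps like these could be stacked with the RGB input as additional channels feeding the prior feature extraction module; the patent's actual fusion mechanism is described only at the level of the abstract above.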

Description

Technical field

[0001] The invention belongs to the technical field of foggy image processing and relates to an image defogging method based on the fusion of multiple features.

Background technique

[0002] With the advent of the information age, various intelligent vision systems are widely used in intelligent transportation, intelligent security, and military reconnaissance. They use images as the basic carrier of information and perform intelligent processing and analysis on them, such as target detection, recognition, and tracking, but these high-level vision tasks place certain requirements on image quality. In haze weather, the absorption and scattering of reflected light and atmospheric light by the large number of suspended particles in the air greatly degrade the captured image, causing reduced contrast, color distortion, and loss of clarity. This seriously affects the applicat...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/00; G06N3/04; G06N3/08
CPC: G06T5/003; G06N3/084; G06T2207/20081; G06T2207/20084; G06N3/047; G06N3/048; G06N3/045; Y02A90/10
Inventor: 罗杰, 卜起荣, 张蕾, 冯筠
Owner: NORTHWEST UNIV