Image fusion method based on fast BEMD and deep learning

An image fusion technology based on deep learning, applied in the field of image processing, which can solve the problems of low decomposition efficiency and a large amount of calculation.

Active Publication Date: 2019-08-20
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

The traditional BEMD algorithm has disadvantages such as a large amount of calculation and low decomposition efficiency. The fast BEMD adopted by the present invention is intended to overcome these drawbacks.




Detailed Description of the Embodiments

[0042] The present invention is described in further detail below in conjunction with the accompanying drawings and a specific embodiment:

[0043] The present invention provides an image fusion method based on fast BEMD and deep learning. The fast BEMD is used to perform multi-scale decomposition of the two images to be fused, yielding multiple two-dimensional empirical mode decomposition components (BEMCs). The components of the same scale from the two images are then fused using an image fusion rule based on deep learning, and the fusion result map is obtained by BEMD reconstruction. With this fusion method, a fusion result map that satisfies the human visual system can be obtained, which is convenient for further image processing.
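The overall flow is a decompose / per-scale fuse / reconstruct pipeline. The sketch below (in Python, rather than the MATLAB environment used in the patent's experiments) only illustrates that structure: decompose is a simple Gaussian-difference stand-in for the fast BEMD, and fuse_components uses a plain local-energy rule in place of the patent's deep-learning fusion rule; all function names are placeholders, not the patent's disclosed implementation.

```python
# Minimal sketch of the decompose -> per-scale fuse -> reconstruct pipeline.
# NOTE: fast BEMD itself is NOT implemented here; `decompose` is a Gaussian-
# difference stand-in that only mimics the "high-to-low frequency components
# plus residue" structure of BEMD output, and `fuse_components` uses a simple
# local-energy rule instead of the patent's deep-learning fusion rule.
import numpy as np
from scipy.ndimage import gaussian_filter

def decompose(img, n_levels=3):
    """Return n_levels detail components (high to low frequency) plus a residue."""
    components, current = [], img.astype(float)
    for level in range(n_levels):
        smooth = gaussian_filter(current, sigma=2 ** level)
        components.append(current - smooth)   # detail at this scale (BEMC-like)
        current = smooth
    components.append(current)                # final low-frequency residue
    return components

def fuse_components(c_a, c_b, sigma=5):
    """Choose, pixel-wise, the component with larger smoothed local energy."""
    e_a = gaussian_filter(c_a ** 2, sigma=sigma)
    e_b = gaussian_filter(c_b ** 2, sigma=sigma)
    return np.where(e_a >= e_b, c_a, c_b)

def fuse_images(img_a, img_b, n_levels=3):
    comps_a = decompose(img_a, n_levels)
    comps_b = decompose(img_b, n_levels)
    fused = [fuse_components(a, b) for a, b in zip(comps_a, comps_b)]
    return sum(fused)                         # reconstruction: sum over all scales

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.random((64, 64))
    b = rng.random((64, 64))
    print(fuse_images(a, b).shape)            # (64, 64)
```

Because reconstruction is simply the sum of the fused components and residue, any per-scale fusion rule (including a learned one) can be dropped into fuse_components without changing the rest of the pipeline.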

[0044] The following example was carried out in MATLAB 2017 on a PC (Intel(R) Core(TM) i5-7200U CPU, 2.70 GHz), taking the OCTEC image as an example. Figure 1 is a block diagram of the image fusion method based on fast BEMD and deep learning...


Abstract

The invention discloses an image fusion method based on fast BEMD and deep learning, and belongs to the technical field of image processing. The method comprises the following steps: carrying out multi-scale decomposition on the images to be fused by using fast BEMD to obtain two-dimensional empirical mode decomposition components (BEMCs) ordered from high to low frequency, fusing the components at each scale separately, and finally obtaining the fusion result map through BEMD reconstruction. An image fusion rule based on deep learning is designed by exploiting the ability of deep learning to extract image features. Experiments show that the fusion result map produced by this method has the best visual effect and accords with human visual perception.
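The deep-learning fusion rule is not spelled out in this excerpt; the abstract only states that deep features drive the fusion. The sketch below shows one common way such a rule can be realized: an (untrained, purely illustrative) CNN produces per-pixel activity maps that weight two same-scale BEMCs. The network architecture, the softmax weighting, and the names ActivityNet and fuse_pair are assumptions for illustration, not the patent's disclosed design.

```python
# Sketch of a possible deep-learning fusion rule for a pair of same-scale BEMCs.
# ASSUMPTION: the tiny CNN below is untrained and purely illustrative; the
# patent's actual network and fusion rule are not disclosed in this excerpt.
import torch
import torch.nn as nn

class ActivityNet(nn.Module):
    """Tiny CNN producing a single-channel activity map for a component."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
        )

    def forward(self, x):
        f = self.features(x)
        return f.abs().mean(dim=1, keepdim=True)   # activity = mean |feature|

def fuse_pair(comp_a, comp_b, net):
    """Weighted fusion of two same-scale components via CNN activity maps."""
    with torch.no_grad():
        act_a = net(comp_a)
        act_b = net(comp_b)
    weights = torch.softmax(torch.cat([act_a, act_b], dim=1), dim=1)
    return weights[:, :1] * comp_a + weights[:, 1:] * comp_b

if __name__ == "__main__":
    net = ActivityNet()
    a = torch.randn(1, 1, 64, 64)   # BEMC of image A at one scale
    b = torch.randn(1, 1, 64, 64)   # BEMC of image B at the same scale
    print(fuse_pair(a, b, net).shape)   # torch.Size([1, 1, 64, 64])
```

In practice a pretrained feature extractor would replace the random network, but the per-pixel softmax weighting of the two components would stay the same.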

Description

Technical field

[0001] The invention relates to the technical field of image processing, in particular to an image fusion method based on fast BEMD and deep learning.

Background technique

[0002] The purpose of image fusion is to use information analysis methods to process images taken at different times or from different spaces under specific conditions, synthesize the complementary information of multi-source images, and eliminate redundant information, so as to obtain images that meet specific application scenarios. In recent years, theoretical research on image fusion has developed rapidly. Image fusion methods are mainly divided into six categories: multi-scale analysis methods, neural network methods, sparse representation methods, subspace methods, image saliency methods, and hybrid methods. Image fusion methods based on multi-scale analysis assume that an image can be decomposed into multiple sub-images at different spatial scales, and that the sub-images can be fused separately. The...


Application Information

IPC(8): G06T3/00; G06T5/50
CPC: G06T3/0068; G06T5/50; G06T2207/20221; G06T2207/10048
Inventors: 夏亦犁, 朱莹, 裴文江
Owner: SOUTHEAST UNIV