
Multi-exposure image fusion method

A multi-exposure image fusion technology, applied in the field of computer vision, which can solve the problem of the high time complexity caused by large image sizes.

Inactive Publication Date: 2015-08-12
BEIJING UNION UNIVERSITY

AI Technical Summary

Problems solved by technology

In this framework, the "sliding window technique" depends on the size of the image. Images acquired by general-purpose cameras are usually high-resolution and therefore large, so multi-exposure fusion algorithms based on this framework have a high time complexity, which limits their application.

Method used


Embodiment Construction

[0014] The invention uses multi-exposure image sequences to construct learning samples and applies the dictionary learning algorithm K-SVD to generate a dictionary matrix D. Sparsity enables a sparse representation to select the atoms most relevant to the signal being processed more accurately and adaptively, enhancing the adaptability of the signal processing method; this is why the present invention uses a sparse matrix to represent image features.
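
The following Python sketch illustrates the dictionary-learning step described above. The patent names K-SVD; since a standard K-SVD implementation is not part of scikit-learn, MiniBatchDictionaryLearning with OMP sparse coding is used here as an illustrative stand-in, and the patch size, dictionary size, and sparsity level are assumed values rather than parameters taken from the patent.

# Illustrative sketch only: MiniBatchDictionaryLearning stands in for the K-SVD
# algorithm named in the patent; patch/dictionary/sparsity settings are assumptions.
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import MiniBatchDictionaryLearning

def learn_dictionary(exposure_images, patch_size=8, n_atoms=256, sparsity=5):
    """Build training samples from a multi-exposure sequence and learn a dictionary D."""
    patches = []
    for img in exposure_images:                      # each img: 2-D float array in [0, 1]
        p = extract_patches_2d(img, (patch_size, patch_size), max_patches=2000)
        patches.append(p.reshape(p.shape[0], -1))
    X = np.vstack(patches)
    X -= X.mean(axis=1, keepdims=True)               # remove the DC component of each patch
    learner = MiniBatchDictionaryLearning(
        n_components=n_atoms,
        transform_algorithm="omp",                   # sparse coding via orthogonal matching pursuit
        transform_n_nonzero_coefs=sparsity,
    )
    learner.fit(X)
    return learner.components_                       # D: (n_atoms, patch_size * patch_size)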

[0015] In addition, the human visual system responds to image features at different scales. Based on this idea, image fusion algorithms in the frequency domain have been developed. During fusion, the image is decomposed into different frequency layers using a multi-scale decomposition method, and fusion is carried out on each frequency layer separately. In this way, different fusion rules can be adopted for the characteristics and details of each frequency layer, thereby achieving ...
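
A minimal Python sketch of this multi-scale decomposition step is given below, using OpenCV's pyrDown/pyrUp to build a Laplacian pyramid. The number of levels and the float32 input are assumptions for illustration, not values fixed by the patent.

import cv2
import numpy as np

def laplacian_pyramid(img, levels=4):
    """Decompose an image into high-frequency layers plus a low-frequency residual."""
    # img: single-channel image scaled to [0, 1]; levels is an assumed parameter
    pyramid, current = [], img.astype(np.float32)
    for _ in range(levels):
        down = cv2.pyrDown(current)
        up = cv2.pyrUp(down, dstsize=(current.shape[1], current.shape[0]))
        pyramid.append(current - up)   # band-pass (high-frequency) layer
        current = down
    pyramid.append(current)            # coarsest low-frequency layer, much smaller than the original
    return pyramid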


Abstract

The invention relates to a multi-exposure image fusion method. Laplacian pyramid decomposition is used to perform multi-scale decomposition of an original image, yielding a high-frequency image and a low-frequency image of the original image. Different fusion mechanisms are adopted for the high-frequency image and the low-frequency image, and a reconstructed image is finally obtained. Because each layer is downsampled during Laplacian pyramid decomposition, the low-frequency image is much smaller than the original image, so the time complexity of the sparse-representation-framework fusion method is greatly reduced; information in a specific frequency band can be highlighted, and more of the original image's direction and texture information is retained.
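
As a rough illustration of the pipeline in the abstract, the sketch below fuses the Laplacian pyramids of several exposures and reconstructs the result. The abstract only states that different fusion mechanisms are used for the high- and low-frequency layers; the concrete rules here (per-pixel maximum absolute value for the high-frequency layers, and simple averaging for the low-frequency residual in place of the sparse-representation rule) are assumptions chosen for demonstration.

import cv2
import numpy as np

def fuse_pyramids(pyramids):
    """Fuse per-level Laplacian pyramids of several exposures into one pyramid."""
    # pyramids: one Laplacian pyramid per exposure (lists of float32 arrays),
    # e.g. built with the laplacian_pyramid sketch shown earlier
    fused = []
    for k in range(len(pyramids[0]) - 1):              # high-frequency layers
        stack = np.stack([p[k] for p in pyramids])     # (n_exposures, H, W)
        idx = np.abs(stack).argmax(axis=0)             # keep the strongest detail per pixel
        fused.append(np.take_along_axis(stack, idx[None], axis=0)[0])
    low = np.stack([p[-1] for p in pyramids])          # low-frequency residuals
    fused.append(low.mean(axis=0))                     # simple stand-in for the low-frequency rule
    return fused

def reconstruct(pyramid):
    """Collapse a fused Laplacian pyramid back into a single image."""
    img = pyramid[-1]
    for band in reversed(pyramid[:-1]):
        img = cv2.pyrUp(img, dstsize=(band.shape[1], band.shape[0])) + band
    return np.clip(img, 0.0, 1.0)                      # assumes images normalized to [0, 1]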

Description

technical field

[0001] The invention relates to the field of computer vision, and in particular to a multi-exposure image fusion method.

Background technique

[0002] In recent years, the research direction of computational photography (Computational Photography) has emerged in the field of computer vision. ... the system's perception of the objective world.

[0003] Computational photography is a highly interdisciplinary research field involving technologies such as computer graphics, computer vision, image processing, visual perception, optics and traditional photography. Among these, research on multi-exposure fusion has become a very important topic. The dynamic range of a digital camera (on the order of 10²) is much lower than the dynamic range of real scenes (on the order of 10¹⁰). In high dynamic range scenes, photos taken by digital cameras are either partially underexposed or partially overexposed, which always causes some local detail information to be lost. A high dynamic range image can be obtained by pe...

Claims

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50
Inventor: 王金华, 何宁
Owner: BEIJING UNION UNIVERSITY