Grayscale image fusion method

A grayscale image fusion method, applied in image enhancement, image analysis, image data processing, etc., addressing the problems that no suitable fusion weight map is constructed, that low-light images have low contrast, and that fused image quality is low

Active Publication Date: 2021-09-07
CHANGCHUN INST OF OPTICS FINE MECHANICS & PHYSICS CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

Images captured in low-light conditions also tend to have poor visibility.
Lost image details are hard to recover due to the limited dynamic range.
Most existing multi-exposure fusion methods are aimed at color images, and their original way of constructing weight maps cannot obtain the detail information of over-bright and over-dark areas, resulting in low quality of the fused image.
At the same time, low-light images have low contrast, and a fusion weight map is not constructed, resulting in uneven edge transitions and blocking artifacts in the synthesized image.




Embodiment Construction

[0076] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, but not to limit the present invention.

[0077] The purpose of the present invention is to provide a grayscale image fusion method. According to the characteristics of the decomposed images, different weight maps are constructed and a Gaussian pyramid multi-scale space is established. Fusion of the global structure map and the local structure map is then completed with a Laplacian pyramid fusion method. Finally, the fused global structure map and local structure map are added to obtain the final fused high-dynamic image. A grayscale image fusion method provided by the present invention will be described in detail below through spec...
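The paragraph above describes the pyramid fusion step only at a high level, so the following is a minimal sketch of one plausible realization in Python with OpenCV. It assumes the decomposed structure maps of each exposure and their weight maps have already been computed; `pyramid_fuse`, `images`, `weights`, and the choice of pyramid depth are illustrative assumptions, not code published with the patent.

```python
# Hedged sketch: weighted Laplacian-pyramid fusion with Gaussian weight pyramids.
import cv2
import numpy as np

def gaussian_pyramid(img, levels):
    """Gaussian pyramid with `levels` downsampling steps (levels + 1 images)."""
    pyr = [img.astype(np.float32)]
    for _ in range(levels):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    """Laplacian pyramid; the last entry is the coarsest Gaussian level."""
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels):
        size = (gp[i].shape[1], gp[i].shape[0])
        lp.append(gp[i] - cv2.pyrUp(gp[i + 1], dstsize=size))
    lp.append(gp[-1])
    return lp

def pyramid_fuse(images, weights, levels=4):
    """Weight each Laplacian band of every exposure by the matching Gaussian
    level of its pixel-wise normalized weight map, then collapse the pyramid."""
    weights = [np.asarray(w, dtype=np.float32) for w in weights]
    w_sum = np.sum(np.stack(weights), axis=0) + 1e-12
    weights = [w / w_sum for w in weights]
    fused = None
    for img, w in zip(images, weights):
        lp = laplacian_pyramid(img, levels)
        gw = gaussian_pyramid(w, levels)
        contrib = [l * g for l, g in zip(lp, gw)]
        fused = contrib if fused is None else [f + c for f, c in zip(fused, contrib)]
    # Collapse the blended pyramid from the coarsest level upward.
    out = fused[-1]
    for lvl in range(levels - 1, -1, -1):
        size = (fused[lvl].shape[1], fused[lvl].shape[0])
        out = cv2.pyrUp(out, dstsize=size) + fused[lvl]
    return out  # float image; clip/convert to the display range as needed

# Per the description, the fused global and local structure maps would then be added:
#   hdr = pyramid_fuse(global_maps, global_weights) + pyramid_fuse(local_maps, local_weights)
```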



Abstract

The invention provides a grayscale image fusion method. A global image and a local image are generated from a low-dynamic image by a low-rank decomposition method, and different weight maps are constructed for the two decomposed images according to low-illumination imaging characteristics, reducing blocking artifacts after fusion so that the fused image better matches human visual perception. The decomposed low-rank image and saliency image are combined in Laplacian space using Gaussian-pyramid weight factors to obtain a fused low-rank image and a fused saliency image; finally, a high-dynamic image is reconstructed by superposing the two. The method preserves detail features in the dark and highlight areas of the image and widens its dynamic range.
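The abstract does not specify which low-rank decomposition is used, so the sketch below is only a minimal stand-in: a truncated SVD keeps a smooth low-rank "global" component of a single low-dynamic frame, and the residual is treated as the local/saliency component. A robust PCA or other decomposition could be substituted; all names are illustrative and none come from the patent.

```python
# Hedged stand-in for the low-rank split mentioned in the abstract.
import numpy as np

def low_rank_split(img, rank=3):
    """Split a 2-D grayscale image into (global_part, local_part)."""
    a = img.astype(np.float32)
    u, s, vt = np.linalg.svd(a, full_matrices=False)
    s[rank:] = 0.0                 # keep only the `rank` largest singular values
    global_part = (u * s) @ vt     # low-rank approximation (global structure)
    local_part = a - global_part   # residual carries local detail / saliency
    return global_part, local_part
```

Each component would then receive its own weight map and be fused across exposures, for example with the pyramid routine sketched earlier, before the two fused results are added back together.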

Description

technical field

[0001] The invention belongs to the technical field of high-dynamic image synthesis, and in particular relates to a grayscale image fusion method based on low-rank decomposition.

Background technique

[0002] Common CCD sensors have a dynamic range of about 10³, much smaller than the dynamic range of real scenes. Additionally, images captured in low-light conditions tend to have poor visibility, and lost image details are difficult to recover due to the limited dynamic range. Most existing multi-exposure fusion methods are aimed at color images, and their original way of constructing weight maps cannot obtain the detail information of over-bright and over-dark areas, resulting in low quality of the fused image. At the same time, low-illumination images have low contrast, and a fusion weight map is not constructed, resulting in uneven edge transitions and blocking artifacts in the synthesized image.

Contents of the invention

[0003] In ...



Application Information

IPC(8): G06T5/50, G06T5/00, G06T5/20
CPC: G06T5/50, G06T5/20, G06T2207/20016, G06T2207/20024, G06T2207/20221, G06T5/90, G06T5/70
Inventor: 聂婷, 黄良, 毕国玲, 李宪圣, 刘洪兴, 袁航飞, 付天骄
Owner CHANGCHUN INST OF OPTICS FINE MECHANICS & PHYSICS CHINESE ACAD OF SCI