Image out-of-focus deblurring method based on deep perception network

A depth perception and deblurring technology, applied in image enhancement, image analysis, image data processing, etc., to achieve the effect of effective restoration with low time complexity

Active Publication Date: 2021-06-22
SOUTH CHINA UNIV OF TECH
Cites: 3 | Cited by: 1

AI Technical Summary

Problems solved by technology

[0004] Most previous out-of-focus deblurring methods are based on numerical optimization or on image prior constraints. Such methods are only applicable when the out-of-focus blur is relatively simple.




Embodiment Construction

[0069] The present invention will be described in further detail below in conjunction with examples and accompanying drawings, but the embodiments of the present invention are not limited thereto.

[0070] Referring to Figures 1-3, an image out-of-focus deblurring method based on a depth perception network comprises the following steps:

[0071] S1, preprocessing the image to be deblurred; preprocessing includes: size cropping, random flipping and normalization.
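A minimal sketch of such a preprocessing pipeline in PyTorch/torchvision is given below. The crop size, flip probability and normalization statistics are illustrative assumptions, not values fixed by this description.

```python
import torchvision.transforms as T

# Assumed preprocessing for S1: size cropping, random flipping, normalization.
preprocess = T.Compose([
    T.RandomCrop(256),                # size cropping (assumed 256x256 patches)
    T.RandomHorizontalFlip(p=0.5),    # random flipping
    T.ToTensor(),                     # HWC uint8 -> CHW float in [0, 1]
    T.Normalize(mean=[0.5] * 3, std=[0.5] * 3),  # assumed normalization to [-1, 1]
])

# Usage: tensor = preprocess(pil_image), where pil_image is an RGB PIL.Image.
```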

[0072] S2, inputting the image to be deblurred into the trained depth perception network model to obtain the restored image;
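A minimal sketch of this inference step is shown below; DeblurNetStub is a hypothetical placeholder standing in for the trained depth perception network, whose actual architecture and checkpoint path are not reproduced here.

```python
import torch
import torch.nn as nn

class DeblurNetStub(nn.Module):
    """Hypothetical stand-in for the trained depth perception network model."""
    def __init__(self):
        super().__init__()
        self.body = nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x):
        return self.body(x)

model = DeblurNetStub()
model.eval()
# In practice the trained weights would be loaded first, e.g.:
# model.load_state_dict(torch.load("deblur_checkpoint.pth"))  # assumed path

with torch.no_grad():
    blurred = torch.rand(1, 3, 256, 256)  # stand-in for one preprocessed image (N, C, H, W)
    restored = model(blurred)             # restored (deblurred) image tensor
```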

[0073] S3, comparing the obtained restored image with the real clear image, and calculating the measurement index PSNR;
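A short sketch of the PSNR computation in S3, assuming both images are float tensors scaled to [0, 1]:

```python
import torch

def psnr(restored: torch.Tensor, sharp: torch.Tensor, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio between a restored image and the sharp reference."""
    mse = torch.mean((restored - sharp) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return (10.0 * torch.log10(max_val ** 2 / mse)).item()

# Example with random stand-in tensors; real use compares the restored output
# against the ground-truth clear image.
score = psnr(torch.rand(3, 256, 256), torch.rand(3, 256, 256))
```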

[0074] Wherein, the training of the depth perception network model includes the following steps:

[0075] (1) Obtain a database of out-of-focus blurred images; specifically, select and download a high-resolution out-of-focus blurred image dataset collected in a real scene. T...
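One way to expose such a paired database to the training code is sketched below; the directory layout ("blurred"/"sharp" folders with matching file names) and the class name are assumptions for illustration, not details given in this description.

```python
import os
from PIL import Image
from torch.utils.data import Dataset

class DefocusPairDataset(Dataset):
    """Hypothetical loader for paired out-of-focus (blurred) and sharp images."""
    def __init__(self, root, transform=None):
        self.blur_dir = os.path.join(root, "blurred")
        self.sharp_dir = os.path.join(root, "sharp")
        self.names = sorted(os.listdir(self.blur_dir))
        # Note: random transforms (crop/flip) must be applied consistently
        # to both images of a pair in practice.
        self.transform = transform

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        blur = Image.open(os.path.join(self.blur_dir, name)).convert("RGB")
        sharp = Image.open(os.path.join(self.sharp_dir, name)).convert("RGB")
        if self.transform is not None:
            blur, sharp = self.transform(blur), self.transform(sharp)
        return blur, sharp
```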



Abstract

The invention provides an image out-of-focus deblurring method based on a depth perception network. The method comprises the following steps: S1, preprocessing an image to be deblurred; S2, inputting the image to be deblurred into a trained depth perception network model to obtain a restored image; S3, comparing the obtained restored image with a real clear image and calculating the measurement index PSNR. The method effectively uses the depth image to assist defocus blur restoration and trains the network with sufficient data to learn the degradation mapping, thereby obtaining an effective and fast defocus deblurring restoration network that can be applied to image processing in cameras.

Description

Technical field

[0001] The invention belongs to the technical field of digital image processing, and in particular relates to an image out-of-focus deblurring method based on a depth perception network, which can restore a digital image with out-of-focus blur captured by a camera into a clear digital image.

Background technique

[0002] In the information age, with the popularization of the Internet and the rapid development of mobile intelligence, image information appears in all aspects of life. People can easily obtain digital images around them through mobile smart devices, digital cameras and other sensor devices, and disseminate and share them. With the development of 5G, the medium for obtaining and sharing information with others has gradually evolved from text to digital images, and even to multimedia such as video. Multimedia information such as digital images and videos can not only provide richer information and interaction, but also ...

Claims


Application Information

IPC(8): G06T5/00
CPC: G06T5/003; G06T2207/20081; G06T2207/20084
Inventor: 许勇, 祝叶
Owner: SOUTH CHINA UNIV OF TECH