Image super-resolution method based on recursive attention mechanism
A super-resolution and attention technology, applied in the field of image processing. It addresses problems such as memory consumption that grows in proportion to the number of network layers and channels, time-consuming super-resolution computation, and demanding application conditions, thereby improving reconstruction quality and reducing the number of network parameters.
Examples
Embodiment 1
[0048] Taking 900 images from the DIV2K image library and 100 images from the Urban100 image library as an example, the image super-resolution method based on the recursive attention mechanism of the present embodiment consists of the following steps (see Figure 1):
[0049] (1) Image preprocessing
[0050] Select 800 images from the DIV2K image library as the training set, 100 images as the validation set, and 100 images from the Urban100 image library as the test set. Each label image is down-sampled by bicubic interpolation at a scale factor of 2 to produce the corresponding low-resolution image.
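As a minimal sketch of this preprocessing step (the directory layout and file names below are illustrative assumptions; the source text only specifies bicubic down-sampling at a scale of 2), the label/low-resolution pairs could be generated as follows:

```python
from pathlib import Path

from PIL import Image


def make_lr_image(hr_path: Path, lr_path: Path, scale: int = 2) -> None:
    """Down-sample a high-resolution label image by bicubic interpolation
    to produce the corresponding low-resolution input image."""
    hr = Image.open(hr_path)
    lr = hr.resize((hr.width // scale, hr.height // scale), Image.BICUBIC)
    lr.save(lr_path)


# Hypothetical directory layout (not specified in the source text).
out_dir = Path("DIV2K/train_LR_x2")
out_dir.mkdir(parents=True, exist_ok=True)
for hr_file in Path("DIV2K/train_HR").glob("*.png"):
    make_lr_image(hr_file, out_dir / hr_file.name)
```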
[0051] (2) Build a super-resolution network model
[0052] As shown in Figures 2, 3 and 4, the super-resolution network model of this embodiment is composed of a recursive attention network module and a reconstruction network module connected in series, and the recursive attention network module is composed of a preprocessing convolution layer 4 and the first attention sub-mod...
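The excerpt truncates the internal structure of the attention sub-modules, so the following PyTorch sketch is a hedged reading of the architecture described so far: a preprocessing convolution layer followed by three attention sub-modules in series (the third is referenced later in the text), with "recursive" interpreted here as weight sharing across the repeated sub-module applications. The squeeze-and-excitation style channel attention inside `AttentionSubModule` is a common stand-in, not necessarily the patent's design.

```python
import torch
import torch.nn as nn


class AttentionSubModule(nn.Module):
    """Stand-in attention sub-module (squeeze-and-excitation style channel
    attention); the patent's exact sub-module structure is truncated here."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),              # global average pooling
            nn.Conv2d(channels, channels // 4, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, 1),
            nn.Sigmoid(),                          # per-channel weights
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.body(x)
        return x + y * self.attn(y)  # residual, channel-reweighted


class RecursiveAttentionModule(nn.Module):
    """Preprocessing convolution followed by three attention sub-modules in
    series; weight sharing across the three applications is an assumption
    drawn from the word 'recursive'."""

    def __init__(self, in_channels: int = 3, channels: int = 64):
        super().__init__()
        self.pre_conv = nn.Conv2d(in_channels, channels, 3, padding=1)
        self.attn = AttentionSubModule(channels)  # shared weights

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.pre_conv(x)
        for _ in range(3):  # first, second, third attention sub-modules
            y = self.attn(y)
        return y
```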
Embodiment 2
[0076] Taking 900 images from the DIV2K image library and 100 images from the Urban100 image library as an example, the image super-resolution method based on the recursive attention mechanism of the present embodiment consists of the following steps:
[0077] (1) Image preprocessing
[0078] This step is the same as in Embodiment 1.
[0079] (2) Build a super-resolution network model
[0080] The super-resolution network model of this embodiment is composed of a recursive attention network module and a reconstruction network module connected in series, and the structure and construction method of the recursive attention network module are the same as those in Embodiment 1.
[0081] The reconstruction network module of this embodiment is composed of a backbone network unit 5, an upsampler 6, and a post-processing convolutional layer in series. The input of the backbone network unit 5 is connected to the output of the third attention sub-module 3, and the output is connected to t...
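A corresponding sketch of the reconstruction network module described here: backbone network unit 5, upsampler 6, and a post-processing convolutional layer in series, taking the output of the third attention sub-module 3 as input. The residual backbone block and the sub-pixel (PixelShuffle) upsampler are assumptions, since the excerpt truncates their internal structure.

```python
import torch
import torch.nn as nn


class ReconstructionModule(nn.Module):
    """Backbone unit -> upsampler -> post-processing convolution, in series.
    The backbone is sketched as one residual convolution block and the
    upsampler as sub-pixel convolution (PixelShuffle); both are assumptions."""

    def __init__(self, channels: int = 64, out_channels: int = 3, scale: int = 2):
        super().__init__()
        self.backbone = nn.Sequential(  # backbone network unit 5
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.upsampler = nn.Sequential(  # upsampler 6
            nn.Conv2d(channels, channels * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),      # rearranges channels into a x`scale` grid
        )
        self.post_conv = nn.Conv2d(channels, out_channels, 3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.backbone(x) + x  # residual connection: an assumption
        y = self.upsampler(y)
        return self.post_conv(y)


# Full network: recursive attention module in series with reconstruction,
# mirroring the two-module structure described in the text.
# model = nn.Sequential(RecursiveAttentionModule(), ReconstructionModule())
```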
Embodiment 3
[0085] Taking 900 images from the DIV2K image library and 100 images from the Urban100 image library as an example, the image super-resolution method based on the recursive attention mechanism of the present embodiment consists of the following steps:
[0086] (1) Image preprocessing
[0087] This step is the same as in Embodiment 1.
[0088] (2) Build a super-resolution network model
[0089] The super-resolution network model of this embodiment is composed of a recursive attention network module and a reconstruction network module connected in series, and the structure and construction method of the recursive attention network module are the same as those in Embodiment 1.
[0090] The reconstruction network module of this embodiment is composed of a backbone network unit 5, an upsampler 6, and a post-processing convolutional layer in series. The input of the backbone network unit 5 is connected to the output of the third attention sub-module 3, and the output is connected to t...