Self-adaptive normalization-based unsupervised attention generation network structure and method
A technology combining attention and normalization, applied to biological neural network models, neural learning methods, and neural architectures. It addresses problems such as limitations in the use of radiotherapy, and its effects include reducing parameters and computations, highlighting important features, and lowering the computational burden.
Embodiment Construction
[0028] The present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.
[0029] An unsupervised attention generation network structure based on adaptive normalization uses Cycle-GAN as its basic architecture and, on this basis, adds high- and low-frequency convolution layers, self-attention layers, and adaptive normalization layers in order to extract and restore image features and to convert the extracted features into the corresponding images.
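The following is a minimal PyTorch sketch of the three building blocks named in this paragraph. The class names, the channel split ratio of the high/low-frequency convolution, and the AdaIN-style formulation of the adaptive normalization are illustrative assumptions, not the exact design disclosed by the patent.

```python
# Hedged sketch of the building blocks: high/low-frequency convolution,
# self-attention, and adaptive normalization. All hyperparameters are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HighLowFreqConv(nn.Module):
    """Octave-style convolution: a full-resolution high-frequency branch plus
    a half-resolution low-frequency branch, merged back into one feature map."""

    def __init__(self, in_ch, out_ch, alpha=0.5):
        super().__init__()
        low = int(out_ch * alpha)                         # channels for low frequency
        self.high_conv = nn.Conv2d(in_ch, out_ch - low, 3, padding=1)
        self.low_conv = nn.Conv2d(in_ch, low, 3, padding=1)

    def forward(self, x):
        x_high = self.high_conv(x)                        # fine-detail path
        x_low = self.low_conv(F.avg_pool2d(x, 2))         # smooth-structure path
        x_low = F.interpolate(x_low, size=x_high.shape[2:], mode="nearest")
        return torch.cat([x_high, x_low], dim=1)


class SelfAttention(nn.Module):
    """SAGAN-style self-attention over spatial positions, highlighting the
    regions that matter most for the translation."""

    def __init__(self, ch):
        super().__init__()
        self.query = nn.Conv2d(ch, ch // 8, 1)
        self.key = nn.Conv2d(ch, ch // 8, 1)
        self.value = nn.Conv2d(ch, ch, 1)
        self.gamma = nn.Parameter(torch.zeros(1))         # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)      # (b, hw, c//8)
        k = self.key(x).flatten(2)                        # (b, c//8, hw)
        attn = torch.softmax(q @ k, dim=-1)               # (b, hw, hw)
        v = self.value(x).flatten(2)                      # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x


class AdaptiveNorm(nn.Module):
    """Adaptive normalization: instance-normalizes the features, then rescales
    them with per-channel affine parameters predicted from the features."""

    def __init__(self, ch):
        super().__init__()
        self.norm = nn.InstanceNorm2d(ch, affine=False)
        self.to_gamma_beta = nn.Linear(ch, 2 * ch)        # per-channel scale/shift

    def forward(self, x):
        stats = x.mean(dim=(2, 3))                        # global channel statistics
        gamma, beta = self.to_gamma_beta(stats).chunk(2, dim=1)
        gamma = gamma[:, :, None, None]
        beta = beta[:, :, None, None]
        return self.norm(x) * (1 + gamma) + beta
```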
[0030] The basic model of the unsupervised attention generation network structure proposed by the present invention is Cycle-GAN, which contains a generator and a discriminator; the generator consists of an encoder and a decoder. The structure of the present invention mainly improves the ...
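As a hedged illustration of this paragraph, the sketch below assembles the blocks defined in the previous sketch into an encoder/decoder generator and pairs it with a PatchGAN-style discriminator, as is common for Cycle-GAN. The depths, channel counts, and layer ordering are assumptions for illustration only; it reuses HighLowFreqConv, SelfAttention, and AdaptiveNorm from above.

```python
# Hedged sketch: encoder -> attention/normalization bottleneck -> decoder
# generator, plus a PatchGAN-style discriminator. All sizes are assumed.
import torch
import torch.nn as nn


class Generator(nn.Module):
    def __init__(self, in_ch=3, base=64):
        super().__init__()
        # Encoder: feature extraction with high/low-frequency convolutions.
        self.encoder = nn.Sequential(
            HighLowFreqConv(in_ch, base), nn.ReLU(inplace=True),
            HighLowFreqConv(base, base * 2), nn.ReLU(inplace=True),
        )
        # Bottleneck: self-attention highlights important regions and
        # adaptive normalization re-weights the feature statistics.
        self.bottleneck = nn.Sequential(
            SelfAttention(base * 2),
            AdaptiveNorm(base * 2),
        )
        # Decoder: restores the features back into an image.
        self.decoder = nn.Sequential(
            nn.Conv2d(base * 2, base, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base, in_ch, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.decoder(self.bottleneck(self.encoder(x)))


class Discriminator(nn.Module):
    """PatchGAN-style discriminator commonly used with Cycle-GAN."""

    def __init__(self, in_ch=3, base=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, base, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(base * 2, 1, 4, padding=1),          # per-patch real/fake scores
        )

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    g, d = Generator(), Discriminator()
    fake = g(torch.randn(1, 3, 128, 128))
    print(fake.shape, d(fake).shape)   # image-sized output and a patch score map
```

A full Cycle-GAN setup would instantiate two such generator/discriminator pairs (one per image domain) and train them with adversarial and cycle-consistency losses; that training loop is omitted here.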