Dynamic constraint adaptive method based on aggregation rule
A dynamically constrained, adaptive technique used in machine learning
Embodiment Construction
[0010] Adaptive methods are susceptible to extreme learning rates. To overcome this difficulty, the idea of the ADABOUND algorithm is to apply a threshold interval to the gradient update, with upper and lower thresholds that change over time and finally converge to the SGD learning rate, thereby realizing a smooth transition from the adaptive method to SGD.
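As a rough illustration of this idea, the sketch below applies AdaBound-style bounds to an Adam-like update in Python/NumPy. The bound schedules, hyperparameter names, and the omission of bias correction follow the publicly known ADABOUND formulation and are simplifying assumptions, not details fixed by this paragraph.

```python
import numpy as np

def adabound_step(theta, grad, m, v, t, alpha=1e-3, final_lr=0.1,
                  beta1=0.9, beta2=0.999, gamma=1e-3, eps=1e-8):
    """One AdaBound-style update (sketch): an Adam-like step whose
    per-coordinate learning rate is clipped to bounds that tighten
    toward the SGD rate final_lr as the step count t >= 1 grows."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    # Dynamic bounds: start wide and both converge to final_lr,
    # so the update smoothly transitions from adaptive steps to SGD steps.
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    step = np.clip(alpha / (np.sqrt(v) + eps), lower, upper)
    theta = theta - step * m
    return theta, m, v
```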
[0011] Construct the aggregation rule Saddle(·) as s = Saddle({g_i : i ∈ [m]}).
[0012] For any dimension j, s_j ∈ max_{m−q}[(g_i)_j + p], meaning a noise perturbation p is added to each gradient value and the largest m − q perturbed values are then selected as the non-Byzantine gradient set. Because a saddle point is inherently unstable, a slight perturbation applied to a point at the saddle will, after a certain number of iterations, cause the point to slip away from it, thereby escaping the saddle point of a strict saddle function.
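For concreteness, the following Python/NumPy sketch implements one plausible reading of this rule, assuming Gaussian noise for the perturbation p and a mean over the selected set as the final aggregate; both choices, and the function and parameter names, are illustrative assumptions rather than details specified by the patent.

```python
import numpy as np

def saddle_aggregate(grads, q, noise_scale=1e-4, rng=None):
    """Coordinate-wise sketch of the Saddle(.) aggregation rule:
    perturb each gradient value with small noise p, then for every
    dimension j keep the largest m - q perturbed values as the
    presumed non-Byzantine set.  Reducing that set to a single value
    (here, its mean) is an assumption; the text only requires s_j to
    lie in the selected set."""
    rng = np.random.default_rng() if rng is None else rng
    g = np.stack(grads)                        # shape (m, d): one row per worker
    perturbed = g + noise_scale * rng.standard_normal(g.shape)  # add noise p
    # For each dimension, sort the workers' values and keep the top m - q.
    top = np.sort(perturbed, axis=0)[q:, :]    # shape (m - q, d)
    return top.mean(axis=0)                    # aggregated gradient s
```

In this reading, the same small perturbation that screens out suspect gradients also supplies the instability needed for the iterate to slide off saddle points.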
[0013] The specific steps are:
[0014] Ste...