Multi-network joint-assisted generative knowledge distillation method
A knowledge distillation method based on generative techniques. It addresses problems such as overall performance degradation, prediction errors, and the forgetting of easy samples, while remaining simple to apply and achieving good results.
Detailed Description of the Embodiments
[0059] The present invention will be further described below in conjunction with the accompanying drawings.
[0060] A multi-network joint-assisted generative knowledge distillation method; the specific steps are shown in Figure 1, and the overall architecture flow chart is shown in Figure 2:
[0061] Step 1: Image classification dataset preprocessing;
[0062] Step 2: Select the teacher network model and train it;
[0063] Step 3: Select the hard sample generator G1 and the student network to form the overall training framework;
[0064] Step 4: Establish the objective function of generating adversarial knowledge distillation;
[0065] Step 5: Iteratively train the established adversarial knowledge distillation framework;
[0066] Step 6: Introduce the easy sample generator G2, and alternately fine-tune the student network with the hard sample generator G1 and the easy sample generator G2.
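The training loop described in Steps 3–6 can be sketched in miniature. The following NumPy illustration is an assumption-laden toy, not the patent's actual networks: the linear teacher and student, the candidate-scoring stand-ins for the generators G1 and G2, and every hyperparameter (temperature, learning rate, sample counts) are invented here purely to show the shape of the procedure — distill from a frozen teacher via a temperature-scaled KL objective, and alternate between "hard" samples (large teacher–student disagreement) and "easy" samples (small disagreement).

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stabilized).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_kl(teacher_logits, student_logits, T=4.0):
    # Distillation objective: mean KL(p_teacher || p_student) at temperature T.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))) / len(p))

# Toy "networks": fixed linear maps (8 input dims -> 5 classes), for illustration only.
rng = np.random.default_rng(0)
W_t = rng.normal(size=(8, 5))   # frozen teacher weights
W_s = rng.normal(size=(8, 5))   # student weights, updated by distillation

def teacher(x): return x @ W_t
def student(x): return x @ W_s

# Stand-in for the generators: score candidate inputs by teacher-student disagreement.
candidates = rng.normal(size=(32, 8))
losses = np.array([kd_kl(teacher(c[None]), student(c[None])) for c in candidates])
hard = candidates[np.argsort(losses)[-8:]]   # G1 role: samples the student gets most wrong
easy = candidates[np.argsort(losses)[:8]]    # G2 role: samples the student already handles

def student_grad(x, T=4.0):
    # Gradient of the KL objective w.r.t. W_s for a linear student:
    # d/dz KL(p||softmax(z/T)) = (q - p)/T, back-propagated through z = x @ W_s.
    p = softmax(teacher(x), T)
    q = softmax(student(x), T)
    return x.T @ (q - p) / (T * len(x))

loss_before = kd_kl(teacher(candidates), student(candidates))

# Alternating adjustment (Step 6): one gradient step on a hard batch, then an easy batch.
lr = 0.5
for step in range(100):
    for batch in (hard, easy):
        W_s -= lr * student_grad(batch)

loss_after = kd_kl(teacher(candidates), student(candidates))
```

In the patent's framework the hard/easy batches would come from the trained generators G1 and G2 rather than from re-scoring a fixed candidate pool, and the student would be a deep network trained by backpropagation; the alternation itself is the point of the sketch.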
[0067] Step 1, data processing; the specific steps are as follows:
[0068] 1-1....