
Multi-network combined auxiliary generative knowledge distillation method

A generative knowledge distillation method, applied in the field of knowledge distillation, that addresses problems such as overall performance decline, prediction errors, and forgetting of simple samples, achieving good results with simple operation.

Pending Publication Date: 2022-05-27
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0008] The above adversarial knowledge distillation methods have a problem: the generator randomly generates difficult samples and sends them to the student network for training, so the student network may eventually forget the simple samples, resulting in prediction errors and overall performance degradation.
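The failure mode is easiest to see in code. Below is a minimal sketch, not the patent's implementation, of the generator update in a typical data-free adversarial distillation loop, assuming a PyTorch setup; all names (G1, teacher, student, opt_G, z_dim) are illustrative.

import torch
import torch.nn.functional as F

def generator_step(G1, teacher, student, opt_G, z_dim=100, batch_size=64):
    # Draw random noise and synthesize a batch of images.
    z = torch.randn(batch_size, z_dim)
    x = G1(z)
    with torch.no_grad():
        t_prob = F.softmax(teacher(x), dim=1)        # teacher is frozen
    s_log_prob = F.log_softmax(student(x), dim=1)
    # Teacher-student divergence; the generator ASCENDS it, so its output
    # drifts toward ever-harder samples, and easy samples stop appearing
    # in training -- the forgetting risk described above.
    disagreement = F.kl_div(s_log_prob, t_prob, reduction='batchmean')
    opt_G.zero_grad()
    (-disagreement).backward()    # maximize by minimizing the negative
    opt_G.step()
    # (In a full loop the student's stale gradients would be cleared
    # before its own minimization step.)

Because only the generator's optimizer steps here, the student is trained in a separate, opposing update, which is what makes the scheme adversarial.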



Examples


Embodiment Construction

[0059] The present invention will be further described below in conjunction with the accompanying drawings.

[0060] A multi-network jointly assisted generative knowledge distillation method; the specific steps are shown in figure 1, and the overall architecture flow chart is shown in figure 2:

[0061] Step 1: Image classification dataset preprocessing;

[0062] Step 2: Select the teacher network model and train it;

[0063] Step 3: Select the difficult sample generator G1 and the student network to form the entire training framework;

[0064] Step 4: Establish the objective function of generative adversarial knowledge distillation (one common formulation is sketched after this list);

[0065] Step 5: Iteratively train the established adversarial knowledge distillation framework;

[0066] Step 6: Introduce the simple sample generator G2, and use the difficult sample generator G1 and the simple sample generator G2 to alternately adjust the student network.
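This page does not reproduce the objective function established in Step 4. A common formulation for generative adversarial knowledge distillation, stated here only as an assumed sketch consistent with the steps above (T the trained teacher, S the student, G_1 the difficult sample generator, z Gaussian noise), is the min-max game

% hedged formulation, not quoted from the patent text
\min_{S}\; \max_{G_1}\; \mathbb{E}_{z \sim \mathcal{N}(0, I)}
\Big[ \mathrm{KL}\big( T(G_1(z)) \,\|\, S(G_1(z)) \big) \Big]

Under this reading, the student descends the teacher-student divergence (the distillation updates of Step 5) while G1 ascends it, which is what drives G1 toward difficult samples.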

[0067] Step 1. Data processing, the specific steps are as follows:

[0068] 1-1....



Abstract

The invention discloses a multi-network jointly assisted generative knowledge distillation method. The method comprises the following steps: first, preprocess an image classification dataset; select a teacher network model according to the determined dataset and train it; select a difficult sample generator G1 and a student network according to the determined dataset to form an adversarial knowledge distillation framework; establish the objective function of generative adversarial knowledge distillation; iteratively train the built adversarial knowledge distillation framework; and finally, introduce a simple sample generator G2 and alternately adjust the student network with the difficult sample generator G1 and the simple sample generator G2 to obtain the final result. The additionally introduced simple sample generator is a direct copy of the trained difficult sample generator, so the amount of computation is not increased and the operation is simple. With the simple sample generator helping the student network review simple samples, a better final effect is achieved on the target task.
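As a rough illustration of the final step, here is a hedged PyTorch sketch of the alternating adjustment. The page does not specify how G2 is steered toward simple samples after the copy, so the sketch shows only the copy and the alternation; all names (G1, teacher, student, opt_S, rounds) are hypothetical.

import copy
import torch
import torch.nn.functional as F

def alternate_adjust(G1, teacher, student, opt_S, rounds=100,
                     z_dim=100, batch_size=64):
    # G2 is a direct copy of the trained difficult sample generator, so
    # introducing it adds no generator-training cost (matching the
    # abstract's claim that the amount of computation is not increased).
    G2 = copy.deepcopy(G1)
    for r in range(rounds):
        G = G1 if r % 2 == 0 else G2      # alternate between generators
        with torch.no_grad():
            x = G(torch.randn(batch_size, z_dim))
            t_prob = F.softmax(teacher(x), dim=1)
        s_log_prob = F.log_softmax(student(x), dim=1)
        # The student always minimizes its divergence from the teacher.
        loss_S = F.kl_div(s_log_prob, t_prob, reduction='batchmean')
        opt_S.zero_grad()
        loss_S.backward()
        opt_S.step()

Since neither generator is updated in this sketch, the only extra cost of the alternation is the memory for a second generator, which is consistent with the abstract's emphasis on simple operation.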

Description

technical field

[0001] The invention belongs to the field of knowledge distillation within computer vision, and specifically proposes a multi-network jointly assisted generative knowledge distillation method for image classification tasks.

Background technique

[0002] Convolutional neural networks (CNNs) have achieved remarkable results in image classification, segmentation, detection, and other fields thanks to their powerful feature extraction and representation capabilities. However, neural networks with high expressive ability often have complex structures and enormous numbers of parameters. Deploying a complete CNN therefore requires large memory overhead and high-performance computing units, which limits the application of CNNs on embedded devices with constrained computing resources and on mobile terminals with strict real-time requirements. CNNs therefore urgently need to be made lightweight.

[0003] Knowledge distillation is widely used as a model comp...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V10/764; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06N3/045; G06F18/24
Inventors: 匡振中, 王一琳, 丁佳骏, 顾晓玲, 俞俊
Owner: HANGZHOU DIANZI UNIV