
Image classification method based on integrated knowledge distillation

A classification method using knowledge distillation, applied in the field of image classification based on integrated knowledge distillation; it addresses problems such as the performance gap between student and teacher models, and achieves good classification accuracy, soft outputs, and a simple training process.

Active Publication Date: 2021-01-08
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

As a result, there is still a large performance gap between the student model and the teacher model.



Examples


Embodiment Construction

[0028] The present invention will be further described in detail below with reference to the accompanying drawings and embodiments. It should be noted that the following embodiments are intended to facilitate the understanding of the present invention, but do not limit it in any way.

[0029] As shown in Figure 1, an image classification method based on integrated knowledge distillation includes the following steps:

[0030] S01: pre-training the teacher model

[0031] This embodiment uses the ImageNet-2012 dataset as the image classification training set; it contains 1.28 million training images covering a total of 1000 categories.

[0032] Before training, image transformation processing is performed on each picture; for details, please refer to "Deep residual learning for image recognition", published at the IEEE Conference on Computer Vision and Pattern Recognition, the top conference in computer vision. The stochastic gradient descent algorithm is u...
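For concreteness, below is a minimal PyTorch-style sketch of step S01, assuming a ResNet-style teacher and the standard ImageNet preprocessing from the cited ResNet paper. The choice of ResNet-34, the batch size, learning rate, momentum and weight decay are illustrative assumptions, not values stated in this excerpt.

```python
# Hypothetical sketch of step S01 (teacher pre-training) on ImageNet-2012.
# Architecture and hyperparameters below are assumptions for illustration.
import torch
import torchvision
from torchvision import transforms

train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224),          # random crop, as in the ResNet training recipe
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = torchvision.datasets.ImageNet(root="path/to/imagenet",  # placeholder path
                                          split="train",
                                          transform=train_transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=256,
                                           shuffle=True, num_workers=8)

teacher = torchvision.models.resnet34(num_classes=1000)             # assumed teacher network
optimizer = torch.optim.SGD(teacher.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)
criterion = torch.nn.CrossEntropyLoss()

def train_step(images, labels):
    """One SGD step; in the full schedule the run is split into three stages
    and the best checkpoint of each stage is kept as T1, T2 and T3."""
    optimizer.zero_grad()
    loss = criterion(teacher(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```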


Abstract

The invention discloses an image classification method based on integrated knowledge distillation, comprising the following steps: (1) pre-train a teacher model, dividing its training process into three stages and taking the best teacher model from each stage, obtaining three teacher models T1, T2 and T3; (2) train a student model, dividing its training process into three stages and using the three teacher models to jointly guide the student in each stage, where the weight of T3 is kept unchanged in every stage, T1 has the maximum weight in the first stage, and T2 has the maximum weight in the second stage; (3) perform the image classification task with the trained student model by inputting the pictures to be classified and carrying out classification prediction. With this method, it becomes easier for the student model to learn knowledge from the teacher models, so the performance of the student model is further improved, and image classification accuracy is preserved while the model's response speed is increased.
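A sketch of how the three teachers could jointly guide the student, per the weighting scheme in the abstract (T3's weight fixed across stages, T1 dominant in stage 1, T2 dominant in stage 2), is shown below. The concrete weight values, the temperature and the balance factor alpha are assumptions; the excerpt does not state them.

```python
# Hypothetical multi-teacher distillation loss: weighted KL divergence to each
# teacher's softened output plus cross-entropy on the ground-truth labels.
import torch
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits_list, labels,
                 teacher_weights, temperature=4.0, alpha=0.9):
    kd = 0.0
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    for w, t_logits in zip(teacher_weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits / temperature, dim=1)   # soft output of one teacher
        kd = kd + w * F.kl_div(log_p_student, p_teacher, reduction="batchmean")
    kd = kd * temperature ** 2                                  # standard KD temperature scaling
    ce = F.cross_entropy(student_logits, labels)                # hard-label supervision
    return alpha * kd + (1.0 - alpha) * ce

# Example stage-dependent teacher weights (assumed values): T1 dominates in
# stage 1, T2 in stage 2, while T3's weight stays constant in every stage.
stage_weights = {
    1: (0.5, 0.2, 0.3),   # (w_T1, w_T2, w_T3)
    2: (0.2, 0.5, 0.3),
    3: (0.35, 0.35, 0.3),
}
```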

Description

Technical Field

[0001] The invention belongs to the technical field of image classification, and in particular relates to an image classification method based on integrated knowledge distillation.

Background Technique

[0002] In the field of autonomous driving, the real-time performance of network models is a very important indicator. The model needs to classify and judge the images passed in by the camera and then make driving decisions, which requires the model to respond quickly and obtain classification results in a short time. However, current high-performance models have a large number of parameters and generally cannot respond in real time. This calls for model compression technology to compress large models into smaller ones without too much loss of accuracy.

[0003] Knowledge distillation is an important model compression technique. When training a smaller model, the supervision information of a trained larger...
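To illustrate the compression motivation in [0002], the short sketch below compares parameter count and single-image latency of a large and a small network. The specific architectures (ResNet-34 as the "large" model, ResNet-18 as the "small" one) are assumptions for illustration only, not models named in this excerpt.

```python
# Hypothetical comparison of model size and per-image latency on CPU.
import time
import torch
import torchvision

def profile(model, name):
    model.eval()
    n_params = sum(p.numel() for p in model.parameters())
    x = torch.randn(1, 3, 224, 224)             # one camera-style RGB frame
    with torch.no_grad():
        model(x)                                # warm-up pass
        start = time.perf_counter()
        model(x)
        latency_ms = (time.perf_counter() - start) * 1000
    print(f"{name}: {n_params / 1e6:.1f}M params, {latency_ms:.1f} ms / image (CPU)")

profile(torchvision.models.resnet34(), "large model")
profile(torchvision.models.resnet18(), "small model")
```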


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/55, G06K9/62, G06N3/02, G06N3/08
CPC: G06F16/55, G06N3/02, G06N3/08, G06F18/214, G06F18/24, Y02T10/40
Inventor: 杨柳, 蔡登, 王闻箫, 何晓飞
Owner: ZHEJIANG UNIV