Natural image classification method combining self-knowledge distillation and unsupervised method

A natural image classification technology, applied in the fields of character and pattern recognition, instruments, and biological neural network models. It addresses problems such as the large parameter count of deep learning models and their inflexible deployment and use.

Pending Publication Date: 2021-12-21
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] Existing deep learning models have a large number of parameters and are inflexible to deploy and use. To solve this problem, the present invention uses self-knowledge distillation.


Examples


Embodiment Construction

[0028] To make the objectives, technical solutions, and advantages of the present invention more apparent, the invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0029] S1: Data part

[0030] This embodiment uses the Cifar100 image classification dataset. The Cifar100 dataset contains 60,000 images in total: 50,000 for the training set and 10,000 for the test set, divided into 100 categories.

[0031] S1.1 Augment the data with simple random crops and horizontal flips.

[0032] S1.2 Normalize the data, then shuffle it randomly into batches.
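The S1 preprocessing steps can be sketched as below. This is a minimal pure-Python illustration, not the patent's exact implementation: the 4-pixel crop padding, flip probability, and normalization constants are assumed defaults, and images are modeled as nested lists for brevity.

```python
import random

def random_crop(img, pad=4):
    """Zero-pad the image by `pad` pixels on each side, then crop back
    to the original size at a random offset (standard Cifar augmentation)."""
    h, w = len(img), len(img[0])
    padded = [[0.0] * (w + 2 * pad) for _ in range(h + 2 * pad)]
    for i in range(h):
        for j in range(w):
            padded[i + pad][j + pad] = img[i][j]
    top = random.randint(0, 2 * pad)
    left = random.randint(0, 2 * pad)
    return [row[left:left + w] for row in padded[top:top + h]]

def horizontal_flip(img, p=0.5):
    """Flip the image left-to-right with probability p."""
    return [row[::-1] for row in img] if random.random() < p else img

def normalize(img, mean, std):
    """Shift and scale pixel values toward zero mean / unit variance (S1.2)."""
    return [[(x - mean) / std for x in row] for row in img]

def make_batches(samples, batch_size):
    """Shuffle the dataset, then split it into batches (S1.2)."""
    samples = list(samples)
    random.shuffle(samples)
    return [samples[i:i + batch_size] for i in range(0, len(samples), batch_size)]
```

In practice these operations would be done per channel on tensor data (e.g. with torchvision transforms), but the logic is the same.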

[0033] S2: Training part

[0034] S2.1 The deepest branch of the model is taken as the teacher network, and its output is used to supervise the other, shallower branches by knowledge distillation.
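A minimal sketch of the S2.1 supervision signal, assuming the standard temperature-softened KL-divergence distillation loss; the temperature value and the exact loss weighting are assumptions, not details taken from the patent:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp((z - m) / temperature) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

def self_distillation_loss(branch_logits):
    """Self-distillation (S2.1): the last (deepest) branch is the teacher,
    and every shallower branch is a student it supervises."""
    teacher = branch_logits[-1]
    return sum(kd_loss(teacher, student) for student in branch_logits[:-1])
```

Because teacher and students share one backbone, no separate teacher model needs to be trained, which is the point of the self-distillation setup.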

[0035] S2.2 The designed branches are joined together, and the entire network is const...



Abstract

The invention discloses a natural image classification method combining self-knowledge distillation with an unsupervised method. Unsupervised learning aims at discovering the intrinsic characteristics of the data: once the features of similar samples are extracted, those samples have similar representations. Introducing an unsupervised mode into the existing self-knowledge distillation method improves the feature extraction capability of each branch, and thereby the classification accuracy of the model. When designing the branch structure, grouped convolution is adopted to further reduce the parameter count and improve model inference speed.
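The parameter saving from grouped convolution mentioned in the abstract can be shown with simple arithmetic. A standard convolution has C_in * C_out * k * k weights; splitting the channels into g groups gives g * (C_in/g) * (C_out/g) * k * k weights, a g-fold reduction. The channel counts below are illustrative, not values from the patent:

```python
def conv_params(c_in, c_out, k, groups=1):
    """Weight count of a k x k convolution layer with `groups` channel
    groups (bias omitted). c_in and c_out must be divisible by `groups`."""
    assert c_in % groups == 0 and c_out % groups == 0
    return groups * (c_in // groups) * (c_out // groups) * k * k

standard = conv_params(256, 256, 3)           # 256*256*9 = 589,824 weights
grouped = conv_params(256, 256, 3, groups=8)  # 8*32*32*9 =  73,728 weights
print(standard // grouped)                    # 8x fewer parameters
```

The same g-fold saving applies to the multiply-accumulate count, which is why grouping also speeds up inference.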

Description

Technical field
[0001] The present invention relates to the fields of neural network model compression, unsupervised learning, and image classification. In particular, it relates to a natural image classification method combining an unsupervised autoencoder with self-knowledge distillation.
Background technique
[0002] Among the huge number of parameters of a deep neural network, not all parameters play a role in the model; some have only a limited effect, express redundancy, and may even reduce model performance. This makes the cost of such a huge parameter count enormous. Model compression techniques aim to obtain small-scale networks with relatively few parameters and low resource consumption, while keeping accuracy comparable to large-scale networks. [0003] The emergence of convolutional neural networks has enabled some tasks in computer vision and natural language processing, such as image classification, object detection, and text classification, to perf...

Claims


Application Information

IPC(8): G06K9/62, G06N3/04
CPC: G06N3/045, G06F18/214, G06F18/241
Inventor: 杨新武, 刘伟
Owner BEIJING UNIV OF TECH