
Model acceleration method and apparatus based on knowledge distillation and nonparametric convolution

A model acceleration method and apparatus based on non-parametric convolution, applied in the field of deep learning, which addresses problems such as large-scale parameters and high computational cost.

Pending Publication Date: 2019-03-01
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

Research progress in recent years has shown that the accuracy of convolutional neural networks can be improved by increasing the depth and width of the network. Despite this success, the deployment of such networks in real-life applications, especially on mobile or embedded portable devices, is mainly limited by their large-scale parameters and high computational cost.

Method used




Embodiment Construction

[0042] Embodiments of the present invention are described in detail below, examples of which are shown in the accompanying drawings, wherein the same or similar reference numerals designate the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary; they are intended to explain the present invention and should not be construed as limiting it.

[0043] The following describes the model acceleration method and device based on knowledge distillation and non-parametric convolution according to the embodiments of the present invention with reference to the accompanying drawings. First, the model acceleration method based on knowledge distillation and non-parametric convolution proposed according to the embodiments of the present invention will be described.

[0044] Figure 1 is a flowchart of a model acceleration method based on...



Abstract

The invention discloses a model acceleration method and device based on knowledge distillation and non-parametric convolution. The method comprises the following steps: a pruning step of removing the nonlinear layers of a convolutional neural network and aggregating its redundant convolutional layers; a first distillation step of distilling knowledge from the original model into the pruned network to obtain an initial convolutional neural network; a replacement step of replacing the remaining convolutional layers in this network with non-parametric convolutional layers; and a second distillation step of restoring the accuracy of the replaced model by knowledge distillation to obtain the final convolutional neural network. The method uses knowledge distillation to obtain lightweight non-parametric convolutions, which reduces the model size and improves runtime speed.
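Both distillation steps in the abstract rely on matching the student network's output distribution to the teacher's. The patent text does not give its loss function, so the following is a minimal sketch of the standard temperature-softened distillation loss (Hinton-style KL divergence) in NumPy; the function names and the temperature value are illustrative, not taken from the patent:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Mean KL divergence between the softened teacher distribution p
    and the softened student distribution q, scaled by T^2 as is
    conventional so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()

# A student that matches the teacher incurs zero loss;
# a diverging student incurs a positive loss.
teacher = np.array([[2.0, 1.0, 0.1]])
student = np.array([[0.1, 1.0, 2.0]])
print(distillation_loss(teacher, teacher))      # → 0.0
print(distillation_loss(student, teacher) > 0)  # → True
```

In the patented pipeline this loss would be minimized twice: first to transfer accuracy from the original model to the pruned network, and again after the non-parametric replacement step.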

Description

Technical field

[0001] The invention relates to the field of deep learning technology, and in particular to a model acceleration method and device based on knowledge distillation and non-parametric convolution.

Background technique

[0002] In recent years, convolutional neural networks have achieved breakthrough improvements in a large number of machine learning fields, such as image classification, object detection, semantic segmentation, and speech recognition. Research progress has shown that the accuracy of convolutional neural networks can be improved by increasing the depth and width of the network. Despite this success, the deployment of such networks in real-life applications, especially on mobile or embedded portable devices, is mainly limited by their large-scale parameters and computational cost. In order to solve this problem, some deep neural network compression algorithms have been proposed to learn efficient c...
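The parameter problem described in the background is easy to quantify: a single standard convolution's parameter count grows with both channel counts and the kernel size, whereas a fixed-kernel ("non-parametric") operation has no learnable parameters at all. The sketch below illustrates this contrast with a box (mean) filter as an example of a parameter-free operation; this is only an illustrative stand-in, not the specific non-parametric convolution defined in the patent:

```python
import numpy as np

def conv_params(c_in, c_out, k):
    """Learnable parameters of a standard k x k convolution (no bias)."""
    return c_out * c_in * k * k

# One VGG-style layer, 512 -> 512 channels with 3x3 kernels:
print(conv_params(512, 512, 3))  # → 2359296 learnable parameters

def box_filter_2d(x, k=3):
    """Fixed k x k mean filter applied per channel: zero learnable
    parameters regardless of channel count (edge-replicated padding)."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x, dtype=float)
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = xp[i:i + k, j:j + k].mean()
    return out
```

Replacing learnable convolutions with fixed operations removes those millions of parameters, which is why the second distillation step is needed to recover the accuracy the replacement gives up.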

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04
CPC: G06N3/045
Inventors: 鲁继文, 周杰, 袁鑫, 任亮亮
Owner: TSINGHUA UNIV