Neural network acceleration method based on cross-resolution knowledge distillation

A neural network acceleration technology in the field of deep learning. It addresses problems such as knowledge distillation methods that do not consider the impact of input-image resolution on computational complexity, and the reduced robustness and generalization ability of deep features when resolution is lowered, which is unfavorable to neural network applications. The method achieves low computational complexity, increased operation speed, and reduced input resolution.

Active Publication Date: 2020-05-15
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

Current knowledge distillation methods only consider the structure and parameter count of the compressed student network; they do not consider the impact of input-image resolution on computational complexity.
Reducing the resolution of the input image can significantly reduce the computational cost of feature extraction in a neural network, but it also reduces the robustness and generalization ability of the deep features, which hinders practical application.
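To make the resolution/cost relationship concrete: the multiply-accumulate count of a convolutional layer scales with the spatial size of its output, so halving the input side length roughly quarters the cost. A back-of-the-envelope sketch (the layer shapes below are illustrative, not from the patent):

```python
def conv_flops(h_out, w_out, c_in, c_out, k=3):
    """Approximate multiply-accumulate count of one conv layer:
    each output position computes a (k*k*c_in)-sized dot product
    for each of the c_out output channels."""
    return h_out * w_out * c_out * (k * k * c_in)

# The same 3x3 conv (64 -> 64 channels) at two input resolutions:
print(conv_flops(224, 224, 64, 64))  # ~1.85e9 MACs at 224x224
print(conv_flops(112, 112, 64, 64))  # ~4.62e8 MACs at 112x112 (4x cheaper)
```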

Examples

Embodiment 1

[0042] As shown in Figure 1, this embodiment provides a neural network acceleration method based on cross-resolution knowledge distillation, comprising a high-resolution teacher network and a low-resolution student network. The high-resolution teacher network learns robust feature representations from high-resolution training samples; the low-resolution student network extracts deep features quickly from low-resolution inputs, and absorbs the prior knowledge of the high-resolution teacher network through a cross-resolution knowledge distillation loss to improve the discriminative ability of its features.
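This excerpt does not give the exact form of the cross-resolution distillation loss; a minimal PyTorch sketch, assuming the feature-consistency constraint is realized as a mean-squared error between teacher and student embeddings (the class name and MSE choice are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossResolutionKDLoss(nn.Module):
    """Constrains student features (from the low-res input) to match
    teacher features (from the high-res input). MSE is one plausible
    instantiation of the consistency constraint described in the text."""

    def forward(self, student_feat: torch.Tensor,
                teacher_feat: torch.Tensor) -> torch.Tensor:
        # The teacher is pre-trained and frozen, so block gradients through it.
        return F.mse_loss(student_feat, teacher_feat.detach())
```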

[0043] In this embodiment, high-resolution and low-resolution training samples are first obtained by resampling according to the requirements of the application environment, and the high-resolution teacher network and the low-resolution student network are constructed (a sketch of the resampling step follows). This mainly includes obtaining the sample data, high-...
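A minimal sketch of the resampling step, assuming low-resolution samples are produced from the high-resolution ones by bilinear downsampling (the scale factor and interpolation mode are illustrative; the text only specifies "resampling"):

```python
import torch
import torch.nn.functional as F

def make_training_pair(hr_image: torch.Tensor,
                       scale: int = 4) -> tuple[torch.Tensor, torch.Tensor]:
    """Build a (high-res, low-res) training pair from one batch.

    hr_image: (N, C, H, W) batch of high-resolution samples.
    Returns the original batch plus a bilinearly downsampled copy.
    """
    lr_image = F.interpolate(hr_image, scale_factor=1.0 / scale,
                             mode="bilinear", align_corners=False)
    return hr_image, lr_image
```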

Abstract

The invention discloses a neural network acceleration method based on cross-resolution knowledge distillation. The method comprises the steps of: acquiring high-resolution and low-resolution training samples; constructing a high-resolution teacher network and a low-resolution student network; pre-training the teacher network on the high-resolution sample data; fixing the teacher network parameters and extracting the teacher network output from the high-resolution images; extracting low-resolution image features with the student network, and constraining the output features of the high-resolution teacher network and the low-resolution student network to be consistent through a cross-resolution distillation loss; and, in the test stage, extracting robust features from a low-resolution input image using only the student network. The invention realizes knowledge propagation between the high-resolution and low-resolution domains through the cross-resolution distillation loss. By extracting features from low-resolution images, the network is accelerated and the computational complexity is reduced, while the prior knowledge of the high-resolution images improves the discrimination and generalization ability of the deep features, so that excellent recognition performance is maintained while the computational complexity of the deep network is greatly reduced.
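Read as a training procedure, the abstract amounts to a frozen-teacher distillation loop. A minimal sketch, assuming the networks are ordinary feature extractors, MSE as the consistency loss, and cross-entropy as the recognition task loss (the function names and the loss weight `alpha` are illustrative, not from the patent):

```python
import torch
import torch.nn.functional as F

def train_step(teacher, student, classifier, optimizer,
               hr_batch, lr_batch, labels, alpha=1.0):
    """One distillation step: the teacher is pre-trained and frozen;
    only the student (and its classifier head) are updated."""
    teacher.eval()
    with torch.no_grad():                      # fixed teacher parameters
        t_feat = teacher(hr_batch)             # features from high-res input
    s_feat = student(lr_batch)                 # features from low-res input

    distill_loss = F.mse_loss(s_feat, t_feat)  # cross-resolution consistency
    task_loss = F.cross_entropy(classifier(s_feat), labels)
    loss = task_loss + alpha * distill_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At test time only `student` runs, on the low-resolution input alone, which is where the acceleration comes from.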

Description

Technical field

[0001] The invention relates to the field of deep learning, and in particular to a neural network acceleration method based on cross-resolution knowledge distillation.

Background technique

[0002] With the popularity of big data and advances in deep learning technology, deep networks have made great progress and achieved major breakthroughs in many research tasks such as face recognition, pedestrian re-identification, and object classification. However, current deep networks face high computational complexity and slow operation speed in application scenarios, which prevents many of them from meeting the requirements of real-time and resource-constrained settings.

[0003] To address the high computational complexity of neural networks, Hinton et al. proposed the knowledge distillation framework: a deep network is trained to learn robust, discriminative features and serves as the teacher network. For the same i...
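For context, the classic distillation loss of Hinton et al. softens teacher and student logits with a temperature and matches the resulting distributions with KL divergence; a minimal sketch (the temperature value is illustrative):

```python
import torch.nn.functional as F

def hinton_kd_loss(student_logits, teacher_logits, T: float = 4.0):
    """Soft-label distillation loss (Hinton et al., 2015): KL divergence
    between temperature-softened distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```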

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045, Y02D10/00
Inventors: 冯展祥, 赖剑煌, 谢晓华
Owner: SUN YAT SEN UNIV