
Heterogeneous neural network knowledge recombination method based on common feature learning

A neural network and common-feature technology, applied in the field of heterogeneous neural network knowledge recombination; it addresses the infeasibility of direct feature fusion across heterogeneous architectures, and achieves the effects of robust learning and reduced labor cost.

Inactive Publication Date: 2020-05-15
ZHEJIANG UNIV
Cites: 1 · Cited by: 24
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

The traditional knowledge distillation method targets only a single teacher model, and its goal is model compression: a small network model imitates and learns the prediction results of a well-trained large network model. For details, see "Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531, 2015." Moreover, direct hierarchical feature fusion learning is infeasible, since the teachers' architectures may differ.
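The soft-target distillation of Hinton et al. referenced above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the function names `softmax` and `soft_target_kd_loss` and the temperature value are hypothetical choices for this sketch.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; a higher T softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def soft_target_kd_loss(teacher_logits, student_logits, T=4.0):
    """KL divergence between the softened teacher and student predictions,
    scaled by T^2 as suggested in Hinton et al. (2015)."""
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student's softened predictions
    return float(T * T * np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
```

When the student's logits match the teacher's, the loss is zero; as the predictions diverge, the loss grows, pulling the student toward the teacher's output distribution.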

Method used




Embodiment Construction

[0030] The experimental method of the present invention is described in detail below in conjunction with the accompanying drawings and embodiments, so that the reader can fully understand how technical means are applied to solve technical problems and achieve technical effects. It should be noted that, provided there is no conflict, the embodiments of the present invention and the features within them may be combined with one another, and the resulting technical solutions all fall within the protection scope of the present invention.

[0031] The specific framework of the heterogeneous neural network knowledge recombination method based on common feature learning provided by the present invention is shown in Figure 1. Suppose there are N teacher networks, each denoted T_i. The method includes:

[0032]Step 1, under the same input, align the output features of the teacher model an...
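The feature alignment in Step 1 can be sketched as follows. This is a minimal illustration, not the patented implementation: the dimensions, the projection matrices `P_t1`, `P_t2`, `P_s`, and the function `align_loss` are hypothetical placeholders for the learned projections that map each network's features into the common feature space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions of two heterogeneous teachers, the student,
# and the shared common feature space.
d_t1, d_t2, d_s, d_common = 512, 256, 128, 64

# Linear projections into the common space (random placeholders standing in
# for parameters that would be learned during training).
P_t1 = rng.standard_normal((d_t1, d_common)) * 0.01
P_t2 = rng.standard_normal((d_t2, d_common)) * 0.01
P_s  = rng.standard_normal((d_s,  d_common)) * 0.01

def align_loss(f_s, f_t, P_s, P_t):
    """Mean squared distance between projected student and teacher features."""
    z_s, z_t = f_s @ P_s, f_t @ P_t
    return float(np.mean((z_s - z_t) ** 2))

# One batch of features produced under the same input (shape: batch x dim).
f_t1 = rng.standard_normal((8, d_t1))
f_t2 = rng.standard_normal((8, d_t2))
f_s  = rng.standard_normal((8, d_s))

# The student is pulled toward every teacher inside the common space, even
# though the teachers' native feature dimensions differ.
total = align_loss(f_s, f_t1, P_s, P_t1) + align_loss(f_s, f_t2, P_s, P_t2)
```

The key point is that the projections, not the raw features, are compared: this is what lets teachers with different architectures (and hence incompatible feature shapes) contribute to a single student objective.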



Abstract

The heterogeneous neural network knowledge recombination method based on common feature learning comprises the following steps: obtaining a plurality of pre-trained neural network models, named teacher models; and using the features and prediction results output by the teacher models to guide the training of a student model through common feature learning and soft-target distillation. In the common feature learning process, the features of a plurality of heterogeneous networks are projected into a common feature space, so that the student model integrates the knowledge of multiple teacher models. The soft-target distillation method makes the prediction results of the student model consistent with those of the teacher models, thereby producing a stronger student model with the task-processing capability of all the teachers. Since only the teacher models' prediction results need to be imitated, the student model can be trained without any manual annotation. The method is suitable for knowledge recombination of neural network models, especially heterogeneous image classification task models.
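The abstract describes two training signals, common-feature alignment and soft-target distillation, combined into one student objective. A minimal sketch of such a combination follows; the function name `recombination_loss` and the weight `lam` are hypothetical names for this illustration, not terms from the patent.

```python
def recombination_loss(align_losses, kd_losses, lam=1.0):
    """Total student objective: the common-feature alignment loss averaged
    across the N teachers, plus a lam-weighted average of the per-teacher
    soft-target distillation losses. Both inputs are lists of length N."""
    n = len(align_losses)
    return sum(align_losses) / n + lam * sum(kd_losses) / n
```

Because both terms are computed purely from teacher outputs, minimizing this objective requires no ground-truth labels, which is what allows the student to be trained without manual annotation.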

Description

Technical Field

[0001] The present invention relates to the field of machine learning, and in particular to a knowledge recombination method for heterogeneous neural networks based on common feature learning.

Background Technique

[0002] In recent years, deep neural networks (DNNs) have achieved impressive success in numerous artificial intelligence tasks, such as computer vision and natural language processing. However, despite these remarkable results, training DNN models depends heavily on large-scale human-annotated datasets and takes a long time. To ease reproduction work, more and more researchers publish their trained models on the Internet so that users can download and use them immediately. Reusing these published models to obtain customized models with multi-tasking capabilities, without manually labeling data, would therefore be very meaningful. However, due to the rapid development of deep learning and the consequent emerg...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06N3/04
CPC: G06N3/045, G06F18/253, G06F18/214
Inventors: Song Mingli, Luo Sihui, Fang Gongfan
Owner ZHEJIANG UNIV