
Neural network training method and device

A neural network training technology applied in the field of computer vision. It addresses the problem that a student network cannot achieve both a wide application range and high accuracy, and achieves the effects of a wide application range, improved performance, and improved accuracy.

Active Publication Date: 2017-10-13
BEIJING TUSEN ZHITU TECH CO LTD

AI Technical Summary

Problems solved by technology

[0023] The embodiment of the present invention provides a neural network training method and device to solve the technical problem that a student network trained by existing knowledge transfer methods cannot achieve both a wide application range and high accuracy.



Examples


Embodiment 1

[0038] Referring to Figure 1, which is a flowchart of a neural network training method in an embodiment of the present invention, the method comprises:

[0039] Step 101. Select a teacher network that realizes the same function as the student network.

[0040] The implemented function may be, for example, image classification, target detection, or image segmentation. The teacher network has excellent performance and high accuracy but, compared with the student network, has a more complex structure, more parameter weights, and a slower calculation speed. The student network has a fast calculation speed, average or poor performance, and a simple network structure. A network that implements the same function as the student network and has excellent performance can be selected as the teacher network from a set of preset neural network models.
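As one way to picture Step 101, the sketch below selects the best-performing preset model that implements the student's task. The registry format, the `select_teacher` helper, and the accuracy-based criterion are illustrative assumptions, not anything specified by the patent text.

```python
# Hypothetical sketch of Step 101: pick a teacher for the student's task
# from a preset collection of trained models.
def select_teacher(preset_models, task):
    """Return the best-performing preset model that implements `task`."""
    candidates = [m for m in preset_models if m["task"] == task]
    if not candidates:
        raise ValueError(f"no preset teacher implements task '{task}'")
    return max(candidates, key=lambda m: m["accuracy"])

# Illustrative registry; in practice each entry would hold a trained model.
preset_models = [
    {"name": "large_classifier", "task": "image_classification", "accuracy": 0.78},
    {"name": "small_classifier", "task": "image_classification", "accuracy": 0.72},
    {"name": "segmentation_net", "task": "image_segmentation",   "accuracy": 0.70},
]

teacher_entry = select_teacher(preset_models, "image_classification")
```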

[0041] Step 102: Iteratively train the student network, based on matching the distributions of the first intermediate layer features and the second intermediate layer features that correspond to the same training sample data, so as to obtain a target network.
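A minimal PyTorch-style sketch of this step is given below, assuming the distribution matching is expressed as a kernel-based loss term (for example an MMD-style term built from the kernels in formulas (7)-(9)) added to the usual task loss. The hooks `teacher.intermediate` and `student.intermediate_and_logits`, the weight `beta`, and the optimizer settings are assumptions for illustration, not the patent's exact formulation.

```python
import torch

# Hedged sketch of Step 102: the teacher is frozen, and the student is
# trained with its ordinary task loss plus a term that matches the
# distributions of the two intermediate-layer feature maps.
def train_student(teacher, student, loader, feature_match_loss, task_loss,
                  epochs=1, beta=1.0):
    teacher.eval()                                        # teacher weights stay fixed
    opt = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                f_teacher = teacher.intermediate(x)       # first intermediate-layer features
            f_student, logits = student.intermediate_and_logits(x)  # second intermediate-layer features
            loss = task_loss(logits, y) + beta * feature_match_loss(f_student, f_teacher)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student                                        # the trained target network
```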

Specific example

[0066] A concrete example is as follows: k(·,·) is the linear kernel function shown in formula (7) below; or k(·,·) is the polynomial kernel function shown in formula (8) below; or k(·,·) is the Gaussian kernel function shown in formula (9) below.

[0067] k(x, y) = xᵀy    Formula (7)

[0068] k(x, y) = (xᵀy + c)^d    Formula (8)

[0069] k(x, y) = exp(−‖x − y‖² / (2σ²))    Formula (9)
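To make the kernel choices concrete, here is a small NumPy sketch of formulas (7)-(9), together with one common way such kernels can be used to compare two sets of feature vectors (a squared-MMD estimate). The hyperparameters `c`, `d`, and `sigma`, and the use of MMD as the distribution-matching criterion, are assumptions for illustration.

```python
import numpy as np

def linear_kernel(x, y):                  # Formula (7): k(x, y) = x^T y
    return x @ y

def polynomial_kernel(x, y, c=1.0, d=2):  # Formula (8): k(x, y) = (x^T y + c)^d
    return (x @ y + c) ** d

def gaussian_kernel(x, y, sigma=1.0):     # Formula (9): k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def mmd_squared(feats_s, feats_t, kernel=gaussian_kernel):
    """Squared maximum mean discrepancy between two sets of feature
    vectors (one row per vector): one hedged way to measure how well
    two feature distributions match."""
    k_ss = np.mean([kernel(a, b) for a in feats_s for b in feats_s])
    k_tt = np.mean([kernel(a, b) for a in feats_t for b in feats_t])
    k_st = np.mean([kernel(a, b) for a in feats_s for b in feats_t])
    return k_ss + k_tt - 2.0 * k_st
```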

Embodiment 2

[0071] Based on the same concept as the neural network training method provided in Embodiment 1, the second embodiment of the present invention further provides a neural network training device, whose structure is shown in Figure 3 and which comprises a selection unit 31 and a training unit 32, wherein:

[0072] The selecting unit 31 is configured to select a teacher network that realizes the same function as the student network.

[0073] The implemented function may be, for example, image classification, target detection, or image segmentation. The teacher network has excellent performance and high accuracy but, compared with the student network, has a more complex structure, more parameter weights, and a slower calculation speed. The student network has a fast calculation speed, average or poor performance, and a simple network structure. A network that implements the same function as the student network and has excellent performance can be selected as the teacher network from the set of preset neural network models.
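A hypothetical sketch of how the two units of Embodiment 2 could be composed is shown below; the class and method names are illustrative, since the text only specifies a selection unit and a training unit.

```python
# Illustrative composition of the Embodiment-2 device: a selection unit
# that picks a teacher with the same function, and a training unit that
# iteratively trains the student into the target network.
class NeuralNetworkTrainingDevice:
    def __init__(self, selection_unit, training_unit):
        self.selection_unit = selection_unit   # corresponds to selection unit 31
        self.training_unit = training_unit     # corresponds to training unit 32

    def run(self, student, task, train_data):
        teacher = self.selection_unit.select(task)
        return self.training_unit.train(teacher, student, train_data)
```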



Abstract

The invention discloses a neural network training method and device to solve the technical problem that a student network trained by existing knowledge transfer methods cannot offer both a wide application range and high accuracy. The neural network training method comprises the steps of selecting a teacher network that realizes the same function as the student network, and iteratively training the student network, based on matching the distributions of first intermediate layer features and second intermediate layer features corresponding to the same training sample data, so as to obtain a target network, thereby transferring knowledge of the intermediate layer features of the teacher network to the student network. The first intermediate layer features are the feature map output from a first specific network layer of the teacher network after the training sample data are input into the teacher network, and the second intermediate layer features are the feature map output from a second specific network layer of the student network after the training sample data are input into the student network. The neural network obtained by adopting the disclosed technical scheme has both a wide application range and excellent performance.

Description

Technical field

[0001] The invention relates to the field of computer vision, in particular to a neural network training method and device.

Background technique

[0002] In recent years, deep neural networks have achieved great success in various applications in the field of computer vision, such as image classification, object detection, and image segmentation. However, deep neural network models often contain a large number of model parameters, involve a large amount of computation, and are slow to process, so they cannot run in real time on devices with low power consumption and low computing power (such as embedded devices and integrated devices).

[0003] Based on the aforementioned problems, many deep neural network acceleration algorithms have been proposed recently, such as network pruning, network weight quantization, and network knowledge distillation.

[0004] In addition to neuron-based network pruning, other network pruning methods and netwo...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06N3/096; G06N20/10; G06N3/04
Inventor: 王乃岩, 黄泽昊
Owner: BEIJING TUSEN ZHITU TECH CO LTD