
Knowledge distillation method and device

A knowledge distillation method and device, applied in the field of speech data processing. The technology addresses the difficulty of robustly modeling complex data, the heavy computational cost of ultra-deep networks, and the resulting deployment difficulties, with the effects of strong generalization, a reduced performance gap between large and small models, and improved performance.

Active Publication Date: 2019-04-16
AISPEECH CO LTD

AI Technical Summary

Problems solved by technology

[0006] The i-vector is essentially a single-layer linear model; it is difficult for it to model complex data robustly, and its performance on short-duration data is poor.
[0007] In practical application scenarios, ultra-deep networks (such as residual networks) are difficult to deploy because of their huge computational cost, while simple shallow models with few parameters often fail to meet performance requirements.




Embodiment Construction

[0020] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0021] Please refer to Figure 1, which shows a flow chart of an embodiment of the knowledge distillation method of the present application. The knowledge distillation method of this embodiment can be applied to scenarios in which a large model is used to train a small model.

[0022] As shown in Figure 1, in step 101, in the speaker embedding l...
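The excerpt cuts off here, but read together with the abstract, step 101 feeds the same speaker audio through both the teacher model and the student model. As an illustration only (not part of the patent text), here is a minimal PyTorch-style sketch of that step; the model objects, and the assumption that each returns both posterior logits and a speaker embedding, are hypothetical:

```python
# Hypothetical sketch of step 101: the same speaker audio passes through
# both the fixed teacher and the trainable student. Model interfaces are
# assumptions (each returns (posterior_logits, speaker_embedding)).
import torch

def step_101(audio_features, teacher, student):
    with torch.no_grad():  # teacher stays fixed during distillation
        teacher_logits, teacher_emb = teacher(audio_features)
    student_logits, student_emb = student(audio_features)  # student keeps gradients
    return (teacher_logits, teacher_emb), (student_logits, student_emb)
```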



Abstract

The invention discloses a knowledge distillation method and device. In a speaker learning task, the speaker's audio data is input into both a teacher model and a student model, where each model performs speaker embedding extraction and speaker posterior probability distribution prediction. Either the teacher model's speaker embedding is taken as the reference and the difference between the student model's embedding and the teacher model's embedding is constrained to a first preset range to optimize the student model, or the teacher model's speaker posterior probability distribution is taken as the reference and the difference between the student model's posterior distribution and the teacher model's posterior distribution is constrained to a second preset range to optimize the student model. The optimized student model is then used for deployment and/or prediction. In this way, a small, deployable model can be trained with the help of a large model that performs well.
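The abstract describes two alternative ways of optimizing the student. One common way to realize "limiting the difference within a preset range" is to minimize a distance between the two models' outputs. The sketch below is a PyTorch-style illustration of one plausible form of each objective; the MSE and KL-divergence choices and the temperature value are assumptions, not claims taken from the patent:

```python
# Illustrative sketches of the two distillation objectives described in the
# abstract; loss forms and the temperature are assumptions, not patent text.
import torch
import torch.nn.functional as F

def embedding_distillation_loss(student_emb, teacher_emb):
    """Option 1: keep the student's speaker embedding close to the teacher's."""
    return F.mse_loss(student_emb, teacher_emb.detach())

def posterior_distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Option 2: keep the student's speaker posterior distribution close to the
    teacher's (soft-label distillation via KL divergence)."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```

The two losses correspond to embedding-level and posterior-level distillation respectively; minimizing either one drives the student's behavior toward the teacher's.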

Description

Technical field

[0001] The invention belongs to the technical field of speech data processing, and in particular relates to a knowledge distillation method and device.

Background technique

[0002] In the related art, the i-vector is a classic speaker embedding learning method. It is based on a traditional factor analysis model and essentially obtains a low-dimensional space representation of a Gaussian supervector.

[0003] Speaker embedding learning based on deep neural networks first trains a network whose goal is to distinguish different speakers, and then extracts speaker embedding representations from a specific layer (the embedding extraction layer), as sketched below. A large body of published work shows that large, deep networks usually achieve better results.

[0004] Deep speaker embedding learning is a very effective method for speaker identity modeling. Ultra-deep models such as residual networks have achieved good performance, but for real application scenarios with limit...
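Paragraph [0003] describes the standard pattern: train a network to classify speakers, then read the representation out of a dedicated embedding extraction layer. A minimal sketch of such an architecture follows; the layer sizes, the mean-pooling choice, and all names are illustrative assumptions rather than the patent's specification:

```python
# Minimal sketch of DNN-based speaker embedding extraction as described in
# [0003]: train a speaker classifier, then extract the representation from a
# designated embedding layer. All dimensions and names are illustrative.
import torch
import torch.nn as nn

class SpeakerNet(nn.Module):
    def __init__(self, feat_dim=40, emb_dim=256, num_speakers=1000):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(feat_dim, 512), nn.ReLU(),
            nn.Linear(512, 512), nn.ReLU(),
        )
        self.embedding_layer = nn.Linear(512, emb_dim)   # embedding extraction layer
        self.classifier = nn.Linear(emb_dim, num_speakers)

    def forward(self, frames):
        # frames: (batch, time, feat_dim); mean over time is a simple pooling choice
        h = self.encoder(frames).mean(dim=1)
        emb = self.embedding_layer(h)
        return self.classifier(emb), emb  # logits for training, embedding for extraction
```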


Application Information

IPC(8): G10L17/04; G10L17/08; G10L17/18; G06N3/08
CPC: G06N3/08; G10L17/04; G10L17/08; G10L17/18
Inventors: 俞凯, 钱彦旻, 王帅, 杨叶新
Owner: AISPEECH CO LTD