Few-shot sentiment classification method based on knowledge distillation from large and small teachers
A sentiment classification and teacher-model technology, applied in the field of few-shot sentiment classification based on knowledge distillation from large and small teachers. It addresses problems such as slow inference speed and obstacles to practical application, and achieves the effects of reducing resource consumption, improving accuracy, and shortening distillation time.
Detailed Description of the Embodiments
[0062] The present invention is further described below with reference to the accompanying drawings and specific embodiments, so that those skilled in the art can better understand and implement it; the embodiments, however, are not intended to limit the present invention.
[0063] In model optimization, a large model is often a single complex network or an ensemble of several networks, and has good performance and generalization ability, while a small model has limited expressive power because of its small network size. The knowledge learned by the large model (the teacher model) can therefore be used to guide the training of the small model (the student model), so that the small model approaches the performance of the large model while its number of parameters is greatly reduced, thereby achieving model compression and acceleration. This process is called knowledge distillation.
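As a concrete illustration of this process, the sketch below shows a standard distillation loss in PyTorch: the student is trained against a weighted sum of a soft-target term (KL divergence to the teacher's temperature-softened outputs) and the usual hard-label cross-entropy. The temperature T and the weight alpha are illustrative assumptions, not values prescribed by the present invention.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Weighted sum of soft-target KL and hard-label cross-entropy.

    T and alpha are hypothetical defaults for illustration only.
    """
    # Soft targets: match the student's softened distribution to the teacher's.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # the usual T^2 factor keeps gradient magnitudes comparable
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Typical training step (teacher frozen):
#   with torch.no_grad():
#       teacher_logits = teacher(batch)
#   loss = distillation_loss(student(batch), teacher_logits, labels)
```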
[0064] ...and Figure 2. Compared to the traditional single-teacher and single-student...
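Although the remainder of paragraph [0064] is lost in this excerpt, the contrast it draws with traditional single-teacher, single-student distillation suggests a student supervised jointly by a large teacher and a small teacher. The sketch below combines the two teachers' softened outputs by a weighted average; that combination rule, the weight w_large, and all other parameters are assumptions for illustration, not the method specified by the patent.

```python
import torch.nn.functional as F

def two_teacher_soft_targets(large_logits, small_logits, w_large=0.7, T=2.0):
    # Soften each teacher's output, then mix them; the weighting is hypothetical.
    p_large = F.softmax(large_logits / T, dim=-1)
    p_small = F.softmax(small_logits / T, dim=-1)
    return w_large * p_large + (1.0 - w_large) * p_small

def two_teacher_student_loss(student_logits, large_logits, small_logits,
                             labels, T=2.0, alpha=0.5):
    # Same soft/hard decomposition as single-teacher distillation, but the
    # soft target now blends the large and small teachers' predictions.
    target = two_teacher_soft_targets(large_logits, small_logits, T=T)
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    target, reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```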