Compression method and system for multi-language BERT sequence labeling model
A multi-language sequence labeling technology in the field of knowledge distillation for BERT-like models, addressing the problem that existing compression approaches do not take the multilingual setting into account and cannot effectively fit the output of the teacher model.
Embodiment Construction
[0095] The present invention will be described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit the present invention in any form. It should be noted that those skilled in the art can make several changes and improvements without departing from the concept of the present invention. These all belong to the protection scope of the present invention.
[0096] An embodiment of the present invention provides a compression method for a multilingual BERT sequence labeling model which, with reference to Figure 1, includes:
[0097] Step 1: Extract a vocabulary from a multilingual corpus based on the WordPiece algorithm;
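The WordPiece vocabulary extracted in Step 1 is later used to segment words into subword units by greedy longest-match-first lookup. A minimal sketch of that segmentation step is shown below; the vocabulary and function name are illustrative, not taken from the patent.

```python
# Hypothetical sketch of WordPiece-style tokenization: greedily match the
# longest vocabulary entry at each position; continuation pieces carry "##".
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Split a single word into WordPiece subwords via greedy longest match."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        piece = None
        # Try the longest remaining substring first, shrinking from the right.
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # mark word-internal continuation pieces
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:
            return [unk]  # no subword matched: the whole word is unknown
        tokens.append(piece)
        start = end
    return tokens

# Illustrative toy vocabulary (a real one is learned from the corpus):
vocab = {"un", "##aff", "##able", "play", "##ing"}
print(wordpiece_tokenize("unaffable", vocab))  # ['un', '##aff', '##able']
print(wordpiece_tokenize("playing", vocab))    # ['play', '##ing']
```

In the multilingual setting, the same shared vocabulary covers all languages, which is what lets a single teacher and student model process mixed-language input.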
[0098] Step 2: Pre-train a multilingual BERT teacher model and a multilingual BERT student model;
[0099] Step 3: Fine-tune the multi-/single-language BERT teacher model on manually labeled downstream-task data;
[0100] Step 4: Use the multi...
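Steps 2–4 describe a teacher–student setup in which the student is trained to fit the teacher's output. A common way to do this (shown here as an assumed sketch, not the patent's exact loss) is to minimize, per token, the KL divergence between the teacher's and student's temperature-softened label distributions; all names and values below are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss_per_token(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over one token's sequence-label distribution."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative logits over three sequence labels (e.g. {B, I, O}) for one token:
teacher = [4.0, 1.0, 0.2]
student = [3.5, 1.2, 0.1]
print(distill_loss_per_token(teacher, student))  # non-negative; 0 iff equal
```

Summing this loss over all tokens (and typically mixing it with the hard-label cross-entropy) gives the student a soft training signal that carries the teacher's uncertainty, which is what allows the smaller student to approximate the teacher's behavior.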