
Text processing model training method and device, text processing method and device and storage medium

A text processing and model training technology, applied in the field of information processing. It addresses the problems of weak RNN feature-extraction performance, the resulting difficulty of text processing models in producing high-quality text processing results, and the impact of these limitations on dynamic commodity advertisement generation, so as to improve the accuracy and readability of generated text, increase training accuracy and training speed, and avoid those adverse effects.

Pending Publication Date: 2020-04-03
TENCENT TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0002] In related technologies, the main text processing methods used to compress long text into short text when generating dynamic commodity advertisements are RNN-based generative processing and a combination of RNN-based generative and extractive methods. In both approaches the RNN serves as the semantic-feature and comprehensive-feature extractor, and its performance is weak. This limitation of RNN capability makes it difficult for the text processing model to produce high-quality text processing results, which in turn degrades dynamic commodity advertisement generation.
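
To make that baseline concrete, the following is a minimal sketch, in PyTorch, of the kind of RNN-based generative compressor described above: a GRU encoder distills the long product text into a single semantic feature, and a GRU decoder generates the short text from it. The class name, layer sizes, and the choice of GRU are illustrative assumptions, not details taken from the patent.

    import torch
    import torch.nn as nn

    class RnnCompressor(nn.Module):
        """Illustrative RNN-based generative baseline (not the patent's model)."""

        def __init__(self, vocab_size=30000, emb_dim=256, hidden_dim=512):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)  # semantic feature extractor
            self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)  # generative decoder
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, long_ids, short_ids):
            # Encode the long text; the final hidden state acts as the "comprehensive feature".
            _, state = self.encoder(self.embed(long_ids))
            # Decode the short text from that single fixed-size state -- the
            # bottleneck that limits quality as the input text grows longer.
            dec_out, _ = self.decoder(self.embed(short_ids), state)
            return self.out(dec_out)  # per-token vocabulary logits

For example, RnnCompressor()(torch.randint(0, 30000, (2, 400)), torch.randint(0, 30000, (2, 30))) returns logits of shape (2, 30, 30000); the entire 400-token input must pass through one fixed-size hidden state, which is the capability limitation this paragraph refers to.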




Embodiment Construction

[0088] In order to make the purpose, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present invention, and all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0089] In the following description, references to "some embodiments" describe a subset of all possible embodiments. It should be understood that "some embodiments" may refer to the same subset or to different subsets of all possible embodiments, and that these subsets may be combined with one another where no conflict arises.

[0090] Before the embodiments of the present invention are described in further detail, the nouns and terms involved in the embodiments of the present invention are explained; the nouns and terms involved in the...



Abstract

The invention provides a text processing model training method. The method comprises the steps of: obtaining a first training sample set; denoising the first training sample set to form a corresponding second training sample set; processing the second training sample set through a text processing model to determine initial parameters of the text processing model; in response to the initial parameters of the text processing model, processing the second training sample set through the text processing model to determine updated parameters of the text processing model; and iteratively updating encoder parameters and decoder parameters of the text processing model through the second training sample set according to the updated parameters. The invention further provides a text processing method and device and a storage medium. According to the invention, the generalization ability of the text processing model is stronger, the training precision and training speed of the text processing model are improved, and the accuracy and readability of text generation are improved.
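
As a rough illustration only, the sketch below maps the abstract's steps onto a generic PyTorch training loop. Every name in it (denoise, train_text_processing_model, the Adam and cross-entropy choices) is a hypothetical stand-in: the abstract fixes the ordering of the steps, not their implementation, and model is assumed to be any encoder-decoder that takes (source_ids, target_ids) and returns per-token logits.

    import torch
    import torch.nn as nn

    def denoise(first_set):
        # Hypothetical denoising: drop pairs with an empty target text. The abstract
        # only states that a "second training sample set" is formed by denoising.
        return [(src, tgt) for src, tgt in first_set if tgt.numel() > 0]

    def train_text_processing_model(model, first_set, epochs=3, lr=1e-4):
        # Step 1 is assumed done by the caller: first_set holds (long_ids, short_ids) pairs.
        # Step 2: denoise the first training sample set into the second set.
        second_set = denoise(first_set)

        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()

        # Step 3: a first pass over the second set settles the initial parameters.
        # Steps 4-5: the remaining epochs iteratively refine encoder and decoder
        # parameters on the second sample set according to the computed updates.
        for _ in range(1 + epochs):
            for src_ids, tgt_ids in second_set:
                logits = model(src_ids, tgt_ids)                 # (batch, seq, vocab)
                loss = loss_fn(logits.transpose(1, 2), tgt_ids)  # token-level loss
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()                                 # updates encoder + decoder jointly
        return model

In this reading, the "initial parameters" correspond to the model state after the first pass over the denoised set, and the later epochs carry out the iterative update of encoder and decoder parameters; the actual update rule is not specified in the excerpt above.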

Description

Technical Field

[0001] The invention relates to information processing technology, and in particular to a text processing model training method, a text processing method, a device, and a storage medium.

Background Technique

[0002] In related technologies, the main text processing methods used to compress long text into short text when generating dynamic commodity advertisements are RNN-based generative processing and a combination of RNN-based generative and extractive methods. In both approaches the RNN serves as the semantic-feature and comprehensive-feature extractor, and its performance is weak. This limitation of RNN capability makes it difficult for the text processing model to produce high-quality text processing results, which in turn degrades dynamic commodity advertisement generation.

Contents of the Invention

[0003] In view of this, an embodiment of the present invention provides a text p...


Application Information

IPC(8): G06F40/126, G06F40/289, G06K9/62
CPC: G06F18/214
Inventor: 李少波
Owner: TENCENT TECH (SHENZHEN) CO LTD