
Session scene information extraction method

A technology for extracting scene information and user information, applied in the field of conversation scene information extraction. It addresses problems such as information interference, and achieves improved baseline performance, low maintenance and iteration cost, and low complexity.

Pending Publication Date: 2022-04-12
北京尘锋信息技术有限公司

AI Technical Summary

Problems solved by technology

[0003] However, in most marketing conversation scenarios there is substantial information interference and heavily colloquial language. For example, a conversation may contain many names, only one of which (or none) is the customer's information; or the questioner asks a question and the customer denies it. Distinguishing which information belongs to the user and which does not, and which questions the customer confirms and which they do not, is the key to automatically extracting user information.
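To make the interference problem concrete, the toy dialogue below shows why naive name extraction fails in a marketing conversation. The regex-based "extractor" is a deliberate strawman, not the patented method, and all names and utterances are invented examples.

```python
# Illustrative only: naive extraction over-collects names, including the
# agent's own name and a misattributed one -- exactly the interference
# described in [0003]. This is NOT the patented method.
import re

dialogue = [
    ("agent",    "Hi, this is Li Wei from the sales team. Am I speaking with Zhang San?"),
    ("customer", "No, Zhang San is my colleague. I'm Wang Fang."),
    ("agent",    "Sorry about that, Wang Fang. Is your budget around 50,000?"),
    ("customer", "No, that's too high for us."),
]

# A naive extractor grabs every name it sees, regardless of speaker role
# or whether the customer confirmed or denied the information.
names = set()
for speaker, text in dialogue:
    names.update(re.findall(r"\b(?:Li Wei|Zhang San|Wang Fang)\b", text))

print(sorted(names))  # three names collected, but only "Wang Fang" is the customer's
```

Separating the one correct name from the two distractors requires exactly the speaker-role and confirmation modeling the patent targets.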




Embodiment Construction

[0048] The technical solution of this patent will be further described in detail below in conjunction with specific embodiments.

[0049] Embodiments of the present patent are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are only used for explaining the patent, and should not be construed as limiting the patent.

[0050] Referring to figure 1, a conversation scene information extraction method includes a BERT model. The BERT model comprises an utterance classification unit, an utterance processing unit, an utterance training unit, an utterance reprocessing unit, and an utterance prediction unit. The utterance classification unit classifies the utterances of both parties in the question-and-answer dialogue and matches them to information categories; the utteranc...
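The patent text publishes no source code, so the following is only a structural sketch of how the units named in [0050] might be wired together. Every class and method name is an assumption, and the keyword heuristic is a stub standing in for the BERT-based classification the patent describes; the training, reprocessing, and prediction units are omitted for brevity.

```python
# Hypothetical pipeline skeleton for the units named in [0050].
# All names are illustrative assumptions, not the patented implementation.
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str   # "questioner" or "customer"
    text: str

class UtteranceClassificationUnit:
    """Matches each utterance of the two parties to an information category."""
    CATEGORIES = ("name", "budget", "other")

    def classify(self, utt: Utterance) -> str:
        # Stub heuristic; the patent would use BERT-based classification here.
        lowered = utt.text.lower()
        if "i'm" in lowered or "my name" in lowered:
            return "name"
        if "budget" in lowered:
            return "budget"
        return "other"

class UtteranceProcessingUnit:
    """Adds speaker markers so a downstream encoder can tell the parties apart."""
    def mark(self, utt: Utterance) -> str:
        return f"[{utt.speaker.upper()}] {utt.text}"

class ConversationExtractor:
    """Wires the units into one pipeline (training, reprocessing, and
    prediction units are omitted from this sketch)."""
    def __init__(self):
        self.classifier = UtteranceClassificationUnit()
        self.processor = UtteranceProcessingUnit()

    def extract(self, dialogue):
        results = []
        for utt in dialogue:
            category = self.classifier.classify(utt)
            if category != "other":
                results.append((category, self.processor.mark(utt)))
        return results

dialogue = [
    Utterance("questioner", "May I have your name?"),
    Utterance("customer", "I'm Wang Fang."),
]
print(ConversationExtractor().extract(dialogue))
# -> [('name', "[CUSTOMER] I'm Wang Fang.")]
```

In a real system the stub classifier would be replaced by a fine-tuned BERT encoder, with the marked text as its input.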



Abstract

The invention belongs to the technical field of information artificial intelligence, and in particular relates to a session scene information extraction method. To address the problem of interfering information in conversation text, the invention provides the following scheme: a session scene information extraction method comprising a BERT model, wherein the BERT model comprises an utterance classification unit, an utterance processing unit, an utterance training unit, an utterance reprocessing unit, and an utterance prediction unit. The utterance classification unit classifies the utterances of both parties in the question-and-answer dialogue and matches them to information categories; the utterance processing unit adds appropriate marks, captures semantic codes, and formats window data; the utterance training unit classifies the utterances through the codes. The method ensures the accuracy of the extracted information through a combined coarse-to-fine approach.
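The abstract's "adding proper marks" and "formatting window data" steps can be sketched as a small preprocessing routine: each utterance is wrapped with a speaker tag and padded with a fixed window of surrounding dialogue context before encoding. The tag names (`[QST]`/`[ANS]`) and the window size are illustrative assumptions, not values stated in the patent.

```python
# Hedged sketch of the mark-and-window preprocessing from the abstract.
def format_windows(dialogue, window=1):
    """dialogue: list of (speaker, text) pairs; speaker is 'Q' or 'A'.
    Returns one speaker-marked context string per target utterance."""
    tag = {"Q": "[QST]", "A": "[ANS]"}
    marked = [f"{tag[s]} {t}" for s, t in dialogue]
    samples = []
    for i in range(len(marked)):
        # Keep `window` utterances of context on each side of utterance i.
        lo, hi = max(0, i - window), min(len(marked), i + window + 1)
        samples.append(" ".join(marked[lo:hi]))
    return samples

dialogue = [
    ("Q", "Is your budget around 50,000?"),
    ("A", "No, that's too high."),
    ("Q", "What range works for you?"),
]
for sample in format_windows(dialogue):
    print(sample)
```

Pairing each answer with the question that prompted it is what lets a classifier decide whether the customer confirmed or denied the information.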

Description

Technical field

[0001] The invention relates to the technical field of information artificial intelligence, and in particular to a method for extracting conversation scene information.

Background technique

[0002] In a conversation scenario, the conversation text contains a great deal of information, among which the user's basic personal information and personal characteristic information are extremely important for building user portraits and promoting business.

[0003] However, in most marketing conversation scenarios there is substantial information interference and heavily colloquial language. For example, a conversation may contain many names, only one of which (or none) is the customer's information; or the questioner asks a question and the customer denies it. Distinguishing which information belongs to the user and which does not, and which questions the customer confirms and which they do not, is the key to automatically extracting user information.

[0004] In response ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/35, G06F16/332, G06F16/33, G06F40/30, G06F40/295
Inventors: 赵继帆, 谭波
Owner: 北京尘锋信息技术有限公司