
Code generation method based on abstract syntax tree structure information enhancement

An abstract syntax tree and code generation technology, applied in the field of machine translation, which addresses the problems that the model has poor prediction performance and does not perceive the structure of the abstract syntax tree, so as to reduce the impact of prediction errors.

Pending Publication Date: 2022-04-22
XIAMEN UNIV
Cites: 0 | Cited by: 4

AI Technical Summary

Problems solved by technology

[0002] In the related art, existing code generation methods use a model under the neural encoder-decoder framework to convert an input source-language sentence into the abstract syntax tree of the target code. Compared with sequence-to-sequence models, this better guarantees that the generated code is syntactically correct. However, when generating the current node of the abstract syntax tree, the model usually considers only the information of the parent node and the previously predicted node. This approach neither perceives the structure of the entire abstract syntax tree nor strengthens the decoder's perception of the abstract syntax tree, resulting in poor prediction performance of the model.
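
To make the described limitation concrete, the sketch below shows a decoding step that scores candidate nodes using only the parent node and the previously predicted node. All names in it (predict_next_node, score, node_vocab) are hypothetical and illustrate the conventional approach criticized above, not the patent's model.

```python
# Hypothetical illustration of the limitation described above: the next AST
# node is chosen from local context only (parent node + previous node), so
# the decoder never sees the structure of the rest of the tree.
def predict_next_node(parent_node, previous_node, node_vocab, score):
    best_node, best_score = None, float("-inf")
    for candidate in node_vocab:
        s = score(parent_node, previous_node, candidate)  # local signals only
        if s > best_score:
            best_node, best_score = candidate, s
    return best_node  # siblings, depth, and the wider subtree are never consulted

# Toy usage with a dummy scoring function
dummy_score = lambda parent, prev, cand: len(cand)
print(predict_next_node("FunctionDef", "arguments", ["Return", "If", "For"], dummy_score))
# -> "Return"
```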



Embodiment Construction

[0032] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0033] In order to better understand the above technical solutions, exemplary embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present invention are shown in the drawings, it should be understood that the invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that the present invention will be understood more thoroughly and its scope will be fully conveyed to those skilled in the art.



Abstract

The invention discloses a code generation method based on abstract syntax tree structure information enhancement, and a medium. The method comprises: obtaining manually annotated code generation data, where the code generation data comprise a natural language sentence and the corresponding target code; parsing the target code with a parser to obtain the abstract syntax tree corresponding to the target code, deriving the prediction sequence of the abstract syntax tree from that tree, and taking the natural language sentence and the prediction sequence of the abstract syntax tree as training data; establishing a code generation model comprising an encoder, a decoder, a historical information enhancement module and a future information enhancement module, and training the code generation model with the training data; and inputting a natural language sentence to be processed into the trained code generation model for conversion, so as to generate the target code and the abstract syntax tree corresponding to that sentence, thereby reducing the prediction error of the model.
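
As a concrete, hedged illustration of the data-preparation step in the abstract, the sketch below parses a target-code snippet with Python's standard ast module and linearizes the resulting abstract syntax tree into a node-type sequence by pre-order traversal. The pre-order linearization and the helper name ast_to_prediction_sequence are assumptions for illustration; the patent's exact prediction sequence is not disclosed in this listing.

```python
import ast

def ast_to_prediction_sequence(code):
    """Linearize the AST of `code` into a flat node-type sequence via
    pre-order (depth-first) traversal. One plausible linearization only;
    the patent's exact prediction sequence is not shown in this listing."""
    sequence = []

    def visit(node):
        sequence.append(type(node).__name__)       # record the node type
        for child in ast.iter_child_nodes(node):   # then visit children in order
            visit(child)

    visit(ast.parse(code))
    return sequence

# Example training pair: natural-language sentence + target code
nl_sentence = "return the sum of a and b"
target_code = "def add(a, b):\n    return a + b"
print(ast_to_prediction_sequence(target_code))
# ['Module', 'FunctionDef', 'arguments', 'arg', 'arg', 'Return',
#  'BinOp', 'Name', 'Load', 'Add', 'Name', 'Load']
```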

Description

Technical Field

[0001] The present invention relates to the technical field of machine translation, and in particular to a code generation method based on abstract syntax tree structure information enhancement and a computer-readable storage medium.

Background

[0002] In the related art, existing code generation methods use a model under the neural encoder-decoder framework to convert an input source-language sentence into the abstract syntax tree of the target code. Compared with sequence-to-sequence models, this better guarantees that the generated code is syntactically correct. However, when generating the current node of the abstract syntax tree, the model usually considers only the information of the parent node and the previously predicted node. This approach neither perceives the structure of the entire abstract syntax tree nor strengthens the decoder's perception of the abstract syntax tree, resulting in poor prediction performance of the model.
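
The abstract names four components: an encoder, a decoder, a historical information enhancement module and a future information enhancement module. Below is a structural sketch only, assuming a Transformer-based encoder-decoder in PyTorch; the internal design of the two enhancement modules is not disclosed in this listing, so they appear as placeholder attention layers, and every class and parameter name here is hypothetical.

```python
import torch
import torch.nn as nn

class ASTStructureEnhancedGenerator(nn.Module):
    """Structural sketch of the four components named in the abstract
    (encoder, decoder, historical information enhancement module, future
    information enhancement module). The enhancement modules are shown as
    plain attention placeholders because their real design is not
    disclosed in this listing; all sizes and names are hypothetical."""

    def __init__(self, nl_vocab_size, node_vocab_size, d_model=256, nhead=4):
        super().__init__()
        self.src_embed = nn.Embedding(nl_vocab_size, d_model)     # NL tokens
        self.node_embed = nn.Embedding(node_vocab_size, d_model)  # AST node types
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
        # Placeholders: a real model would feed these modules information about
        # the already-generated (historical) and not-yet-generated (future)
        # parts of the AST prediction sequence.
        self.history_enhancer = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.future_enhancer = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.out = nn.Linear(d_model, node_vocab_size)

    def forward(self, nl_tokens, node_sequence):
        memory = self.encoder(self.src_embed(nl_tokens))       # encode the NL sentence
        hidden = self.decoder(self.node_embed(node_sequence), memory)
        hist, _ = self.history_enhancer(hidden, hidden, hidden)
        fut, _ = self.future_enhancer(hidden, hidden, hidden)
        return self.out(hidden + hist + fut)                   # logits over node types
```

During training, node_sequence would be the pre-order prediction sequence built from the parser output, as sketched after the abstract above.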

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F 8/41; G06F 40/58
CPC: G06F 8/447; G06F 40/58; Y02D 10/00
Inventors: 苏劲松 (Su Jinsong), 蒋辉 (Jiang Hui), 曾华琳 (Zeng Hualin)
Owner: XIAMEN UNIV