
A method for predicting prison term based on multi-task artificial neural network

An artificial-neural-network prediction technology applied in the field of prison-term prediction based on multi-task artificial neural networks. It addresses the problems of large deviations in actual prediction results and of ignoring useful information associations and relevant relationships, and achieves the effect of improved accuracy.

Active Publication Date: 2019-02-22
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

[0004] Although sentence prediction methods based on artificial neural networks have, to a certain extent, realized intelligent mining and use of judgment-document information, the traditional approach predicts the sentence directly from the description of the crime facts. It fails to make full use of the large amount of information contained in judgment documents and ignores the correlations between the sentence and information in other dimensions. A sentence prediction method based on a single-task artificial neural network takes only the crime facts as input and the sentence as output, ignoring the correlations among useful information in various dimensions of the judgment documents, such as the charges, the basic information of the suspect, and the attribute information described in the crime facts. This leads to problems such as poor convergence of model training and large deviations in actual prediction results, making it difficult to meet practical application requirements.



Examples


Embodiment 1

[0067] A method for predicting a prison term based on a multi-task artificial neural network, as shown in Figure 1, comprising the following steps:

[0068] (1) Preprocessing the raw data:

[0069] Extract the required information from the raw judgment documents, structure the data, and construct a structured data set;
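The excerpt does not spell out the extraction rules, so the sketch below is only a minimal Python illustration of what "structure the data" could look like; the field names, the regular expressions, and the conversion of the prison term to months are illustrative assumptions, not taken from the patent.

```python
import re

def structure_judgment(raw_text):
    """Minimal sketch: pull a few fields out of one raw judgment document.

    The field names and patterns are illustrative assumptions; the patent
    only states that required information is extracted and organized into
    a structured data set.
    """
    record = {
        "fact": None,         # description of the crime facts (model input)
        "charge": None,       # charge / accusation (auxiliary classification target)
        "term_months": None,  # prison term in months (regression target)
    }
    fact = re.search(r"经审理查明[，,](.+?)(?:上述事实|$)", raw_text, re.S)
    if fact:
        record["fact"] = fact.group(1).strip()
    charge = re.search(r"犯(.+?)罪", raw_text)
    if charge:
        record["charge"] = charge.group(1) + "罪"
    term = re.search(r"判处有期徒刑(\d+)年(?:(\d+)个月)?", raw_text)
    if term:
        record["term_months"] = int(term.group(1)) * 12 + int(term.group(2) or 0)
    return record

# the structured data set is then one record per judgment document:
# dataset = [structure_judgment(text) for text in raw_documents]
```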

[0070] (2) Training stage:

[0071] Randomly divide the structured data set into two parts at a ratio of 8:2. The larger part is shuffled and divided into N folds; each time, N-1 folds are used for training and 1 fold for validation, and N rounds of cross-validation are performed to evaluate the performance of the model. The smaller part is used as the test data set. Obtain the training data required for the current training stage, apply word segmentation and word-vector mapping to the training data in turn, feed the result into the model, and obtain the output;
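As a rough sketch of this splitting and preprocessing scheme (scikit-learn's KFold, the jieba segmenter, N = 5, and the `embedding_table` lookup are assumptions; the excerpt only names the operations):

```python
import random
import jieba                                  # assumed Chinese word segmenter
from sklearn.model_selection import KFold

def split_dataset(records, test_ratio=0.2, n_folds=5, seed=42):
    """8:2 split into a train/validation pool and a held-out test set,
    then N-fold cross-validation on the larger part (N assumed to be 5)."""
    shuffled = records[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    train_pool, test_set = shuffled[:cut], shuffled[cut:]

    folds = []
    for train_idx, val_idx in KFold(n_splits=n_folds, shuffle=True,
                                    random_state=seed).split(train_pool):
        folds.append((
            [train_pool[i] for i in train_idx],   # N-1 parts for training
            [train_pool[i] for i in val_idx],     # 1 part for validation
        ))
    return folds, test_set

def to_word_vectors(fact_text, embedding_table, dim=300):
    """Word segmentation followed by word-vector mapping (the lookup table
    and vector size are illustrative assumptions)."""
    tokens = list(jieba.cut(fact_text))
    return [embedding_table.get(tok, [0.0] * dim) for tok in tokens]
```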

[0072] The model, as shown in Figure 2, includes a word vector embedding layer, a bidirectional LSTM layer, a maximum ...
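The paragraph is cut off after "a maximum ...", so the exact stack beyond the embedding and bidirectional LSTM layers is not visible here. The sketch below assumes a max-pooling layer over the LSTM outputs feeding one regression head for the prison term plus classification heads for the charge and intermediate attributes; the layer sizes, head counts, and PyTorch framing are all assumptions.

```python
import torch.nn as nn

class MultiTaskTermModel(nn.Module):
    """Sketch of a multi-task network along the lines described above:
    word embedding -> bidirectional LSTM -> (assumed) max pooling,
    then one prison-term regression head and several classification
    heads for the charge / intermediate attributes."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=256,
                 num_charges=100, attr_classes=(2, 2, 2)):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        feat_dim = 2 * hidden_dim
        self.term_head = nn.Linear(feat_dim, 1)              # prison-term regression
        self.charge_head = nn.Linear(feat_dim, num_charges)  # charge classification
        self.attr_heads = nn.ModuleList(
            nn.Linear(feat_dim, n) for n in attr_classes     # intermediate attributes
        )

    def forward(self, token_ids):
        x = self.embedding(token_ids)     # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)             # (batch, seq_len, 2 * hidden_dim)
        feat, _ = h.max(dim=1)            # assumed max pooling over the sequence
        return {
            "term": self.term_head(feat).squeeze(-1),
            "charge": self.charge_head(feat),
            "attrs": [head(feat) for head in self.attr_heads],
        }
```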

Embodiment 2

[0086] A method for predicting a prison term based on a multi-task artificial neural network as described in Embodiment 1, differing in that:

[0087] In step (2), for the classification tasks of predicting the charge and the intermediate attributes, the classification error in the form of cross-entropy is used to calculate the error between the output and the target; the cross-entropy formula is shown in formula (I):

[0088] H_{y′}(y) = −Σ_i y′_i · log(y_i)   (I)

[0089] In formula (I), y′_i is the i-th value in the label, y_i is the corresponding predicted component, and H_{y′}(y) denotes the cross-entropy; the smaller the cross-entropy, the more accurate the classification;
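As a small numeric illustration of formula (I) (the label and prediction values below are made up for the example):

```python
import math

y_true = [0.0, 1.0, 0.0]   # one-hot label y'
y_pred = [0.1, 0.7, 0.2]   # predicted probability distribution y

# H_{y'}(y) = -sum_i y'_i * log(y_i); zero-label terms contribute nothing
cross_entropy = -sum(t * math.log(p) for t, p in zip(y_true, y_pred) if t > 0)
print(round(cross_entropy, 4))   # 0.3567, i.e. -log(0.7)
```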

[0090] In step (2), for the prison-term regression task, the mean square error is used to calculate the error between the predicted prison term and the actual (target) prison term; the mean square error formula is shown in formula (II):

[0091] MSE = (1/n) · Σ_{i=1}^{n} (y′_i − y_i)²   (II)

[0092] In formula (II), y′_i is the i-th value in...
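Putting the two error terms together, the snippet below sketches how the cross-entropy of formula (I) and the mean square error of formula (II) could be combined into one training objective for the model sketch above; the equal task weights are an assumption, since the excerpt shown here does not state how the tasks are weighted.

```python
import torch.nn as nn

ce_loss = nn.CrossEntropyLoss()   # formula (I): charge / attribute classification error
mse_loss = nn.MSELoss()           # formula (II): prison-term regression error

def multi_task_loss(outputs, targets, cls_weight=1.0, reg_weight=1.0):
    """outputs/targets follow the dict layout of the model sketch above;
    the equal task weights are an illustrative assumption."""
    loss = reg_weight * mse_loss(outputs["term"], targets["term"].float())
    loss = loss + cls_weight * ce_loss(outputs["charge"], targets["charge"])
    for logits, labels in zip(outputs["attrs"], targets["attrs"]):
        loss = loss + cls_weight * ce_loss(logits, labels)
    return loss
```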



Abstract

A method for predicting a prison term based on a multi-task artificial neural network includes analyzing and mining judgment-document data and predicting the term of imprisonment for a newly obtained description of crime facts and the basic information of a suspect. Guided by the attributes of the legal clauses cited in judgment documents, the method makes full use of the multi-dimensional data in judgment documents to construct a multi-task neural network model in which the tasks support one another, and trains it on the pre-processed judgment-document data to obtain a prediction method with high precision and strong practicability. The method takes 21 legal-clause attributes, such as the charge, the degree of injury, and the amount of money involved, as auxiliary tasks, and classifies life sentences and death sentences separately, so as to achieve accurate prediction of the prison term. Compared with a single-task neural network model based on the same attributes, the prediction accuracy of the proposed method is higher, indicating that the method is effective and practical.

Description

Technical field [0001] The invention relates to a method for predicting a prison term based on a multi-task artificial neural network, belonging to the technical field of natural language processing. Background technique [0002] The prison-term prediction problem is one of the important basic problems in the information mining and analysis of judgment documents. Its goal is to predict, from the description of the crime facts and on the basis of the law and other relevant information, the sentence that will be imposed for those facts. It can be used for subsequent automatic trial and intelligent legal consultation. In recent years, the development and application of natural language processing technology based on neural networks has greatly promoted the development of judgment-document information processing and mining. The traditional sentence prediction method based on an artificial neural network starts from the description of the crime facts and directly predicts the se...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/332; G06F16/35; G06N3/04; G06Q50/18
CPC: G06Q50/18; G06N3/045
Inventors: 李玉军, 冀先朋, 马浩洋, 韩均雷
Owner: SHANDONG UNIV