
Training method, device, storage medium and electronic equipment of random forest

A random forest and training data technology, applied in the field of machine learning, can solve the problems of reduced classification prediction accuracy and low prediction accuracy, and achieve the effect of improving accuracy

Active Publication Date: 2021-08-13
NEUSOFT CORP
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

However, a decision tree that undergoes only a single round of training has low prediction accuracy, and the training process cannot cope with imbalanced class distributions in the training data (where one category contains far more samples than the others), which in turn reduces the accuracy of the overall classification prediction.
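The imbalance problem described above can be made concrete with a small illustration (not from the patent): on skewed data, a classifier that always predicts the majority class appears accurate while never recognizing the minority class.

```python
# Illustration of the class-imbalance problem: a trivial classifier that
# always predicts the majority class looks accurate on skewed data while
# learning nothing about the minority class.

labels = [0] * 95 + [1] * 5              # 95% of samples in one class
always_majority = [0] * len(labels)      # predict the majority class always

accuracy = sum(p == y for p, y in zip(always_majority, labels)) / len(labels)
minority_recall = sum(
    p == y for p, y in zip(always_majority, labels) if y == 1
) / 5

print(accuracy)         # 0.95 — misleadingly high
print(minority_recall)  # 0.0 — minority class is never predicted
```

This is why plain accuracy on imbalanced data can mask a useless classifier, motivating the iterative data-rebalancing scheme the patent describes.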

Method used



Examples


Embodiment Construction

[0064] Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numerals in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatuses and methods consistent with aspects of the present disclosure as recited in the appended claims.

[0065] Figure 1 is a flow chart of a random forest training method according to an exemplary embodiment. As shown in Figure 1, the method includes:

[0066] Step 101: determine n groups of training data sets from the first training data.

[0067] Here, the first training data (also referred to as the full training data) includes descriptive data corresponding to similar e...
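Step 101 can be sketched as drawing n per-tree training sets from the full training data. Bootstrap sampling (with replacement) is assumed here because it is the conventional scheme for random forests; the patent text above is truncated and does not confirm the exact sampling method.

```python
import random

# Minimal sketch of step 101, assuming bootstrap sampling (with
# replacement). The function name is hypothetical, not from the patent.

def sample_training_sets(first_training_data, n, seed=0):
    """Draw n same-sized samples (with replacement) from the full data."""
    rng = random.Random(seed)
    size = len(first_training_data)
    return [
        [first_training_data[rng.randrange(size)] for _ in range(size)]
        for _ in range(n)
    ]

# Five labelled samples (descriptive data, category), three training sets.
data = [("a", 0), ("b", 1), ("c", 0), ("d", 1), ("e", 0)]
training_sets = sample_training_sets(data, n=3)
```

Each of the n sets has the same size as the full data but a different composition, so each tree sees a different view of the training distribution.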



Abstract

The present disclosure relates to a random forest training method, device, storage medium and electronic equipment. The method includes: determining n groups of training data sets from the first training data; training n trees with those sets to obtain n prediction results; deleting trees according to the accuracy of the n prediction results and a preset threshold, obtaining m trees; voting among the m trees to obtain a target tree; synthesizing the prediction results corresponding to the target tree with the description data into second training data; and, using the second training data as the first training data, executing the above steps in a loop until the accuracy of the n prediction results is greater than or equal to the preset threshold, at which point the random forest is obtained. The method continuously optimizes the overall training data across multiple rounds of random forest training, and improves the accuracy of classification prediction while avoiding the accumulation of trees biased toward a single feature during training.
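The training loop in the abstract can be sketched end to end. This is a toy illustration only: the "trees" below are trivial majority-class stumps so the example stays self-contained, whereas the patented method trains real decision trees; all helper names are hypothetical.

```python
import random
from collections import Counter

# Toy sketch of the iterative scheme in the abstract. The "trees" are
# majority-class stumps (they ignore the features), standing in for real
# decision trees. Helper names are placeholders, not the patent's API.

def train_stump(train_set):
    """A stand-in 'tree': always predicts the majority class of its data."""
    majority = Counter(label for _, label in train_set).most_common(1)[0][0]
    return lambda features: majority

def accuracy(tree, data):
    return sum(tree(x) == y for x, y in data) / len(data)

def train_forest(data, n, threshold, rounds=5, seed=0):
    rng = random.Random(seed)
    size = len(data)
    trees = []
    for _ in range(rounds):
        # Determine n training sets (bootstrap sampling assumed).
        sets = [[data[rng.randrange(size)] for _ in range(size)]
                for _ in range(n)]
        # Train n trees and score each against the full training data.
        trees = [train_stump(s) for s in sets]
        accs = [accuracy(t, data) for t in trees]
        # Stop once every tree's accuracy clears the preset threshold.
        if all(a >= threshold for a in accs):
            return trees
        # Keep the m trees above threshold; vote (here: pick the most
        # accurate) for a target tree.
        kept = [t for t, a in zip(trees, accs) if a >= threshold] or trees
        target = max(kept, key=lambda t: accuracy(t, data))
        # Synthesize the target tree's predictions with the description
        # data into the second training data, then loop.
        data = [(x, target(x)) for x, _ in data]
    return trees

forest = train_forest([(i, i % 2) for i in range(10)], n=3, threshold=0.9)
```

The key idea the sketch preserves is that each round relabels the training data with the best surviving tree's predictions, so later rounds train on progressively cleaner data until every tree clears the threshold.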

Description

technical field

[0001] The present disclosure relates to the field of machine learning, and in particular to a random forest training method, device, storage medium and electronic equipment.

Background technique

[0002] A random forest is a classifier comprising multiple decision trees; its output prediction is the mode of the predictions output by the individual trees. A decision tree is a tree-structured model for supervised learning. In supervised learning, a set of samples is given, each sample containing a set of attributes (descriptive data) and a category (prediction result); the categories are determined in advance. By learning this set of samples, a decision tree implementing a classification function can be obtained, which can assign the correct classification (output prediction result) to new objects. In related technologies, when training a random forest, each decision tree in the random forest is usually trained once with a part of the full training dat...
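The background notes that a random forest's output is the mode of the individual trees' predictions. A minimal vote over a list of per-tree predictions looks like this (the function name is illustrative, not from the patent):

```python
from collections import Counter

# A random forest's final output is the mode (most common value) of the
# predictions produced by its individual trees.

def forest_predict(tree_predictions):
    """Return the most common prediction among the trees."""
    return Counter(tree_predictions).most_common(1)[0][0]

result = forest_predict(["cat", "dog", "cat"])
print(result)  # → cat
```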

Claims


Application Information

Patent Timeline: no application data
Patent Type & Authority: Patent (China)
IPC(8): G06N20/00; G06N3/00
Inventor: 高睿
Owner: NEUSOFT CORP