
Probability rough set based decision tree generation method

A decision tree and rough set technology, applied in special data processing applications, instruments, electrical digital data processing, etc., to achieve the effect of solving the data noise problem

Inactive Publication Date: 2010-09-01
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0058] The technical problem to be solved by the present invention is to provide a decision tree generation method, based on probability rough set theory, that can effectively solve the problem of data noise.

Method used



Examples

Example

[0126] 1. Calculate the dependence degree of each condition attribute with respect to the decision attribute; we use r(X, Y) to denote the dependence degree. Then:

[0127] The equivalence classes for each attribute are as follows:

[0128] U/X1 = {{1, 2, 3, 13, 14, 15, 16, 19, 20, 25}, {4, 5, 11, 12, 21, 22, 23}, {6, 7, 8, 9, 10, 17, 18, 24}}

[0130] U/X2 = {{1, 2, 3, 4, 5, 8, 10, 23, 25}, {6, 7, 13, 14, 17, 18, 19, 20, 21, 22, 24}, {9, 11, 12, 15, 16}}

[0132] U/X3 = {{1, 2, 3, 4, 5, 6, 7, 13, 14, 21, 22, 24, 25}, {8, 9, 10, 11, 12, 15, 16, 17, 18, 19, 20, 23}}

[0134] U/X4 = {{1, 4, 6, 8, 13, 15, 17, 23, 25}, {3, 5, 7, 9, 12, 14, 16, 18, 19, 22}, {2, 10, 11, 20, 21, 24}}

[0136] U/Y = {{1, 2, 3, 6, 7, 8, 9, 10, 13, 14, 17, 18, 24, 25}, {4, 5, 11, 12, 15, 16, 19, 20, 21, 22, 23}}

[0138] r(X1, Y) = 0.6

[0139] r(X2, Y) = 0

[0140] r(X3, Y) = 0

[0141] r(X4, Y) = 0
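The dependence degrees above follow directly from the partitions in paragraphs [0128]–[0136]: r(X, Y) is the fraction of the universe covered by X-equivalence classes that lie entirely inside one Y-class (the positive region). A minimal sketch of this computation, using the partitions from this example, is shown below; the function name `dependence` is illustrative, not from the patent.

```python
# Dependence degree r(X, Y) = |POS_X(Y)| / |U|, where POS_X(Y) is the union
# of X-equivalence classes fully contained in some Y-equivalence class.
# Partitions copied from paragraphs [0128]-[0136] of the example.

U_X1 = [{1, 2, 3, 13, 14, 15, 16, 19, 20, 25},
        {4, 5, 11, 12, 21, 22, 23},
        {6, 7, 8, 9, 10, 17, 18, 24}]
U_X2 = [{1, 2, 3, 4, 5, 8, 10, 23, 25},
        {6, 7, 13, 14, 17, 18, 19, 20, 21, 22, 24},
        {9, 11, 12, 15, 16}]
U_X3 = [{1, 2, 3, 4, 5, 6, 7, 13, 14, 21, 22, 24, 25},
        {8, 9, 10, 11, 12, 15, 16, 17, 18, 19, 20, 23}]
U_X4 = [{1, 4, 6, 8, 13, 15, 17, 23, 25},
        {3, 5, 7, 9, 12, 14, 16, 18, 19, 22},
        {2, 10, 11, 20, 21, 24}]
U_Y  = [{1, 2, 3, 6, 7, 8, 9, 10, 13, 14, 17, 18, 24, 25},
        {4, 5, 11, 12, 15, 16, 19, 20, 21, 22, 23}]

def dependence(x_classes, y_classes):
    """r(X, Y): size of the positive region divided by the universe size."""
    universe = set().union(*x_classes)
    pos = set()
    for xc in x_classes:
        # xc contributes to the positive region only if it is fully
        # contained in a single decision class.
        if any(xc <= yc for yc in y_classes):
            pos |= xc
    return len(pos) / len(universe)

print(dependence(U_X1, U_Y))  # 0.6, matching r(X1, Y) in [0138]
print(dependence(U_X2, U_Y))  # 0.0, matching r(X2, Y) in [0139]
print(dependence(U_X3, U_Y))  # 0.0, matching r(X3, Y) in [0140]
print(dependence(U_X4, U_Y))  # 0.0, matching r(X4, Y) in [0141]
```

Only the classes {4, 5, 11, 12, 21, 22, 23} and {6, 7, 8, 9, 10, 17, 18, 24} of U/X1 are contained in a single decision class, giving 15 of 25 objects and hence r(X1, Y) = 0.6.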

[0142] 2. Perform relative attribute reduction...



Abstract

The invention discloses a probability rough set based decision tree generation method comprising the steps of: (1) calculating the dependence degree of each condition attribute with respect to the decision attribute; (2) performing relative attribute reduction on the data to obtain the decision tree node set; (3) constructing the decision tree from this node set by taking the node with the largest dependence degree as the root and then, for each branch, recalculating the dependence information of the remaining nodes and again selecting the node with the largest dependence degree. The central principle of the invention is to discard useless attributes through relative reduction, yielding the nodes used to generate the decision tree, and then to repeatedly select the node with the largest dependence degree for expansion until the required decision tree is obtained. The invention can effectively solve the problem of data noise.
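The recursive construction described in steps (1)–(3) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the toy weather-style table, the attribute names, and the tie-breaking and leaf-labeling choices are all assumptions.

```python
# Hedged sketch of steps (1)-(3): at each node, choose the remaining
# condition attribute with the largest dependence degree on the decision
# attribute, split on its values, and recurse on each branch.
from collections import defaultdict

def equivalence_classes(rows, attr):
    """Partition row indices by their value of `attr`."""
    groups = defaultdict(set)
    for i, row in enumerate(rows):
        groups[row[attr]].add(i)
    return list(groups.values())

def dependence(rows, attr, decision):
    """r(attr, decision) = |positive region| / |rows|."""
    y_classes = equivalence_classes(rows, decision)
    pos = set()
    for xc in equivalence_classes(rows, attr):
        if any(xc <= yc for yc in y_classes):
            pos |= xc
    return len(pos) / len(rows) if rows else 0.0

def build_tree(rows, attrs, decision):
    labels = {row[decision] for row in rows}
    if len(labels) == 1 or not attrs:
        # Leaf: pure subset, or no attributes left (take majority label).
        return max(labels, key=lambda l: sum(r[decision] == l for r in rows))
    # Step (3): expand the node with the largest dependence degree.
    best = max(attrs, key=lambda a: dependence(rows, a, decision))
    remaining = [a for a in attrs if a != best]
    return {best: {v: build_tree([r for r in rows if r[best] == v],
                                 remaining, decision)
                   for v in {row[best] for row in rows}}}

# Illustrative toy table (not from the patent).
rows = [
    {"outlook": "sunny",    "windy": "no",  "play": "no"},
    {"outlook": "sunny",    "windy": "yes", "play": "no"},
    {"outlook": "rain",     "windy": "no",  "play": "yes"},
    {"outlook": "rain",     "windy": "yes", "play": "yes"},
    {"outlook": "overcast", "windy": "no",  "play": "yes"},
]
print(build_tree(rows, ["outlook", "windy"], "play"))
```

On this toy table `outlook` has dependence degree 1.0 and `windy` has 0.0, so `outlook` becomes the root and every branch is immediately pure.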

Description

technical field

[0001] The invention relates to decision tree generation algorithms. In particular, it relates to a method for generating decision trees based on probability rough sets.

Background technique

[0002] A decision tree is a very intuitive knowledge representation method and also an efficient classifier. The best-known decision tree generation algorithms at present are the ID3 algorithm proposed by J. R. Quinlan in 1986 and its successor C4.5, which use the rate of decline of information entropy as the heuristic information for selecting nodes. However, problems such as how to generate smaller trees and how to prevent overfitting of the data have always been a focus of research.

[0003] 1. Decision tree

[0004] A decision tree is a predictive model; it represents a mapping relationship between object attributes and object values. Each node in the tree represents an object, each bifurcated path represents a possible attribute value, and each leaf node corresponds to the object represented ...
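The entropy-based heuristic of ID3 mentioned above, i.e. selecting the attribute whose split most reduces the information entropy of the decision attribute (the information gain), can be sketched as follows; the function names and the tiny example table are illustrative assumptions.

```python
# Minimal sketch of ID3's node-selection heuristic: information gain,
# the decline in entropy of the decision attribute after splitting.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr, decision):
    """Entropy before splitting minus the weighted entropy after
    splitting on `attr`."""
    base = entropy([r[decision] for r in rows])
    n = len(rows)
    remainder = 0.0
    for v in {r[attr] for r in rows}:
        subset = [r[decision] for r in rows if r[attr] == v]
        remainder += (len(subset) / n) * entropy(subset)
    return base - remainder

# Illustrative table: attribute "a" perfectly determines class "c",
# so splitting on it removes all entropy.
data = [
    {"a": "x", "c": "p"},
    {"a": "x", "c": "p"},
    {"a": "z", "c": "q"},
    {"a": "z", "c": "q"},
]
print(information_gain(data, "a", "c"))  # 1.0 bit of gain
```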

Claims


Application Information

IPC IPC(8): G06F17/30
Inventor 刘江林利
Owner TIANJIN UNIV