
Classifier integration method based on floating classification threshold

A classifier integration method based on a floating classification threshold, applied in the fields of instruments, special data processing applications, and electrical digital data processing. It addresses the problem of unstable classification of points near the classification boundary and achieves a good classification boundary.

Active Publication Date: 2014-04-23
CAS OF CHENGDU INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] In view of the problems existing in the prior art, the main purpose of the present invention is to provide a classifier integration method based on a floating classification threshold that can overcome the unstable classification, by classifiers with a fixed classification threshold, of points near the classification boundary.



Examples


Embodiment 1

[0040] The specific process steps of the floating-classification-threshold-based classifier integration method of the present invention for the binary classification problem are described in detail below with reference to Figure 1, Figure 2, and Figure 3.

[0041] When the existing continuous AdaBoost algorithm is used, let the training sample set be S = {(x_1, y_1), (x_2, y_2), ..., (x_m, y_m)}, y_i ∈ {-1, +1}, i = 1, ..., m, where x_i is the specific value of the i-th sample and y_i is its class label; (x_i, y_i) ∈ S is written simply as x_i ∈ S. The sample space S is divided into n_t segments S_j^t (j = 1, ..., n_t), with S_i^t ∩ S_j^t = ∅ when i ≠ j. A weak classifier h_t(x) in fact corresponds to such an n_t-segment division of the sample space: when a sample falls in segment S_j^t, the weak classifier h_t(x) outputs a value determined by the probabilities with which class +1 and class -1 samples occur in that segment. Obviously, the output value of the weak classifier is the same for all samples falling in the same division segment...
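To make the segment-division weak classifier concrete, the sketch below shows one possible construction in Python. It is an illustration under assumptions not stated in the patent: a single numeric feature, quantile binning into n_bins segments, a smoothing constant eps, and the function name fit_partition_weak_classifier are all choices made here, following the usual continuous (real) AdaBoost convention of outputting a log-ratio of the weighted class probabilities in each segment.

```python
import numpy as np

def fit_partition_weak_classifier(x, y, w, n_bins=8, eps=1e-6):
    """Weak classifier defined by an n_t-segment division of one feature.

    x : (m,) feature values, y : (m,) labels in {-1, +1}, w : (m,) sample weights.
    In each segment the output is 0.5*ln(P(+1)/P(-1)) computed from the weighted
    class probabilities, so its sign gives the class and its magnitude a confidence.
    """
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf            # cover the whole real line
    seg = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)

    outputs = np.zeros(n_bins)
    for j in range(n_bins):
        in_j = seg == j
        w_pos = w[in_j & (y == +1)].sum()            # weighted mass of class +1 in segment j
        w_neg = w[in_j & (y == -1)].sum()            # weighted mass of class -1 in segment j
        outputs[j] = 0.5 * np.log((w_pos + eps) / (w_neg + eps))

    def h(x_new):
        s = np.clip(np.searchsorted(edges, x_new, side="right") - 1, 0, n_bins - 1)
        return outputs[s]                            # same output for every sample in a segment
    return h
```

Under this construction the sign of the segment output gives the predicted class and its magnitude the confidence, which matches the confidence-difference interpretation used in Embodiment 2.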

Embodiment 2

[0057] The specific process steps of the floating-classification-threshold-based classifier integration method of the present invention for the multi-classification problem are described in detail below with reference to Figure 1, Figure 4, and Figure 5.

[0058] In the binary classification problem, 1 and -1 are used as the two class labels, so the output value of the weak classifier h_t(x) is directly the difference between the confidences of the two labels; the combined classifier then outputs a class according to the sign of the accumulated confidence difference. In a multi-classification problem, each weak classifier can only output the confidence of the corresponding class label, so the combined classifier accumulates the confidences of the same label and finally outputs the label with the largest accumulated confidence. Let h_t(x, l) denote the confidence that h_t(x) outputs for label l (l = 1, ..., K); the combined classifier then outputs the label l that maximizes the accumulated confidence over the T weak classifiers...

[0059] F...
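As a minimal sketch of the multi-class combination rule in [0058]: each weak classifier supplies a confidence h_t(x, l) for every label l, the combined classifier accumulates these confidences per label, and the output is the label with the largest accumulated confidence. The callable interface (each weak classifier returning a length-K confidence vector) and the helper name combine_multiclass are assumptions made here for illustration.

```python
import numpy as np

def combine_multiclass(weak_classifiers, x, num_labels):
    """Combined classifier for the multi-class case: accumulate the per-label
    confidences h_t(x, l) over all T weak classifiers and return the label
    with the largest accumulated confidence.

    weak_classifiers : list of callables, each mapping a sample x to a
                       length-K array of label confidences (assumed interface).
    """
    total = np.zeros(num_labels)
    for h_t in weak_classifiers:
        total += h_t(x)                 # h_t(x)[l] plays the role of h_t(x, l)
    return int(np.argmax(total)) + 1    # labels numbered 1..K as in the text
```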



Abstract

The invention discloses a classifier integration method based on a floating classification threshold, characterized in that T optimal weak classifiers are obtained by training over T iterations and are then combined to obtain an optimal combined classifier. For a binary classification problem, training the T optimal weak classifiers comprises the steps of: (3.1) training the weak classifiers based on a training sample set S with weights ω_t, where t = 1, ..., T; (3.2) based on the result of step (3.1), adjusting the sample weights as ω_{t+1} = ω_t·exp(-y_i·h_t(x_i)) / Z_t; (3.3) judging whether t is smaller than T; if so, setting t = t + 1 and returning to step (3.1) until t = T. For a multi-classification problem, training the T optimal weak classifiers comprises the steps of: (3.1) training the weak classifiers based on the training sample set S with weights ω_t, where t = 1, ..., T; (3.2) based on the result of step (3.1), adjusting the sample weights as shown in the description; (3.3) judging whether t is smaller than T; if so, setting t = t + 1 and returning to step (3.1) until t = T. Compared with the prior art, the classifier integration method of the invention can overcome the defect that classifiers with a fixed classification threshold classify points adjacent to the classification boundary unstably.
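The binary-case training loop described in the abstract can be sketched as follows. This is a minimal illustration rather than the patent's implementation: train_weak stands for any weak learner trained on the weighted sample set (for example the segment-division classifier of Embodiment 1), and the normalizer Z_t is realized here by rescaling the weights to sum to one.

```python
import numpy as np

def train_boosted_classifier(X, y, train_weak, T):
    """Train T weak classifiers with the weight update
    w_{t+1,i} = w_{t,i} * exp(-y_i * h_t(x_i)) / Z_t, then combine them.

    X : (m, ...) samples, y : (m,) labels in {-1, +1},
    train_weak : callable (X, y, w) -> h, with h(X) returning real-valued outputs.
    """
    m = len(y)
    w = np.full(m, 1.0 / m)              # uniform initial weights
    weak_classifiers = []
    for t in range(T):
        h_t = train_weak(X, y, w)        # step (3.1): train on the weighted sample set
        margin = y * h_t(X)
        w = w * np.exp(-margin)          # step (3.2): re-weight samples
        w /= w.sum()                     # dividing by Z_t keeps the weights normalized
        weak_classifiers.append(h_t)     # step (3.3): repeat until t = T
    def H(X_new):                        # combined classifier: sign of the summed outputs
        return np.sign(sum(h(X_new) for h in weak_classifiers))
    return H
```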

Description

Technical field

[0001] The invention belongs to machine learning and pattern recognition methods, and in particular relates to a classifier integration method based on a floating classification threshold for improving classifier performance.

Background technique

[0002] Improving classification accuracy by combining multiple classifiers has always been a central topic of ensemble learning research, and the weak learning theorem strongly supports the feasibility of this idea. Among existing approaches, AdaBoost (adaptive boosting) and the continuous AdaBoost algorithm, both based on the Boosting idea, are currently among the most researched and applied ensemble learning algorithms; their good performance and ease of use have attracted a large number of researchers to improve and perfect them. Liu Dayou and others proposed a multi-classifier integration method based on an incremental naive Bayesian network in patent CN1012...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F17/30, G06K9/62
Inventors: 付忠良 (Fu Zhongliang), 赵向辉 (Zhao Xianghui), 姚宇 (Yao Yu), 张丹普 (Zhang Danpu)
Owner: CAS OF CHENGDU INFORMATION TECH CO LTD