Method for reducing training set of machine learning

A machine learning and training set technology, applied in the computer field, which solves the problems that existing training set reduction methods cannot be applied to multi-class classification and that the reduction effect degrades when the class distribution is uncertain, thereby improving overall work efficiency and achieving reduction with low time and space complexity.

Status: Inactive
Publication Date: 2016-03-09
Applicant: UNIV OF ELECTRONICS SCI & TECH OF CHINA


Problems solved by technology

[0007] (2) The training set reduction cannot be applied to multi-class classification.
It is clear from the literature that this is a reduction method specific to two-class classification: when reducing one class, the method relies on the assistance of the other class. When facing multi-class classification, because the distribution of the classes is uncertain, the reduction effect drops considerably, and in some cases no vector can be reduced at all.



Examples


Embodiment 1

[0034] As a preferred embodiment of the present invention, a method for reducing a machine learning training set is disclosed; its steps are as follows (a code sketch is given after the list):

[0035] (1) Define the center of class A by the formula p = (1/S) Σ_{i=1..S} x_i, where S is the number of samples in class A and x_i is the vector of the i-th sample in class A;

[0036] (2) Calculate the center point p of class A;

[0037] (3) Take a vector point x from class A and calculate the distance d from x to the center point p; if d is less than the screening factor λ, delete x from class A;

[0038] (4) Repeat step (3) until all vector points in class A have been checked; if the number S of remaining vector points in class A is less than the threshold α, proceed to step (6); if the number S of remaining vector points in class A is greater than the threshold α, proceed to step (5);

[0039] (5) Repeat steps (2), (3) and (4), and proceed to step (6) after completion;

[0040] (6) Output the remaining vector points in class A as the new training set.
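A minimal sketch of these steps in Python/NumPy, in which steps (2)–(5) are looped until the class shrinks to α points or fewer (the function name reduce_class, the array representation, the no-removal safeguard, and the example values of λ and α are illustrative assumptions, not part of the patent):

```python
import numpy as np

def reduce_class(vectors, lam, alpha):
    """Reduce one class of training vectors by deleting points near the class center.

    vectors : (S, d) array of sample vectors of class A
    lam     : screening factor λ -- points closer than lam to the center are deleted
    alpha   : threshold α -- reduction stops once at most alpha points remain
    """
    points = np.asarray(vectors, dtype=float)
    while len(points) > alpha:
        center = points.mean(axis=0)                     # steps (1)/(2): center point p of class A
        dists = np.linalg.norm(points - center, axis=1)  # step (3): distance d from each x to p
        keep = dists >= lam                              # delete x when d < λ
        if keep.all():
            # Safeguard (an assumption, not stated in the patent): stop if a full
            # pass deletes nothing, so the loop always terminates.
            break
        points = points[keep]                            # steps (4)/(5): repeat on the remainder
    return points                                        # step (6): the reduced training set

# Illustrative usage on random data (all values here are arbitrary):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    class_a = rng.normal(size=(1000, 8))
    reduced = reduce_class(class_a, lam=2.0, alpha=100)
    print(len(class_a), "->", len(reduced))
```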

Embodiment 2

[0050] The overall design of the experiment is to reduce the general-purpose training set Newgroup with the training set reduction method, and then to test it with two classic machine learning models: the SVM (support vector machine) model and the CBC (centroid-based classification) model.

[0051] Experimental materials and indicators: the training set Newgroup contains 20 categories with 1000 samples per category, for a total of 20000 samples; the experimental indicators are micro-average F1 (micro_F1, higher is better), macro-average F1 (macro_F1, higher is better), and information entropy (Entropy, lower is better).
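A minimal sketch, assuming scikit-learn, of how an SVM baseline and the micro_F1 / macro_F1 indicators above might be computed on a 20-category newsgroups corpus (the dataset loader, TF-IDF features, LinearSVC classifier, and split helper are illustrative assumptions; the patent does not name any tooling, and the Entropy indicator is omitted here):

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Load a 20-category newsgroups corpus as a stand-in for the "Newgroup" training set.
data = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes"))

# Hold out 10% of the samples as the test set and keep 90% for training,
# matching the split used in the experiment process of this embodiment.
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.1, random_state=0, stratify=data.target
)

# TF-IDF features plus a linear SVM stand in for the "SVM model" baseline.
vectorizer = TfidfVectorizer()
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

clf = LinearSVC()
clf.fit(X_train_vec, y_train)
pred = clf.predict(X_test_vec)

# Experimental indicators: micro-average F1 and macro-average F1 (higher is better).
print("micro_F1:", f1_score(y_test, pred, average="micro"))
print("macro_F1:", f1_score(y_test, pred, average="macro"))
```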

[0052] The following are the specific experimental steps.

[0053] 1. Experiment process

[0054] (1) When the general training set is not reduced, 10% of the samples are randomly selected as the test set and the remaining 90% of the samples are used as the training set. ...



Abstract

The present invention discloses a method for reducing a training set of machine learning. The method comprises the following steps: (1) defining a central formula of a class A, wherein the central formula is as shown in the specification, S is the number of samples in class A, and Xi are the vectors of the samples in class A; (2) calculating a central point p of class A; (3) taking out a vector point x from class A, calculating a distance d between the vector point x and the central point p, and, if d is less than a filter factor lambda, deleting x from class A; (4) repeating step (3) to check all vector points in class A, and if the number S of remaining vector points is less than a threshold alpha, performing step (6), or if the number S of remaining vector points is greater than the threshold alpha, performing step (5); (5) repeating steps (2), (3) and (4), and then performing step (6); and (6) outputting the remaining vector points in class A as a new training set. According to the method provided by the present invention, the speed of machine learning is increased, the memory overhead is reduced, and the accuracy of classification is improved.

Description

Technical field

[0001] The invention relates to the field of computer technology and to machine learning, and in particular to a method for reducing training sets in machine learning which can improve the speed of machine learning and reduce memory overhead.

Background technique

[0002] Machine learning is a multi-field interdisciplinary subject that has emerged in the past 20 years, involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory and other disciplines. Machine learning theory is mainly concerned with designing and analyzing algorithms that allow computers to "learn" automatically. The specific process of machine learning is to use algorithms to automatically analyze a body of data or information and extract its underlying laws (the data used for the analysis is the training set), and then to use the obtained laws to predict unknown data. Therefore, machine learning can be applied in data mining to find valuab...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F19/00
CPC: G16Z99/00
Inventor: 刘川, 汪文勇, 王蒙
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA