
Few-sample learning classifier construction method based on unbalanced data

A sample-learning and classifier-construction technology, applied in the fields of instruments, computing, and character and pattern recognition, which solves the problem that unbalanced target data cannot be handled by a unified binary classification approach and achieves stable classification performance and good classification results.

Inactive Publication Date: 2019-05-24
CHONGQING UNIV

AI Technical Summary

Problems solved by technology

[0007] In view of this, the object of the present invention is to provide a method for constructing a few-sample learning classifier based on unbalanced data, which solves the problem that unbalanced, high-dimensional and limited target data cannot be handled in a unified way for binary classification, so that better classification results are achieved on unbalanced, high-dimensional and limited target data sets.

Method used



Examples


Embodiment

[0058] 1) Experimental data:

[0059] KEEL and NASA are open-source machine learning data repositories. For the experiments, 14 data sets were randomly selected from these two repositories for analysis: CM1, Appendicitis (Appe), Bupa, KC1, Ionosphere (Iono), Mammographic (Mamm), MW1, Phoneme (Phon), PC1, Ring, Sonar, Twonorm (Twon), Spambase (Spam) and Wisconsin (Wisc). Their feature dimensions differ, ranging from 5 to 60 attributes; their class imbalance ratios also differ, ranging from 2 to 16; and each data set has a limited number of instances, from 106 to 7400. It is therefore difficult for traditional machine learning classification algorithms to train effective classification models on these data sets.
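For concreteness, the three data-set properties emphasized above (instance count, feature dimension, and class imbalance ratio) can be computed as in the following sketch. The helper name `dataset_stats` and the toy data are illustrative assumptions, not part of the patent.

```python
from collections import Counter

def dataset_stats(X, y):
    """Report the properties discussed above for a labelled data set:
    number of instances, feature dimension, and class imbalance ratio
    (majority-class count divided by minority-class count)."""
    counts = Counter(y)
    return {
        "n_instances": len(y),      # 106 .. 7400 in the experiment
        "n_features": len(X[0]),    # 5 .. 60 in the experiment
        "imbalance_ratio": max(counts.values()) / min(counts.values()),  # 2 .. 16
    }

# Toy example: 3 negative and 1 positive instance gives an imbalance ratio of 3.0.
X = [[0.1, 1.2], [0.3, 0.8], [0.2, 1.0], [2.5, 0.1]]
y = [0, 0, 0, 1]
print(dataset_stats(X, y))
```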

[0060] 2) Comparison method:

[0061] The benchmark co...



Abstract

The invention relates to a few-sample learning classifier construction method based on unbalanced data, and belongs to the technical field of computer data classification. The method comprises the following steps: first, a twin parallel fully connected network is designed for feature learning of input sample pairs, drawing on the one-shot and few-sample learning characteristics of twin (Siamese) neural networks; then, the class imbalance of the input sample pairs is handled by a cost-sensitive optimizer: an expected misclassification cost function is designed from the different misclassification costs and integrated into the network parameter optimization algorithm to adjust the class-imbalance classification weights. With this method, better classification results are obtained on unbalanced, high-dimensional and limited target data sets, and the classification performance is more stable.
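The abstract describes two components: a twin (Siamese) fully connected network that embeds input sample pairs, and a cost-sensitive loss derived from an expected misclassification cost. The sketch below is a minimal illustration of one plausible reading of that design; the layer sizes, the cost values (`cost_fp`, `cost_fn`), and the pair-scoring head are assumptions, not the patent's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwinFCNet(nn.Module):
    """Twin parallel fully connected network: both inputs of a pair are
    encoded by the same weight-shared branch."""
    def __init__(self, in_dim, embed_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, embed_dim), nn.ReLU(),
        )
        self.head = nn.Linear(embed_dim, 1)  # scores whether a pair shares a class

    def forward(self, x1, x2):
        z1, z2 = self.encoder(x1), self.encoder(x2)
        # Score the pair from the element-wise distance of the two embeddings.
        return self.head(torch.abs(z1 - z2)).squeeze(-1)

def expected_cost_loss(logits, pair_labels, involves_minority,
                       cost_fp=1.0, cost_fn=5.0):
    """Cost-sensitive pair loss: cross-entropy terms are re-weighted so that
    errors on pairs containing a minority-class sample cost more."""
    per_pair = F.binary_cross_entropy_with_logits(logits, pair_labels,
                                                  reduction="none")
    weights = torch.where(involves_minority,
                          torch.full_like(per_pair, cost_fn),
                          torch.full_like(per_pair, cost_fp))
    return (weights * per_pair).mean()

# One illustrative optimization step on random pair data (20 features).
model = TwinFCNet(in_dim=20)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x1, x2 = torch.randn(8, 20), torch.randn(8, 20)
pair_labels = torch.randint(0, 2, (8,)).float()   # 1 = same class, 0 = different
involves_minority = torch.rand(8) < 0.2           # flags pairs with a minority sample
loss = expected_cost_loss(model(x1, x2), pair_labels, involves_minority)
opt.zero_grad()
loss.backward()
opt.step()
```

Weight sharing between the two branches is what lets the network learn a similarity metric from few samples, and the per-pair cost weights play the role of the class-imbalance classification weights mentioned in the abstract.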

Description

Technical field

[0001] The invention belongs to the technical field of computer data classification and relates to a method for constructing a few-sample learning classifier based on unbalanced data.

Background

[0002] Data classification is one of the key research topics in data mining. It uses valuable available data to classify unknown data, with the aim of uncovering the hidden relationships between variables and classes. Most current data-driven machine learning classification algorithms assume that the classes of the target data are represented in equal proportion, but in practical binary classification tasks the target data are often unbalanced, high-dimensional and limited. In this setting, conventional machine learning classification algorithms struggle to obtain good classification results.

[0003] For class-unbalanced data, Piri et al., in the article "S. Piri, D. Delen, T. Liu, A synthetic informative minority over-sampling (simo) algorithm leveragin...
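The over-sampling direction mentioned in the background can be illustrated with a generic SMOTE-style interpolation sketch. This is a simplified stand-in for the general idea of synthesizing minority-class samples, not the SIMO algorithm from the cited Piri et al. paper; the function name and parameters are hypothetical.

```python
import numpy as np

def oversample_minority(X_min, n_new, k=5, seed=0):
    """Generate n_new synthetic minority samples by interpolating each chosen
    sample toward one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # Distances from sample i to every other minority sample.
        dist = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(dist)[1:k + 1]
        j = rng.choice(neighbours)
        lam = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(synthetic)

# Example: grow a 4-sample minority class by 8 synthetic instances.
X_minority = np.random.default_rng(1).normal(size=(4, 5))
print(oversample_minority(X_minority, n_new=8, k=3).shape)  # (8, 5)
```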

Claims


Application Information

Patent Timeline: no application data
IPC(8): G06K9/62
Inventors: 赵林畅, 尚赵伟, 赵灵, 龙祎萌, 任柏行
Owner: CHONGQING UNIV