
Data compression method and device based on stacked autoencoders and the PSO algorithm

A data compression technology based on stacked autoencoders and the particle swarm optimization (PSO) algorithm. It addresses the slow convergence speed and poor nonlinear mapping ability of shallow neural networks, improving training speed, accuracy, and search speed while avoiding disturbance from new features.

Active Publication Date: 2018-03-02
ELECTRIC POWER RES INST OF GUANGDONG POWER GRID +1

AI Technical Summary

Problems solved by technology

[0003] The invention provides a data compression method and device based on stacked autoencoders and the PSO algorithm. It solves the technical problems of slow convergence speed, poor nonlinear mapping ability, and the need for labeled input data that arise when shallow neural networks are used for data compression.




Embodiment Construction

[0046] The invention provides a data compression method and device based on stacked autoencoders and the PSO algorithm. It solves the technical problems of slow convergence speed, poor nonlinear mapping ability, and the need for labeled input data that arise when shallow neural networks are used for data compression.

[0047] In order to make the purpose, features, and advantages of the present invention clearer and easier to understand, the technical solutions in the embodiments of the present invention are described below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art from the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0048] See figure 1, ...



Abstract

The embodiment of the invention discloses a data compression method based on stacked autoencoders and the PSO algorithm. In this method, the stacked autoencoder model is a deep learning model. Layer-by-layer learning followed by fine-tuning of the overall network weights improves the training speed of the neural network, removes redundant information from the data, and extracts the most valuable information in the original data. The multi-layer network structure gives the stacked autoencoder model good nonlinear mapping capability. Through layer-by-layer learning and fine-tuning of the overall network weights, the model obtains weights suited to representing the samples and learns the characteristics of the data, so the input data does not need to be labeled. The network parameters of the stacked autoencoder are set via the PSO algorithm, which further improves accuracy and search speed. This solves the technical problems that, when a shallow neural network is used to compress data, the convergence speed is slow, the nonlinear mapping capability is poor, and the input data must be labeled.
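The abstract describes greedy layer-by-layer pretraining of autoencoders whose codes form a compressed representation, with no labels required. The patent text does not disclose concrete network details, so the following is only a minimal numpy sketch of that general technique (sigmoid units, plain gradient descent on reconstruction error); all layer sizes, learning rates, and function names are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Autoencoder:
    """One autoencoder layer: encode to a smaller dimension, decode back."""
    def __init__(self, n_in, n_hidden, rng):
        # Small random weights; biases start at zero.
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_in))
        self.b2 = np.zeros(n_in)

    def encode(self, X):
        return sigmoid(X @ self.W1 + self.b1)

    def decode(self, H):
        return sigmoid(H @ self.W2 + self.b2)

    def train(self, X, lr=0.5, epochs=200):
        # Unsupervised training: the reconstruction target is the input
        # itself, so no labels are needed.
        for _ in range(epochs):
            H = self.encode(X)
            Xr = self.decode(H)
            # Backpropagate the mean-squared reconstruction error.
            d_out = (Xr - X) * Xr * (1 - Xr)
            d_hid = (d_out @ self.W2.T) * H * (1 - H)
            self.W2 -= lr * H.T @ d_out / len(X)
            self.b2 -= lr * d_out.mean(axis=0)
            self.W1 -= lr * X.T @ d_hid / len(X)
            self.b1 -= lr * d_hid.mean(axis=0)

def pretrain_stack(X, layer_sizes, seed=0):
    """Greedy layer-by-layer pretraining: each layer learns to
    reconstruct the previous layer's code."""
    rng = np.random.default_rng(seed)
    layers, inp = [], X
    for n_hidden in layer_sizes:
        ae = Autoencoder(inp.shape[1], n_hidden, rng)
        ae.train(inp)
        layers.append(ae)
        inp = ae.encode(inp)   # codes become the next layer's input
    return layers, inp         # inp is the compressed representation
```

For example, `pretrain_stack(X, [4, 2])` compresses 8-dimensional samples to 2 dimensions through two stacked encoding layers; in the full method described above, a fine-tuning pass over the whole stack would follow the greedy pretraining.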

Description

Technical field

[0001] The invention relates to the field of data mining, in particular to a data compression method and device based on stacked autoencoders and the PSO algorithm.

Background technique

[0002] Current neural-network data compression uses shallow neural network models, which have the following deficiencies: 1. Most shallow neural network models are trained with the BP algorithm, whose convergence speed is slow and which easily falls into local minima; shallow models are also easily disturbed by new features. 2. Due to the characteristics of the training algorithm, a shallow neural network model cannot have many layers, so its nonlinear mapping ability is relatively poor; because the error gradually decreases during backpropagation, too many layers will cause th...
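The background contrasts gradient-based BP training with the PSO algorithm the patent uses to set the autoencoder's network parameters. PSO is gradient-free and less prone to the local-minimum trapping described above. Since the patent does not disclose its PSO configuration, the following is a generic minimal PSO sketch; the inertia and acceleration coefficients and the sphere-function demo are textbook defaults chosen for illustration, not values from the patent.

```python
import numpy as np

def pso(objective, dim, n_particles=20, iters=100, bounds=(-5, 5), seed=0):
    """Minimal particle swarm optimizer: each particle tracks its
    personal best position; the swarm shares a global best."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()            # global best position
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia, cognitive, social
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update pulls each particle toward its own best and
        # the swarm's best; no gradients are required.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()
```

In the setting described above, `objective` would score a candidate vector of autoencoder parameters (for example by reconstruction error on a validation set); here it can be demonstrated on a simple benchmark such as minimizing `sum(p**2)`.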

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): H03M7/30; G06N3/08
CPC: G06N3/08; H03M7/3059
Inventors: 卢世祥, 阙华坤, 林国营
Owner: ELECTRIC POWER RES INST OF GUANGDONG POWER GRID