
Data reconstruction method based on auto-encoder

An auto-encoder and data reconstruction technology, applied in the field of auto-encoder-based data reconstruction, which addresses the problem that auto-encoders struggle to achieve lossless data reconstruction and achieves the effect of improving data reconstruction quality.

Pending Publication Date: 2021-11-12
YANGZHOU UNIV

AI Technical Summary

Problems solved by technology

However, autoencoders are currently used mainly for feature representation; research on and applications of data reconstruction remain scarce. One reason is that it is difficult for autoencoders to achieve lossless data reconstruction.



Examples


Embodiment 1

[0017] A data reconstruction method based on an autoencoder. The autoencoder comprises an encoding unit and cascaded decoding units; the sender encodes the original data with the encoding unit, and the receiver decodes the data with the cascaded decoding units to achieve reconstruction.

[0018] The autoencoder based on cascaded decoding units (Cascade-Decoders) comprises an encoding unit (Encoder) and decoding unit 1 (Decoder 1), decoding unit 2 (Decoder 2), ..., decoding unit N (Decoder N). The N decoding units are cascaded within the autoencoder.

[0019] As shown in Figure 1, when general cascaded decoding units are used, the autoencoder is expressed as:

[0020]

[0021] where E denotes the encoding unit; D_n denotes the nth decoding unit; x is the input data of the autoencoder; z is the output of the encoding unit, a low-dimensional representation in the latent space; N is the number of decoding units; and y_{n-1} is the output of the decoding units at all leve...
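The general cascade described above can be sketched in code. This is a minimal illustrative sketch, not the patent's implementation: the affine-plus-tanh layers, the dimensions, and the recursion y_n = D_n(y_{n-1}) with y_0 = z are assumptions inferred from the symbol definitions, since the patent's equation is not reproduced in this text.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_layer(d_in, d_out):
    """Return an untrained affine map with tanh activation (illustrative only)."""
    W = rng.standard_normal((d_out, d_in)) * 0.1
    b = np.zeros(d_out)
    return lambda v: np.tanh(W @ v + b)

d_x, d_z, N = 8, 3, 4                 # input dim, latent dim, number of decoding units
E = make_layer(d_x, d_z)              # encoding unit: x -> z (low-dimensional latent)
# Decoder 1 lifts the latent code back to data space; Decoders 2..N map data -> data.
Ds = [make_layer(d_z if n == 0 else d_x, d_x) for n in range(N)]

def reconstruct(x):
    z = E(x)                          # sender side: encode
    y = z                             # y_0 = z
    for D in Ds:                      # receiver side: cascaded decoding
        y = D(y)                      # y_n = D_n(y_{n-1})
    return y                          # y_N is the final reconstruction

x = rng.standard_normal(d_x)
y_N = reconstruct(x)
print(y_N.shape)                      # reconstruction has the input's dimensionality
```

Each decoding unit refines the output of the previous one, so the reconstruction passes through N stages before reaching the receiver.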

Embodiment 2

[0031] A data reconstruction method based on an autoencoder. The autoencoder comprises an encoding unit and cascaded decoding units; the sender encodes the original data with the encoding unit, and the receiver decodes the data with the cascaded decoding units to achieve reconstruction.

[0032] The autoencoder based on cascaded decoding units (Cascade-Decoders) comprises an encoding unit (Encoder) and decoding unit 1 (Decoder 1), decoding unit 2 (Decoder 2), ..., decoding unit N (Decoder N). The N decoding units are cascaded within the autoencoder.

[0033] As shown in Figure 2, when the autoencoder adopts residual cascaded decoding units, it is expressed as:

[0034]

[0035] where E denotes the encoding unit; D_n denotes the nth decoding unit; x is the input data of the autoencoder; z is the output of the encoding unit, a low-dimensional representation in the latent space; N is the number of decoding units; and y_{n-1} is the output of decoding...
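A sketch of the residual variant follows. Because the patent's equation is not shown here, the residual update y_n = y_{n-1} + D_n(y_{n-1}) is an assumption based on standard residual learning, in which each later decoder learns a correction added to the previous reconstruction; layer shapes and activations are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_layer(d_in, d_out):
    """Untrained affine map with tanh activation (illustrative only)."""
    W = rng.standard_normal((d_out, d_in)) * 0.1
    return lambda v: np.tanh(W @ v)

d_x, d_z, N = 8, 3, 4
E = make_layer(d_x, d_z)                        # encoding unit: x -> z
D1 = make_layer(d_z, d_x)                       # Decoder 1 lifts z back to data space
Ds = [make_layer(d_x, d_x) for _ in range(N - 1)]  # residual refiners, Decoders 2..N

def reconstruct_residual(x):
    y = D1(E(x))                                # y_1 = D_1(z)
    for D in Ds:
        y = y + D(y)                            # assumed residual update: y_n = y_{n-1} + D_n(y_{n-1})
    return y

x = rng.standard_normal(d_x)
y_hat = reconstruct_residual(x)
print(y_hat.shape)
```

Under this reading, each refiner only needs to model the remaining reconstruction error rather than the whole signal, which is the usual motivation for residual learning.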

Embodiment 3

[0045] A data reconstruction method based on an autoencoder. The autoencoder comprises an encoding unit and cascaded decoding units; the sender encodes the original data with the encoding unit, and the receiver decodes the data with the cascaded decoding units to achieve reconstruction.

[0046] The autoencoder based on cascaded decoding units (Cascade-Decoders) comprises an encoding unit (Encoder) and decoding unit 1 (Decoder 1), decoding unit 2 (Decoder 2), ..., decoding unit N (Decoder N). The N decoding units are cascaded within the autoencoder.

[0047] As shown in Figure 4, when the autoencoder adopts adversarial cascaded decoding units, it is expressed as:

[0048]

[0049] where E denotes the encoding unit; D_n denotes the nth decoding unit; x is the input data of the autoencoder; z is the output of the encoding unit, a low-dimensional representation in the latent space; N is the number of decoding units; and y_{n-1} is the output of the...
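The adversarial idea can be illustrated with a toy objective. This is a hedged sketch only: the discriminator C, the weight lam, and the combined loss below are illustrative names and a standard GAN-style formulation, not the patent's stated objective, which is not reproduced in this text.

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

w_c = rng.standard_normal(8) * 0.1        # weights of a linear toy discriminator

def C(v):
    """Discriminator: score interpreted as probability that v is an original sample."""
    return sigmoid(w_c @ v)

def combined_loss(x, y_N, lam=0.1):
    """Reconstruction error plus a generator-side adversarial term (assumed form)."""
    recon = np.mean((x - y_N) ** 2)       # reconstruction term
    adv = -np.log(C(y_N) + 1e-12)         # pushes the reconstruction to fool C
    return recon + lam * adv

x = rng.standard_normal(8)
y = x + 0.05 * rng.standard_normal(8)     # stand-in for a cascade output y_N
loss_val = combined_loss(x, y)
print(loss_val > 0)
```

In a full training loop the discriminator would be trained in alternation to separate originals from reconstructions, while the decoder cascade minimizes the combined loss.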



Abstract

The invention discloses a data reconstruction method based on an auto-encoder. The method introduces cascaded decoding units into the auto-encoder, combines residual learning and adversarial learning, takes data reconstruction as the research target, and independently evaluates data reconstruction capability. It improves data reconstruction quality, gradually approaches lossless data reconstruction, and provides a solid theoretical and application basis for auto-encoder-based data compression and compressed sensing of signals.

Description

Technical Field

[0001] The invention relates to a data reconstruction method based on an autoencoder.

Background

[0002] The autoencoder (AE) is a classic deep neural network architecture that first maps high-dimensional data into a low-dimensional latent variable space according to certain rules, then reconstructs the original data from the latent variables while minimizing the recovery error. Autoencoder variants with sparse, convolutional, variational, adversarial, Wasserstein, graph-structured, extreme-learning, ensemble-learning, reversible-function, recursive/recurrent, dual, denoising, generative, fuzzy, non-negative, binary, quantum, linear, blind, group, and kernel designs have been widely researched and applied in data classification, recognition, coding, perception, and processing. Because deep autoencoders consist of encoding and decoding units, they have the potential to achieve high-performance data compr...
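The background notion of encoding to a low-dimensional latent space and minimizing recovery error can be illustrated with a tiny trained example. The linear encoder/decoder pair and plain gradient descent below are assumptions made for brevity, not the patent's method.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 8))            # 100 samples of 8-dimensional data
W_enc = rng.standard_normal((3, 8)) * 0.1    # encoder: 8 -> 3 (latent space)
W_dec = rng.standard_normal((8, 3)) * 0.1    # decoder: 3 -> 8

lr = 0.02
for _ in range(2000):
    Z = X @ W_enc.T                          # latent codes
    Y = Z @ W_dec.T                          # reconstructions
    err = Y - X                              # recovery error
    # gradients of the per-sample squared reconstruction loss
    g_dec = 2 * err.T @ Z / len(X)
    g_enc = 2 * (err @ W_dec).T @ X / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

loss = np.mean((X @ W_enc.T @ W_dec.T - X) ** 2)
print(loss < np.mean(X ** 2))                # error fell below the trivial zero-output baseline
```

Because the latent space has fewer dimensions than the data, a single decoder cannot in general reach zero error, which is the gap the cascaded decoding units in this invention aim to close.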

Claims


Application Information

IPC(8): G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06N 3/048; G06N 3/045
Inventor: 李宏贵
Owner: YANGZHOU UNIV