Compressed representation learning method based on tensor decomposition

A learning method based on tensor decomposition technology, applicable to neural learning methods, instruments, biological neural network models, and related areas. It addresses problems such as representations that are redundant or insufficient for downstream tasks, difficulty in transferring to specific supervised tasks, and applicability limited to small-dataset reasoning tasks.

Active Publication Date: 2020-06-26
ZHEJIANG LAB

AI Technical Summary

Problems solved by the technology

[0006] However, both of the above approaches have limitations. The representation learned by the first approach lacks prior constraints: even after pre-training on massive data, it remains redundant or insufficient for specific downstream tasks. The second approach to...

Method used




Embodiment Construction

[0057] The present invention is further described below through embodiments, in conjunction with the accompanying drawings, which do not limit the scope of the invention in any way.

[0058] This invention shows how representation learning can be enhanced with differentiable programming tools. To encode a prior about the representation itself, namely that good representations should be compact and cohesive, we model the representation directly. Through visualization, we found that the representations learned by pre-training (the first idea in the background technology) are often deficient in two ways: missing information and redundant information. Since classic tensor decomposition models are widely used for image completion and denoising, we apply this class of models to remedy the missing and redundant parts of pre-trained representations, building a differentiable tensor decomposition model on top of the pre-trained representations. The inventio...
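The paragraph above proposes applying a tensor decomposition to a pre-trained representation and using its low-rank reconstruction as a completed, denoised version of that representation. As a minimal sketch of the idea, the snippet below uses a truncated higher-order SVD (a Tucker-style decomposition); the patent text does not fix a specific decomposition, so this choice and all function names are illustrative only.

```python
import numpy as np

def hosvd_lowrank(x, ranks):
    """Low-rank reconstruction of a 3-way tensor via truncated HOSVD.

    Illustrative stand-in for the decomposition described in the text:
    project each mode onto its top singular subspace, form the core
    tensor, and reconstruct. The truncation discards noise/redundancy
    outside the low-rank subspaces.
    """
    # One orthonormal factor per mode: top-r left singular vectors
    # of the mode unfolding.
    factors = []
    for mode, r in enumerate(ranks):
        unfolded = np.moveaxis(x, mode, 0).reshape(x.shape[mode], -1)
        u, _, _ = np.linalg.svd(unfolded, full_matrices=False)
        factors.append(u[:, :r])
    # Core tensor: contract each mode with U^T.
    core = x
    for mode, u in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(u.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    # Reconstruction: contract the core with U along each mode.
    rec = core
    for mode, u in enumerate(factors):
        rec = np.moveaxis(
            np.tensordot(u, np.moveaxis(rec, mode, 0), axes=1), 0, mode)
    return rec

# Synthetic "representation" with true multilinear rank (2, 2, 2) plus noise,
# mimicking a channels x height x width feature tensor.
rng = np.random.default_rng(0)
g = rng.normal(size=(2, 2, 2))
u0, u1, u2 = (rng.normal(size=(8, 2)), rng.normal(size=(6, 2)),
              rng.normal(size=(6, 2)))
x = np.einsum('abc,ia,jb,kc->ijk', g, u0, u1, u2)
x = x + 0.01 * rng.normal(size=x.shape)
rec = hosvd_lowrank(x, ranks=(2, 2, 2))
print(np.linalg.norm(x - rec) / np.linalg.norm(x))  # small relative error
```

Because the reconstruction is built from SVDs and tensor contractions, the same computation can be expressed with differentiable operators in an autodiff framework, which is what makes it usable as a layer on top of pre-trained representations.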



Abstract

The invention discloses a compressed representation learning method based on tensor decomposition. The method comprises the following steps: first, a representation learned by the network is preprocessed by a differentiable neural network and converted into a tensor to be decomposed; tensor decomposition is then carried out on the basis of an optimization algorithm, the subspace of the decomposition is solved, and a low-rank reconstruction is performed; finally, the low-rank representation, extracted by processing the tensor decomposition with another differentiable neural network, is fused into the representation learned by the backbone network, where it plays a regularizing role. A truncated single-step gradient optimization method is combined with the multi-step, iteratively unrolled model to improve the optimization algorithm. The method provides regularization and supplementation for large-scale pre-training and representation learning in a computation-friendly and parameter-efficient manner; its effectiveness is verified on a large number of computer vision tasks and applications, with notable results in image recognition, semantic segmentation, and object detection, and it outperforms the attention mechanism commonly used in computer vision while requiring lighter computation and fewer parameters.
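The abstract describes a three-step forward pass: reshape the backbone representation into a tensor to decompose, solve the decomposition subspace with an iterative optimization algorithm and reconstruct a low-rank version, then fuse the low-rank branch back into the backbone features as a regularizer. The sketch below illustrates that pipeline under our own simplifying assumptions: the "tensor" is flattened to a matrix, the optimization step is plain subspace (orthogonal) iteration, and the fusion is a residual addition; all names and the `alpha` weight are hypothetical, not from the patent.

```python
import numpy as np

def lowrank_via_subspace_iteration(mat, rank, n_iter=5):
    """Rank-`rank` approximation via orthogonal (subspace) iteration.

    Stand-in for the optimization-based decomposition step. In an autodiff
    framework, the truncated single-step gradient trick mentioned in the
    abstract would backpropagate only through the final iterate of this
    loop, treating earlier iterates as constants.
    """
    rng = np.random.default_rng(0)
    q = np.linalg.qr(rng.normal(size=(mat.shape[1], rank)))[0]
    for _ in range(n_iter):
        q = np.linalg.qr(mat.T @ (mat @ q))[0]  # iterate the subspace
    return (mat @ q) @ q.T                      # low-rank reconstruction

def compressed_representation(feat, rank=4, alpha=0.1):
    """Forward sketch of the pipeline (names ours):
    1. reshape backbone features (C, H, W) into a matrix to decompose,
    2. solve the subspace iteratively and reconstruct low-rank,
    3. fuse the low-rank branch back into the features (residual add)
       so it regularizes rather than replaces the representation.
    """
    c, h, w = feat.shape
    mat = feat.reshape(c, h * w)                    # step 1
    lowrank = lowrank_via_subspace_iteration(mat, rank)  # step 2
    return feat + alpha * lowrank.reshape(c, h, w)  # step 3

feat = np.random.default_rng(1).normal(size=(16, 8, 8))
out = compressed_representation(feat)
print(out.shape)  # (16, 8, 8)
```

Fusing the low-rank branch additively keeps the module cheap: its cost is a few matrix products per forward pass, which is consistent with the abstract's claim of lighter computation than attention.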

Description

technical field

[0001] The invention belongs to the technical field of representation learning and deep neural network structure design, and in particular relates to a compressed representation learning method based on tensor decomposition.

background technique

[0002] In recent years, representation learning has achieved great success in the field of machine learning. Representation learning extracts distributed representations from data and applies regularization to the representations to achieve disentanglement. Distributed representations can represent exponentially large-scale information with polynomial-level complexity, while a disentangled vector representation can separate the independently varying factors of information in the data. Representation learning is general and has proven beneficial for a variety of downstream tasks.

[0003] The concept of representation learning is quite broad. It is generally believed that transformations based on multi-...

Claims


Application Information

IPC(8): G06N3/04; G06N3/08; G06K9/62
CPC: G06N3/084; G06N3/045; G06F18/23213
Inventors: 林宙辰, 耿正阳, 陈鸿旭, 陈鑫
Owner: ZHEJIANG LAB