
Native tensor processors and systems using native tensor processors

A native tensor processor and systems using it, applied in electrical digital data processing, instruments, and machine learning. It addresses the problems that custom integrated circuits are expensive to build, expensive to rebuild in newer versions when application requirements change, and quick to become outdated as hardware technology advances.

Active Publication Date: 2019-07-12
广州异构智能科技有限公司

AI Technical Summary

Problems solved by technology

However, building custom integrated circuits is expensive, and once they are built, it is also expensive to produce newer versions if the application's requirements change. Custom hardware can also quickly become obsolete as technology advances.




Embodiment Construction

[0038] The drawings and the following description refer to preferred embodiments by way of illustration only. It should be understood from the following discussion that alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles of what is claimed.

[0039] Many deep learning models, including neural networks, convolutional neural networks, and other supervised machine learning models, use multilayer architectures with tensor processing between layers. Figure 1 is a diagram of one layer of a deep convolutional neural network. In this example, the previous layer provides p input feature maps (input planes), each with m input tiles per plane. The p input planes are filtered by a p×n filter bank, producing the next layer with n output feature maps (output planes), each with m output tiles per plane.
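The layer structure described above can be illustrated with a minimal NumPy sketch. The sizes (p, n, spatial dimensions, filter width) are illustrative choices, not values from the patent; the point is that each of the n output planes sums filtered contributions from all p input planes through the p×n filter bank.

```python
import numpy as np

# Illustrative sizes (not from the patent): p input planes, n output
# planes, h×w spatial extent per plane, k×k filters.
p, n, h, w, k = 3, 4, 8, 8, 3
rng = np.random.default_rng(0)

inputs = rng.random((p, h, w))        # p input feature maps
filters = rng.random((p, n, k, k))    # p×n filter bank

# Zero-pad spatially so output planes match the input extent.
pad = k // 2
padded = np.pad(inputs, ((0, 0), (pad, pad), (pad, pad)))

outputs = np.zeros((n, h, w))         # n output feature maps
for j in range(n):                    # each output plane...
    for i in range(p):                # ...accumulates over all p input planes
        for y in range(h):
            for x in range(w):
                outputs[j, y, x] += np.sum(
                    padded[i, y:y + k, x:x + k] * filters[i, j])

print(outputs.shape)  # (4, 8, 8): n output planes of h×w
```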

[0040] The processing between layers usually includes tensor contraction, o...
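Although the paragraph is truncated here, the abstract below explains that the contraction engine computes a tensor contraction via equivalent matrix multiplications, as if the tensors were unfolded into matrices. A small NumPy sketch of that equivalence (with made-up dimensions) might look like:

```python
import numpy as np

# A tensor contraction C[a,b,c,d] = sum_k A[a,b,k] * B[k,c,d] is
# mathematically equivalent to a matrix multiplication after unfolding
# A into an (a*b)×k matrix and B into a k×(c*d) matrix.
rng = np.random.default_rng(1)
A = rng.random((2, 3, 5))
B = rng.random((5, 4, 6))

# Direct contraction over the shared index k.
C = np.einsum('abk,kcd->abcd', A, B)

# Equivalent matrix multiplication on the unfolded tensors.
C_mat = (A.reshape(2 * 3, 5) @ B.reshape(5, 4 * 6)).reshape(2, 3, 4, 6)

print(np.allclose(C, C_mat))  # True
```

The patent's hardware avoids materializing these unfoldings; the reshape here only demonstrates the mathematical equivalence.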



Abstract

A native tensor processor calculates tensor contractions using a sum of outer products. In one implementation, the native tensor processor preferably is implemented as a single integrated circuit and includes an input buffer and a contraction engine. The input buffer buffers tensor elements retrieved from off-chip and transmits the elements to the contraction engine as needed. The contraction engine calculates the tensor contraction by executing calculations from equivalent matrix multiplications, as if the tensors were unfolded into matrices, but avoids the overhead of expressly unfolding the tensors. The contraction engine includes a plurality of outer product units that calculate matrix multiplications by a sum of outer products. By using outer products, the equivalent matrix multiplications can be partitioned into smaller matrix multiplications, each of which is localized with respect to which tensor elements are required.
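The sum-of-outer-products decomposition at the heart of the abstract can be sketched in a few lines of NumPy (sizes are illustrative): a matrix product A·B equals the sum over k of the outer product of A's k-th column with B's k-th row, so each partial result depends only on one column of A and one row of B, which is what makes the partitioning localized.

```python
import numpy as np

rng = np.random.default_rng(2)
M, K, N = 4, 5, 6
A = rng.random((M, K))
B = rng.random((K, N))

# Matrix multiplication as a sum of K rank-1 outer products.
# Each term touches only column k of A and row k of B, so the
# work decomposes into small, localized pieces.
C = np.zeros((M, N))
for k in range(K):
    C += np.outer(A[:, k], B[k, :])

print(np.allclose(C, A @ B))  # True
```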

Description

[0001] Cross Reference to Related Applications

[0002] This application is a continuation of U.S. Patent Application No. 15/593192, filed on May 11, 2017, entitled "Native Tensor Processor, Using Outer Product Unit". The subject matter of the aforementioned application is incorporated herein by reference in its entirety.

Technical Field

[0003] The present disclosure generally relates to tensor processing, including tensor contraction.

Background

[0004] As technology advances, more and more data is created and analyzed every day. Machine learning techniques, such as deep learning and convolutional neural networks, are gaining importance as methods for analyzing these large amounts of data. However, the computational performance of such large-scale tasks has become increasingly dominated by the cost of moving data to the correct processing elements for computation.

[0005] Conventional parallel processors have struggled to handle these...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N20/00, G06F15/76
CPC: G06F15/76, G06F2015/761
Inventors: 吕坚平, 邓宇轩
Owner: 广州异构智能科技有限公司