
Implementation methods of neural network accelerator and neural network model

A neural network model and neural network technology, applied to biological neural network models and their physical realization, solving problems such as easy data loss, poor flexibility, and limited data storage bandwidth, and achieving low power consumption, improved performance, and reduced memory consumption.

Inactive Publication Date: 2017-03-08
SHANGHAI XINCHU INTEGRATED CIRCUIT
Cites: 4 · Cited by: 34

AI Technical Summary

Problems solved by technology

[0006] To address the problems that a neural network accelerator implemented in software has poor flexibility, while a hardware implementation is limited in data storage bandwidth and easily loses data after power failure, the present invention provides implementation methods for a neural network accelerator and a neural network model.




Embodiment Construction

[0029] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments, but not as a limitation of the present invention.

[0030] The invention proposes a method for realizing a neural network accelerator based on a nonvolatile memory. The nonvolatile memory adopts a non-planar design: multiple layers of data storage cells are stacked vertically using the back-end-of-line (BEOL) manufacturing process to obtain higher storage density, accommodating greater storage capacity in a smaller space and thereby bringing significant cost savings and lower energy consumption; 3D NAND memory and 3D phase-change memory are examples. Beneath these 3D data storage arrays 1 lie the peripheral logic circuits of the memory, prepared by the front-end-of-line (FEOL) manufacturing process. As memory chip capacity continues to grow, the data storage array 1 becomes larger and larger, but the area of the corresponding peripheral log...
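The layout described above can be illustrated with a minimal simulation. The class names (`NVMArray`, `Accelerator`) and the key/value interface are hypothetical, not from the patent text; the sketch only mirrors the claimed arrangement, in which the accelerator logic sits directly beneath the nonvolatile storage array, so weights are read locally rather than over an external memory bus and persist without power.

```python
import numpy as np

class NVMArray:
    """Stands in for the 3D nonvolatile data storage array (BEOL layer).

    Contents persist across simulated power cycles, unlike DRAM.
    """
    def __init__(self):
        self._cells = {}

    def write(self, key, value):
        self._cells[key] = np.asarray(value, dtype=float)

    def read(self, key):
        return self._cells[key]

class Accelerator:
    """Stands in for the accelerator circuit fabricated under the array (FEOL layer)."""
    def __init__(self, array: NVMArray):
        self.array = array  # direct, on-die access to storage

    def multiply_accumulate(self, x):
        # Weights and bias are fetched from the local array; no off-chip
        # bus is modeled, reflecting the claim that storage bandwidth is
        # not the bottleneck in this layout.
        w = self.array.read("weights")
        b = self.array.read("bias")
        return w @ x + b

nvm = NVMArray()
nvm.write("weights", [[1.0, 2.0], [3.0, 4.0]])
nvm.write("bias", [0.5, -0.5])

acc = Accelerator(nvm)
y = acc.multiply_accumulate(np.array([1.0, 1.0]))
# w @ x + b = [3.0, 7.0] + [0.5, -0.5] = [3.5, 6.5]
```

This is a functional sketch only; it says nothing about the physical circuit, and the choice of a matrix-vector multiply-accumulate as the accelerated operation is an assumption for illustration.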



Abstract

The invention relates to a neural network data processing method, and in particular to implementation methods for a neural network accelerator and a neural network model. In the implementation method of the neural network accelerator, the accelerator comprises a nonvolatile memory, and the nonvolatile memory comprises a data storage array prepared in a back-end manufacturing process; in the front-end manufacturing process, a neural network accelerator circuit is prepared on the silicon substrate beneath the data storage array. In the implementation method of the neural network model, the model comprises an input signal, a connection weight signal, a bias, an activation function, an operation function, and an output signal; the activation function and the operation function are implemented by the neural network accelerator circuit, while the input signal, connection weight, bias, and output signal are stored in the data storage array. Because the neural network accelerator circuit is implemented directly under the data storage array, the data storage bandwidth is not restricted, and data is not lost after power failure.

Description

Technical Field

[0001] The invention relates to a data processing method for a neural network, and in particular to a method for realizing a neural network accelerator.

[0002] The invention also relates to a method for realizing a model, and in particular to a method for realizing a neural network model.

Background Technique

[0003] The artificial neural network (ANN) has been a research hotspot in the field of artificial intelligence since the 1980s. It abstracts the neuron network of the human brain from the perspective of information processing, establishes a simple model, and forms different networks according to different connection methods. In engineering and academia it is often referred to simply as a neural network. A neural network is a computational model consisting of a large number of interconnected nodes (or neurons); Figure 1 shows a schematic diagram of a neuron. Each node represents a specific output function, called the activation function; every...
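The neuron model sketched in the background section can be expressed compactly: each node applies an activation function to the weighted sum of its inputs plus a bias. A minimal sketch follows; the sigmoid activation is an assumption for illustration, since the patent text does not fix a particular activation function.

```python
import numpy as np

def sigmoid(z):
    """A common activation function choice (illustrative assumption)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron(inputs, weights, bias, activation=sigmoid):
    """output = activation(sum_i w_i * x_i + b) -- the node model from Figure 1."""
    return activation(np.dot(weights, inputs) + bias)

x = np.array([0.5, -1.0, 2.0])   # input signals
w = np.array([0.4, 0.3, 0.1])    # connection weights
b = 0.1                          # bias
out = neuron(x, w, b)
# weighted sum = 0.2 - 0.3 + 0.2 = 0.1; plus bias = 0.2; sigmoid(0.2) ≈ 0.55
```

In the claimed implementation, the weighted-sum and activation computations would be carried out by the accelerator circuit, while `x`, `w`, `b`, and the output would reside in the data storage array.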

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/06
CPC: G06N3/06
Inventors: 易敬军, 陈邦明, 王本艳
Owner: SHANGHAI XINCHU INTEGRATED CIRCUIT