
Neural network caching method, system and device and storage medium

A neural network caching technology, applied in the field of neural network algorithms. It addresses the problems that implementations and deployments of convolutional neural networks on hardware platforms are increasingly differentiated, that hardware designs lack flexibility across network dimensions and sizes, and that an inflexible cache cannot meet high-performance computing needs. Its effects include reducing hardware cache resource overhead, reducing the amount of data moved, and avoiding cache congestion.

Pending Publication Date: 2022-07-12
SUN YAT SEN UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Networks of different dimensions must allocate additional resources to handle the dimensional differences, wasting the allocation of computing resources; for networks of different sizes, a cache that cannot be flexibly configured fails to meet high-performance computing needs and becomes a performance bottleneck.
The implementation and deployment of convolutional neural networks on hardware platforms are increasingly differentiated, and hardware designs lack the flexibility to support multiple network dimensions and sizes.

Method used




Embodiment Construction

[0046] This section describes specific embodiments of the present invention in detail; preferred embodiments are shown in the accompanying drawings. The individual technical features and overall technical solutions of the invention should not be construed as limiting its protection scope.

[0047] In the description of the embodiments of the present invention, "several" means one or more, and "multiple" means two or more; "greater than", "less than", "exceeding" and the like are understood as excluding the stated number, while "above", "below", "within" and the like are understood as including it. "At least one" refers to one or more, and "at least one of the following" and similar expressions refer to any combination of these items, including any combination of single or plural items. Descriptions such as "first" and "second" serve only to distinguish technical features and cannot be understood as indicating or...



Abstract

The invention discloses a caching method, system, device and storage medium for a neural network. The method comprises the following steps: acquiring configuration information that includes the neural network dimension; setting the working mode of the cache according to the configuration information; obtaining target to-be-processed data through the configured cache; and processing the target to-be-processed data through the configured cache according to the configuration information. Because the cache is configured to determine the target data, and different data are handled with different processing schemes according to the configuration information, cache mapping for neural networks of different dimensions is realized, cache congestion is avoided, and data processing efficiency is improved, enabling writing and output with high concurrency and high throughput. The method can be widely applied in the technical field of neural network algorithms.
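The four claimed steps (acquire configuration, set a cache working mode, fetch target data, process it under that mode) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the class name `ConfigurableCache`, its methods, and the mode/scale fields are all hypothetical.

```python
# Hypothetical sketch of the four claimed steps; all names are
# illustrative assumptions, not taken from the patent.

class ConfigurableCache:
    """A cache whose working mode depends on the network dimension."""

    def __init__(self):
        self.mode = None
        self.lines = {}

    def set_mode(self, config):
        # Step 2: choose a working mode from the configuration
        # (e.g. layouts for 1-D, 2-D or 3-D networks).
        self.mode = f"{config['dim']}d"

    def fetch(self, source, keys):
        # Step 3: load only the target data needed under this mode,
        # reducing the amount of data moved into the cache.
        for k in keys:
            self.lines[k] = source[k]
        return [self.lines[k] for k in keys]

    def process(self, data, config):
        # Step 4: apply the mode-specific processing scheme.
        if self.mode == "1d":
            return [x * config["scale"] for x in data]
        return [[x * config["scale"] for x in row] for row in data]


# Step 1: acquire configuration information, including the dimension.
config = {"dim": 1, "scale": 2}
cache = ConfigurableCache()
cache.set_mode(config)
data = cache.fetch({"a": 1, "b": 3}, ["a", "b"])
print(cache.process(data, config))  # -> [2, 6]
```

The point of the sketch is only the control flow: the same cache object serves differently dimensioned networks because its behavior is selected by configuration rather than fixed in hardware.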

Description

Technical Field
[0001] The invention relates to the technical field of neural network algorithms, and in particular to a neural network caching method, system, device and storage medium.
Background
[0002] Neural networks of different dimensions and sizes differ. Networks of different dimensions must allocate additional resources to handle the dimensional differences, wasting the allocation of computing resources; for networks of different sizes, a cache that cannot be flexibly configured fails to meet high-performance computing needs and becomes a performance bottleneck. The implementation and deployment of convolutional neural networks on hardware platforms are increasingly differentiated, and hardware designs lack the flexibility to support multiple network dimensions and sizes.
[0003] In summary, these problems in the related art urgently need to be solved.
Summary of the Invention
[0004] ...
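The background's complaint, that a fixed cache layout tuned for one dimensionality serves other network shapes poorly, can be illustrated with a dimension-parameterized bank mapping. This is a hedged illustration only: the bank count and the address formula are assumptions, not taken from the patent.

```python
# Hypothetical illustration of dimension-aware cache bank mapping;
# BANKS and the address formula are assumptions, not from the patent.

BANKS = 4  # assumed number of cache banks

def bank_of(coords, dims):
    """Map an N-D tensor coordinate to a cache bank.

    A mapping hard-wired for one dimensionality can leave banks idle
    for other network shapes; parameterizing on `dims` lets the same
    hardware spread accesses evenly for 1-D, 2-D or 3-D networks.
    """
    # Flatten the N-D coordinate row-major, then interleave over banks.
    flat = 0
    for c, d in zip(coords, dims):
        flat = flat * d + c
    return flat % BANKS

# The same mapping logic serves a 1-D and a 2-D network.
print(bank_of((5,), (16,)))     # -> 1
print(bank_of((1, 2), (4, 8)))  # -> 2
```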

Claims


Application Information

Patent Timeline
No application timeline available.
IPC (8): G06N3/063; G06N3/04; G06F12/0806
CPC: G06N3/063; G06F12/0806; G06N3/045; Y02D10/00
Inventors: 王鉴, 虞志益, 邓慧鹏, 叶华锋, 肖山林
Owner SUN YAT SEN UNIV