
Node storage method and system based on neural network, server and storage medium

A neural network and storage system technology, applied in the field of node storage, which addresses problems such as rearranging the node storage order and achieves the effects of improving storage space utilization and accelerating the graph inference process.

Active Publication Date: 2020-06-19
SHENZHEN CORERAIN TECH CO LTD

Problems solved by technology

However, there is a significant difference in data expression between the data flow architecture and the instruction set architecture: the operator granularity of the data flow architecture is much larger than that of the instruction set architecture, and a data flow operator must determine the order of its calculation modules in advance, before computation, according to their data dependencies.
[0004] At present, how to rearrange the node storage order during hardware operation remains an unresolved technical problem in the prior art.
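To make the data-dependence ordering concrete, the following is a minimal sketch of a topological sort over a toy compute graph. The node names and graph shape are invented for illustration and are not taken from the patent.

```python
# Hypothetical sketch: ordering the compute modules of a dataflow graph by their
# data dependencies (a plain topological sort). Node names are illustrative only.
from collections import deque

def topological_order(deps: dict[str, list[str]]) -> list[str]:
    """deps maps each node to the nodes it depends on (its inputs)."""
    indegree = {n: len(d) for n, d in deps.items()}
    consumers: dict[str, list[str]] = {n: [] for n in deps}
    for node, inputs in deps.items():
        for inp in inputs:
            consumers[inp].append(node)
    ready = deque(n for n, k in indegree.items() if k == 0)
    order = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for consumer in consumers[node]:
            indegree[consumer] -= 1
            if indegree[consumer] == 0:
                ready.append(consumer)
    return order

# Example: conv1 feeds two branches, which meet again at add.
deps = {"input": [], "conv1": ["input"], "relu": ["conv1"],
        "pool": ["conv1"], "add": ["relu", "pool"]}
print(topological_order(deps))  # ['input', 'conv1', 'relu', 'pool', 'add']
```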

Method used

Figure 1 is a flow chart of the neural network-based node storage method provided by Embodiment 1 of the present invention; Figure 2 is a flow chart of the method provided by Embodiment 2; Figure 3 is a schematic structural diagram of the node storage system provided by Embodiment 3.

Examples


Embodiment 1

[0049] Figure 1 is a flow chart of a neural network-based node storage method provided by Embodiment 1 of the present invention. This embodiment is applicable to situations where the ordering of nodes is optimized before neural network calculations are performed on them. The method can be executed by a processor. As shown in Figure 1, the neural network-based node storage method includes:

[0050] Step S110, receiving a plurality of computing nodes with logical relationships sent by the off-chip memory;

[0051] Specifically, a memory is a device used to store information in modern information technology. The concept is very broad and has many levels: in a digital system, anything that can store binary data can serve as a memory; in an integrated circuit, a circuit with a storage function but no separate physical form is also called a memory, such as a RAM or a FIFO; in a system, storage devices with a physical form are likewise called storage devices, such as m...
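As a hedged illustration of step S110 and the path confirmation it enables, the sketch below models computing nodes with logical relationships and enumerates the computation paths that branch from a shared starting node. The ComputeNode fields, helper names, and the toy graph are assumptions made for illustration; the patent does not fix a concrete data layout for nodes read from off-chip memory.

```python
# Minimal sketch of computing nodes with logical relationships and of listing
# the computation paths that share a starting node. Names are assumptions.
from dataclasses import dataclass, field

@dataclass
class ComputeNode:
    name: str
    successors: list["ComputeNode"] = field(default_factory=list)  # logical relationships

def enumerate_paths(start: ComputeNode) -> list[list[str]]:
    """Walk every path from a shared starting node down to the sinks of the graph."""
    if not start.successors:
        return [[start.name]]
    paths = []
    for nxt in start.successors:
        for tail in enumerate_paths(nxt):
            paths.append([start.name] + tail)
    return paths

# Two computation paths branching off the same starting node "conv".
conv, relu, pool, concat = (ComputeNode(n) for n in ("conv", "relu", "pool", "concat"))
conv.successors = [relu, pool]
relu.successors = [concat]
pool.successors = [concat]
print(enumerate_paths(conv))
# [['conv', 'relu', 'concat'], ['conv', 'pool', 'concat']]
```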

Embodiment 2

[0064] Embodiment 2 of the present invention further optimizes Embodiment 1. Figure 2 is a flow chart of the neural network-based node storage method provided by Embodiment 2 of the present invention. As shown in Figure 2, the neural network-based node storage method of this embodiment includes:

[0065] Step S210, receiving a plurality of computing nodes with logical relationships sent by the off-chip memory;

[0066] Specifically, a memory is a device used to store information in modern information technology. The concept is very broad and has many levels: in a digital system, anything that can store binary data can serve as a memory; in an integrated circuit, a circuit with a storage function but no separate physical form is also called a memory, such as a RAM or a FIFO; in a system, storage devices with a physical form are likewise called storage devices, such as memory sticks and TF cards. All information in the computer, including the input raw data, c...
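The branch described in Embodiments 1 and 2 (store all nodes in a first order when the two paths share an end node, otherwise in a second order) can be sketched as below. The excerpt does not define what the first and second orders are, so the interleaved and sequential layouts used here are purely illustrative assumptions.

```python
# Hedged sketch of the order-selection branch. The concrete "first order" and
# "second order" are NOT defined in the excerpt; the layouts below are assumed.
def choose_storage_order(first_path: list[str], second_path: list[str]) -> list[str]:
    same_end = first_path[-1] == second_path[-1]
    if same_end:
        # Assumed "first order": interleave the two branches so the shared end
        # node sits close to both of its inputs, then append it once.
        body = [n for pair in zip(first_path[:-1], second_path[:-1]) for n in pair]
        candidates = body + [first_path[-1]]
    else:
        # Assumed "second order": store each path sequentially.
        candidates = first_path + second_path
    seen, order = set(), []
    for n in candidates:          # drop duplicate nodes such as the shared start
        if n not in seen:
            seen.add(n)
            order.append(n)
    return order

print(choose_storage_order(["conv", "relu", "concat"], ["conv", "pool", "concat"]))
# ['conv', 'relu', 'pool', 'concat']
print(choose_storage_order(["conv", "relu", "out_a"], ["conv", "pool", "out_b"]))
# ['conv', 'relu', 'out_a', 'pool', 'out_b']
```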

Embodiment 3

[0088] Figure 3 is a schematic structural diagram of a neural network-based node storage system provided by Embodiment 3 of the present invention. As shown in Figure 3, a neural network-based node storage system 300 includes:

[0089] The receiving module 310 is configured to receive a plurality of computing nodes with logical relationships sent by the off-chip memory;

[0090] The traversing module 320 is configured to traverse the logical relationships among the plurality of computing nodes to confirm a first computing path and a second computing path, where the first computing path and the second computing path include at least one same starting node;

[0091] The first judging module 330 is configured to judge whether the first calculation path and the second calculation path have the same end node;

[0092] The first storage module 340 is configured to store all nodes in the first calculation path and the second calculation path in a first order if the same end node exists i...
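A structural sketch of how the module split of Embodiment 3 might look in code is given below. The class and method names are illustrative assumptions, and the simple list-based result stands in for whatever on-chip storage layout the patent actually uses.

```python
# Structural sketch mirroring the module split of Embodiment 3 (receiving 310,
# traversing 320, judging 330, storage 340). All names are assumptions.
class ReceivingModule:
    def receive(self, off_chip_nodes):
        # Analogous to module 310: accept nodes with logical relationships.
        return dict(off_chip_nodes)

class TraversingModule:
    def find_paths(self, edges, start):
        # Analogous to module 320: walk the logical relationships to confirm
        # two computing paths that share the same starting node.
        def walk(node):
            nxt = edges.get(node, [])
            if not nxt:
                return [[node]]
            return [[node] + tail for n in nxt for tail in walk(n)]
        first, second, *_ = walk(start)
        return first, second

class JudgingModule:
    def same_end_node(self, first, second):
        # Analogous to module 330: do the two paths end at the same node?
        return first[-1] == second[-1]

class StorageModule:
    def store(self, first, second, same_end):
        # Analogous to module 340: pick a storage order based on the judgment
        # (placeholder layout; the patent's actual orders are not shown here).
        label = "first order" if same_end else "second order"
        return label, sorted(set(first + second))

edges = {"conv": ["relu", "pool"], "relu": ["concat"], "pool": ["concat"], "concat": []}
nodes = ReceivingModule().receive(edges)
first, second = TraversingModule().find_paths(nodes, "conv")
print(StorageModule().store(first, second, JudgingModule().same_end_node(first, second)))
```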


Abstract

The invention discloses a node storage method and system based on a neural network, a server, and a storage medium. The method comprises the following steps: receiving a plurality of computing nodes with logical relationships sent by an off-chip memory; traversing the logical relationships among the plurality of computing nodes to confirm a first computing path and a second computing path, wherein the first computing path and the second computing path comprise at least one same starting node; judging whether the first computing path and the second computing path have the same end node; if the same end node exists, storing all nodes in the first computing path and the second computing path according to a first order; and if the same end node does not exist, storing all nodes in the first computing path and the second computing path according to a second order. By optimizing the node storage order, the technical effects of improving storage space utilization and accelerating the graph inference process are achieved.

Description

Technical field

[0001] Embodiments of the present invention relate to node storage technology in deep learning, and in particular to a neural network-based node storage method, system, server, and storage medium.

Background technique

[0002] Deep learning networks are usually trained by algorithms. In most cases, algorithm developers tend to use existing public deep learning frameworks for model training, and most public deep learning frameworks are designed for computing devices such as CPUs/GPUs. CPUs/GPUs adopt the traditional instruction set architecture, which has low architectural efficiency and high flexibility.

[0003] With the development of deep learning technologies, the requirements for computing power are getting higher and higher, and the architectural efficiency defects of traditional instruction sets can no longer meet the needs of application scenarios. In contrast, the data flow architecture is more efficient and is better suited to the development tr...


Application Information

IPC(8): G06F3/06; G06N3/063
CPC: G06F3/0635; G06F3/0638; G06F3/0611; G06N3/063; Y02D10/00
Inventor: 马恺, 熊超, 蔡权雄, 牛昕宇
Owner: SHENZHEN CORERAIN TECH CO LTD