A Memory Allocation Method for Neural Networks

A memory allocation method for neural networks, in the fields of computer science and artificial intelligence. It addresses the problems that existing approaches are unsuitable for practical use or require substantial labor time, and achieves full automation, a reduced memory footprint, and reduced memory fragmentation.

Active Publication Date: 2022-08-05
HANGZHOU NATCHIP SCI & TECH
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

This method can improve memory utilization, but it requires a great deal of manual effort, so it is not suitable for use in real projects.

Method used



Examples


Detailed Description of the Embodiments

[0042] The technical solutions of the present invention are further described below with reference to the accompanying drawings and embodiments. It should be noted that the drawings are for illustration only and should not be construed as limiting the patent. The invention may be implemented in various forms and should not be limited to the embodiments set forth herein; these embodiments are provided so that the disclosure is more easily understood and is conveyed fully to those skilled in the art.

[0043] As shown in Figure 1, a neural network memory allocation method proceeds as follows:

[0044] S1. Obtain the computation units in the computation graph and number each computation unit sequentially according to the execution order (see the code sketch after this excerpt); the details are as follows:

[0045] S11. Traverse the neural network computation graph, remove the operation units in which the input tensor and the output te...
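
The excerpt is cut off above, so only the numbering step described in S1 can be illustrated. Below is a minimal sketch, assuming the computation units are numbered by a simple dependency-driven (topological) traversal of the graph; all identifiers (CalcUnit, number_calc_units, and so on) are hypothetical and not taken from the patent.

    # Minimal sketch of step S1, assuming a dependency-driven traversal.
    # All names (CalcUnit, number_calc_units, ...) are illustrative only.
    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class CalcUnit:
        name: str
        inputs: list     # names of the unit's input tensors
        outputs: list    # names of the unit's output tensors
        order: int = -1  # sequential number assigned in S1

    def number_calc_units(units):
        """Number every computation unit in execution order."""
        produced_by = {t: u for u in units for t in u.outputs}
        indegree = {id(u): 0 for u in units}
        consumers = {id(u): [] for u in units}
        for u in units:
            for t in u.inputs:
                producer = produced_by.get(t)
                if producer is not None:
                    indegree[id(u)] += 1
                    consumers[id(producer)].append(u)
        ready = deque(u for u in units if indegree[id(u)] == 0)
        counter = 0
        while ready:
            u = ready.popleft()
            u.order = counter          # the unit's computation number
            counter += 1
            for v in consumers[id(u)]:
                indegree[id(v)] -= 1
                if indegree[id(v)] == 0:
                    ready.append(v)
        return sorted(units, key=lambda u: u.order)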



Abstract

The invention discloses a memory allocation method for a neural network. Traditional dynamic memory allocation wastes a great deal of memory, while manual memory allocation requires considerable labor time. The method first obtains the computation units in the computation graph and numbers each computation unit sequentially according to the execution order; it then obtains, for all computation units in the model, the set of computation numbers over which each memory-reusable tensor is in use; finally, it determines the memory allocation of each memory-reusable tensor and obtains the total reusable memory size required by the model together with the allocated memory address of each memory-reusable tensor. The method effectively reduces the memory fragmentation generated when the neural network model requests and releases memory, reduces the total memory required by the model, and can be conveniently used in practical engineering.
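
The abstract describes recording, for each memory-reusable tensor, the computation numbers over which it is live, and then deriving a total reusable memory size and an offset for each tensor. The excerpt does not disclose the exact allocation procedure, so the following is only a sketch of one common way such lifetime-based planning can be realized (size-sorted, first-fit placement); TensorInfo, plan_memory, and the example lifetimes are all illustrative assumptions, not the patent's algorithm.

    # Hedged sketch of lifetime-based offset planning for memory-reusable
    # tensors, using a greedy size-sorted, first-fit strategy purely for
    # illustration. All identifiers are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class TensorInfo:
        name: str
        size: int        # bytes required by the tensor
        first_use: int   # first computation number that touches the tensor
        last_use: int    # last computation number that touches the tensor

    def plan_memory(tensors):
        """Return (total reusable memory size, {tensor name: offset})."""
        placed = []      # (offset, TensorInfo) pairs already assigned
        offsets = {}
        # Place large tensors first; ties broken by earlier first use.
        for t in sorted(tensors, key=lambda x: (-x.size, x.first_use)):
            # Address ranges of placed tensors whose lifetimes overlap t's.
            conflicts = sorted(
                (off, off + p.size) for off, p in placed
                if not (p.last_use < t.first_use or p.first_use > t.last_use)
            )
            # First fit: earliest gap between conflicting ranges that fits t.
            cursor, chosen = 0, None
            for lo, hi in conflicts:
                if lo - cursor >= t.size:
                    chosen = cursor
                    break
                cursor = max(cursor, hi)
            if chosen is None:
                chosen = cursor
            offsets[t.name] = chosen
            placed.append((chosen, t))
        total = max((off + t.size for off, t in placed), default=0)
        return total, offsets

    # Usage with made-up lifetimes: tensors whose lifetimes do not overlap
    # end up sharing the same address range.
    total, offsets = plan_memory([
        TensorInfo("conv1_out", 1024, first_use=0, last_use=1),
        TensorInfo("relu1_out", 1024, first_use=1, last_use=2),
        TensorInfo("conv2_out", 2048, first_use=2, last_use=3),
    ])
    print(total, offsets)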

Description

Technical Field

[0001] The invention belongs to the technical field of computers, in particular to the field of artificial intelligence, and specifically relates to a memory allocation method for a neural network.

Background Technique

[0002] Artificial intelligence has developed rapidly in recent years, and deep learning and neural networks are the foundation of that development. Because neural networks tend to have many layers and large tensors, they consume a great deal of memory. At the same time, the need to deploy neural networks on embedded devices has grown stronger in recent years, so optimizing memory allocation is crucial.

[0003] One memory optimization approach is to use a traditional dynamic memory allocator, such as the malloc function in the C standard library. However, this kind of dynamic allocation does not allocate memory from a more global perspective, which is pro...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06N3/063G06F9/50G06F12/02
CPCG06N3/063G06F9/5016G06F12/0246
Inventors: 郑迪, 任俊林, 刘祥有, 凌云
Owner: HANGZHOU NATCHIP SCI & TECH