Memory management method and device in neural network forward calculation process

A forward-computing and neural-network technology, applied in the computer field. It addresses problems such as excessive memory consumption that crowds out the memory available to other services and affects their operation, and achieves the effects of reducing the number of allocations, enabling memory-block reuse, and improving memory usage efficiency.

Active Publication Date: 2018-11-16
ZHEJIANG DAHUA TECH CO LTD


Problems solved by technology

[0003] In the existing memory management method for the neural network forward calculation process, one memory block is allocated for each layer into which the network is divided: the more layers the network has, the more memory blocks are allocated and the more memory is occupied. This inevitably crowds out the memory available to other services, affects their operation, and reduces the memory usage efficiency of the entire system.
[0004] Therefore, the existing memory management method has the technical problem of low memory usage efficiency.
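The one-block-per-layer scheme criticized in [0003] can be sketched minimally as follows. This is an illustrative reconstruction for a toy network, not code from the patent; the function and variable names are assumptions.

```python
# Naive per-layer allocation described in [0003]: every layer's output is
# kept in its own buffer, so the number of live memory blocks grows with
# network depth. All names here are illustrative, not from the patent.

def forward_naive(layers, x):
    buffers = []                      # one live output buffer per layer
    for f in layers:
        x = f(x)                      # read previous output, compute this layer
        buffers.append(x)             # this layer's output memory stays alive
    return buffers[-1], len(buffers)

# A toy 3-layer "network": the block count equals the number of layers.
layers = [lambda t: t + 1, lambda t: t * 2, lambda t: t - 3]
out, n_buffers = forward_naive(layers, 0.0)
assert (out, n_buffers) == (-1.0, 3)
```

With this scheme a 100-layer network holds 100 output buffers alive at once, even though most of them are never read again after the next layer consumes them, which is exactly the waste the invention targets.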




Embodiment Construction

[0051] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention are described clearly and completely below in conjunction with the drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the technical solution of the present invention. Based on the embodiments described in this application, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the technical solution of the present invention.

[0052] Memory is an important component of a device: all programs on the device must run in memory, yet memory resources are very limited. In deep learning algorithms, memory consumption continues to increase as the scale of the neural network grows.



Abstract

The invention discloses a memory management method and device for the neural network forward calculation process. The method comprises: obtaining the layer outputs of the layers; for each layer output, performing the following operations to allocate memory for it: if a first memory block whose counting mark is a first mark is found in a memory pool, allocating the first memory block to the current layer output and changing its first mark to a second mark, wherein the counting mark represents the use state of the corresponding memory block, the first mark indicates that the memory block is in a reusable state, and the second mark indicates that the memory block is in a non-reusable state; and obtaining the layer inputs of the layers, determining the previous-layer output of each layer input, and changing the counting mark of the memory block of that previous-layer output so that the block can be reused, wherein the previous-layer output denotes the layer output of the layer that transmits data to the layer input.
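The allocation scheme in the abstract can be sketched as follows. This is a minimal illustrative reconstruction, not the patented implementation; the class and constant names (`MemoryPool`, `REUSABLE`, `IN_USE`) are assumptions, and block sizes are simplified to a single integer.

```python
# Sketch of the counting-mark memory pool described in the abstract.
# REUSABLE corresponds to the "first mark", IN_USE to the "second mark".

REUSABLE = "first_mark"   # block may be handed out again
IN_USE = "second_mark"    # block currently holds a live layer output

class MemoryPool:
    def __init__(self):
        self.blocks = []          # list of [mark, size] entries
        self.owner = {}           # layer name -> index of its output block

    def alloc_for_output(self, layer, size):
        # Reuse the first block whose counting mark is the first mark.
        for i, blk in enumerate(self.blocks):
            if blk[0] == REUSABLE and blk[1] >= size:
                blk[0] = IN_USE                   # first mark -> second mark
                self.owner[layer] = i
                return i
        # Otherwise allocate a fresh block, marked non-reusable.
        self.blocks.append([IN_USE, size])
        self.owner[layer] = len(self.blocks) - 1
        return self.owner[layer]

    def release_inputs(self, producer_layers):
        # After a layer has consumed its inputs, the blocks holding the
        # previous layers' outputs become reusable again.
        for layer in producer_layers:
            self.blocks[self.owner[layer]][0] = REUSABLE

# A 3-layer chain needs only 2 blocks instead of 3:
pool = MemoryPool()
pool.alloc_for_output("layer1", 100)   # new block 0
pool.alloc_for_output("layer2", 100)   # block 0 still in use -> new block 1
pool.release_inputs(["layer1"])        # layer2 has consumed layer1's output
pool.alloc_for_output("layer3", 100)   # reuses block 0
assert len(pool.blocks) == 2
```

The key invariant is that a block is released only after the consuming layer has read it, so reuse never clobbers data that is still needed; for a linear chain the pool size stays constant regardless of depth.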

Description

technical field

[0001] The invention relates to the field of computers, in particular to a memory management method and device for the forward calculation process of a neural network.

Background technique

[0002] Memory is an important component of a device, but the memory resources of a device are very limited. In deep learning algorithms, memory consumption continues to increase with the expansion of the neural network. A neural network is divided into several layers. During forward calculation, the network obtains data from the input layer and performs calculations layer by layer: each layer obtains data from the output memory of the previous layer, performs its calculation, saves the result in its own output memory, and at the same time passes that result to the next layer.

[0003] In the memory management method of the existing neural network forward calculation process, one memory block is allocated for each layer into which the network is divided; the more layers the network has, the more memory blocks are allocated and the more memory is occupied, which crowds out the memory available to other services and reduces the memory usage efficiency of the entire system.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/02; G06N3/02
CPC: G06F12/023; G06N3/02
Inventor: 刘金鸽
Owner: ZHEJIANG DAHUA TECH CO LTD