Memory management method, device and storage medium

A memory management technology in the computer field that addresses the problems of neural networks occupying large amounts of memory on terminals with limited memory, achieving memory reuse, reduced memory requirements, and expanded functionality.

Active Publication Date: 2022-02-01
TENCENT TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0003] Deep learning relies on the implementation of neural networks, which are traditionally implemented on the server side. However, with the continuous development of AI algorithms, server-side implementation can no longer meet people's growing business needs, so a way to implement neural networks on the terminal has been proposed.
[0004] However, a neural network includes multiple network layers, and each network layer includes multiple feature units. Each feature unit needs its own block of memory to hold its computed output data, so implementing the neural network requires a large amount of memory. Compared with a server, a terminal has limited memory and cannot meet this requirement, making it difficult to implement the neural network on the terminal.




Embodiment Construction

[0024] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0025] Before describing the embodiments of the present invention in detail, the neural network involved in the embodiments is first described as follows:

[0026] Figure 1 is a schematic structural diagram of a neural network provided by an embodiment of the present invention. Referring to Figure 1, the neural network includes a plurality of network layers, each network layer includes at least one feature unit, and the feature un...
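Although the paragraph above is truncated, the abstract states that branches are determined from the connection relationships of the feature units. As a purely illustrative sketch (not the patent's actual algorithm; the edge-list graph representation, the function name `find_branches`, and the node labels are all assumptions), a branch can be read as a maximal linear chain of units between fork and merge points:

```python
from collections import defaultdict

def find_branches(edges):
    """edges: list of (src, dst) pairs describing unit connections.

    Returns branches as lists of units; a branch ends wherever a unit
    has more than one successor or its successor has more than one
    predecessor (a fork or merge point in the network).
    """
    succ = defaultdict(list)
    pred = defaultdict(list)
    nodes = set()
    for a, b in edges:
        succ[a].append(b)
        pred[b].append(a)
        nodes.update((a, b))

    # A branch starts at any node that is not the unique successor of a
    # unique predecessor: sources, fork outputs, and merge targets.
    def starts_branch(n):
        return len(pred[n]) != 1 or len(succ[pred[n][0]]) != 1

    branches = []
    for n in sorted(nodes):
        if not starts_branch(n):
            continue
        chain = [n]
        cur = n
        # Extend the chain while the connection stays strictly linear.
        while len(succ[cur]) == 1 and len(pred[succ[cur][0]]) == 1:
            cur = succ[cur][0]
            chain.append(cur)
        branches.append(chain)
    return branches
```

On a purely sequential network this yields a single branch; at a fork such as a residual block, each path becomes its own branch, so each can receive its own pair of buffers.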



Abstract

The embodiments of the present invention disclose a memory management method, device and storage medium, belonging to the technical field of computers. The method includes: determining at least one branch of the neural network according to the connection relationships of the feature units in the neural network; for each branch, allocating a first memory and a second memory for the branch according to the output memory size required by each feature unit on the branch, where the first memory size is not smaller than the second memory size, and neither the first nor the second memory size is smaller than any other memory size required by the branch; and using the first memory and the second memory in turn as the input memory and output memory of the feature units on the branch. By allocating only two memories for each branch and alternating them as input and output memory, the embodiments not only ensure that computation proceeds normally but also realize memory reuse, saving occupied memory and reducing memory requirements, so that the neural network can be implemented normally on the terminal.
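The two-buffer scheme described in the abstract can be sketched as follows. This is a hedged illustration rather than the patented implementation: the function names (`plan_branch_buffers`, `run_branch`) and the list-based stand-in for memory are assumptions, but the sizing rule (first buffer no smaller than the second, neither smaller than any other requirement on the branch) and the alternating input/output roles follow the abstract:

```python
def plan_branch_buffers(output_sizes):
    """Given the output-memory size each feature unit on a branch needs,
    return (first, second): the two largest requirements, so that either
    buffer can hold any unit's output regardless of which role it plays."""
    if not output_sizes:
        return 0, 0
    ranked = sorted(output_sizes, reverse=True)
    return ranked[0], (ranked[1] if len(ranked) > 1 else ranked[0])


def run_branch(units, input_data, first_buf, second_buf):
    """Run the branch using only two buffers, alternating them as the
    input memory and output memory of each feature unit in turn."""
    src, dst = first_buf, second_buf
    src[: len(input_data)] = input_data
    length = len(input_data)
    for unit in units:
        # Each unit reads `length` elements from src, writes its output
        # into dst, and returns how many elements it produced.
        length = unit(src, length, dst)
        src, dst = dst, src  # swap roles: the last output becomes the next input
    return src[:length]
```

For example, a branch whose units need outputs of sizes 3, 5 and 4 gets just two buffers of sizes 5 and 4, instead of one buffer per unit, which is the memory reuse the abstract claims.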

Description

Technical Field

[0001] The embodiments of the present invention relate to the technical field of computers, and in particular to a memory management method, device and storage medium.

Background

[0002] In recent years, deep learning has been widely used in fields such as speech recognition and computer vision. With its rapid development, AI (Artificial Intelligence) algorithms are constantly emerging and are rapidly changing the direction of technological development and people's lives.

[0003] Deep learning relies on the implementation of neural networks, which are traditionally implemented on the server side. However, with the continuous development of AI algorithms, server-side implementation can no longer meet people's growing business needs, so a way to implement neural networks on the terminal has been proposed.

[0004] However, the neural network includes multiple network l...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/50, G06N3/04
CPC: G06F9/5016, G06N3/045
Inventors: 黄凯宁, 朱晓龙, 梅利健, 黄生辉, 王一同, 罗镜民
Owner TENCENT TECH (SHENZHEN) CO LTD