
Data management method, neural network processor and terminal equipment

A neural network and processor technology, applied to data management, neural network processors and terminal equipment, which can solve the problems of a long data-caching process and slow calculation speed during neural network computation.

Pending Publication Date: 2020-10-20
SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD
Cites: 0 | Cited by: 7

AI Technical Summary

Problems solved by technology

[0005] The embodiments of the present application provide a data management method, a neural network processor and a terminal device, which can alleviate the problems of time-consuming data caching and slow calculation speed during neural network computation.



Examples


Detailed Description of the Embodiments

[0023] The data management method provided by the embodiments of the present application can be applied to the neural network processors of terminal devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR) / virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks and personal digital assistants (PDA). The embodiments of the present application do not impose any limitation on the specific type of terminal device.

[0024] It should be understood that the terms "first", "second" and the various numerals mentioned herein are used only for convenience of description and are not intended to limit the scope of the present application.

[0025] It should also be understood that the term "and/or" herein merely describes an association relationship between associated objects, indicating that there may be...



Abstract

The invention is applicable to the technical field of neural networks, and provides a data management method, a neural network processor and terminal equipment. The neural network processor to which the method is applied comprises a task manager, a memory unit, a controller, a cache space, a computing unit, a first DMA for writing data into the cache space, and a second DMA for reading data from the cache space. The method comprises the steps that the task manager configures the working parameters of each DMA according to a data caching strategy, wherein the working parameters comprise a data storage mode, the size of the access space, an access start address and an access end address; and each DMA and the controller jointly control the transmission and storage of the operation data among the memory unit, the cache space and the computing unit according to the working parameters. According to the method provided by the embodiments of the invention, the time spent by the neural network processor on caching the operation data can be reduced, and the calculation rate of the neural network can be improved.
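
As a rough illustration of the working-parameter configuration the abstract describes, the C sketch below shows how a task manager might fill in per-DMA parameters (data storage mode, size of the access space, access start address, access end address) from a data-caching strategy. All type names, field names and the split between the two DMAs' regions are illustrative assumptions, not details taken from the patent.

#include <stddef.h>
#include <stdint.h>

/* Hypothetical working parameters for one DMA channel, mirroring the four
 * fields named in the abstract: data storage mode, size of the access
 * space, access start address and access end address. */
typedef enum { STORE_LINEAR = 0, STORE_PING_PONG = 1 } storage_mode_t;

typedef struct {
    storage_mode_t mode;   /* data storage mode                  */
    size_t         size;   /* size of the access space, in bytes */
    uint32_t       start;  /* access start address in the cache  */
    uint32_t       end;    /* access end address in the cache    */
} dma_params_t;

/* Hypothetical data-caching strategy: how the shared cache space is split
 * between the region the first DMA writes into and the region the second
 * DMA reads from.  The split is an illustrative assumption. */
typedef struct {
    uint32_t cache_base;
    size_t   cache_size;
    size_t   write_bytes;  /* bytes reserved for data written by DMA 1 */
} cache_strategy_t;

/* Task-manager step: derive the working parameters of both DMAs from the
 * caching strategy.  dma1 writes operation data into the cache space,
 * dma2 reads operation data out of it for the computing unit. */
static void configure_dmas(const cache_strategy_t *s,
                           dma_params_t *dma1, dma_params_t *dma2)
{
    dma1->mode  = STORE_PING_PONG;
    dma1->size  = s->write_bytes;
    dma1->start = s->cache_base;
    dma1->end   = s->cache_base + (uint32_t)s->write_bytes;

    dma2->mode  = STORE_LINEAR;
    dma2->size  = s->cache_size - s->write_bytes;
    dma2->start = dma1->end;
    dma2->end   = s->cache_base + (uint32_t)s->cache_size;
}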

Description

Technical Field

[0001] The present application belongs to the technical field of neural networks, and in particular relates to a data management method, a neural network processor and terminal equipment.

Background Technique

[0002] In a neural network processor, a cache space, such as static random access memory (SRAM), is provided between the memory unit that stores the operation data and the computing unit. [0003] The cache space is used to cache the operation data. [0004] The cache space is configured with an input storage area, a weight storage area and a result storage area, so that different kinds of operation data, namely input data, weight data and result data, can be stored at the same time. At present, the size and storage mode of each storage area are usually fixed, but as the neural network calculation proceeds, the structure of the operation data keeps changing; for example, the input data gradually decreases while the weight data gradually increases, making the weight storage a...
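
To make the background concrete, the hypothetical C fragment below lays out the fixed three-area cache partition described in paragraph [0004] and a check for whether one layer's data fits it. The capacity figures, macro names and the layer_fits helper are illustrative assumptions, not values from the application; they only show why a static split becomes a bottleneck as the weight data grows.

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical static split of an on-chip SRAM cache into the three storage
 * areas named in [0004]: input data, weight data and result data.  All
 * sizes are assumed for illustration. */
#define CACHE_BASE        0x00000000u
#define CACHE_SIZE        (512u * 1024u)   /* assumed 512 KiB of SRAM */
#define INPUT_AREA_SIZE   (256u * 1024u)
#define WEIGHT_AREA_SIZE  (192u * 1024u)
#define RESULT_AREA_SIZE  (CACHE_SIZE - INPUT_AREA_SIZE - WEIGHT_AREA_SIZE)

#define INPUT_AREA_BASE   CACHE_BASE
#define WEIGHT_AREA_BASE  (INPUT_AREA_BASE + INPUT_AREA_SIZE)
#define RESULT_AREA_BASE  (WEIGHT_AREA_BASE + WEIGHT_AREA_SIZE)

/* With a fixed split, a layer only fits if each kind of operation data fits
 * its own area; a later layer whose weights exceed WEIGHT_AREA_SIZE cannot
 * be cached in one pass even when the input area is largely empty, which is
 * the mismatch the application points out. */
static bool layer_fits(size_t input_bytes, size_t weight_bytes,
                       size_t result_bytes)
{
    return input_bytes  <= INPUT_AREA_SIZE
        && weight_bytes <= WEIGHT_AREA_SIZE
        && result_bytes <= RESULT_AREA_SIZE;
}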


Application Information

IPC (8): G06F12/0802; G06F12/02; G11C11/41; G11C11/413; G06N3/063; G06N3/04; G06F3/06
CPC: G06F12/0802; G11C11/41; G11C11/413; G06F12/023; G06N3/063; G06F3/0604; G06N3/045
Inventor: 曹庆新, 李炜
Owner: SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD