
Neural network processor, current neural network data multiplexing method and related apparatus

A neural network processor, a convolutional neural network data multiplexing method, and related apparatus (a convolutional neural network data multiplexing device, electronic equipment, and a storage medium), applied in the field of neural network technology. The invention solves problems such as high power consumption and slow operation speed, achieving the effects of improved efficiency, reduced power consumption, and a reduced number of data accesses.

Active Publication Date: 2019-05-10
SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD

AI Technical Summary

Problems solved by technology

[0002] One of the most commonly used models on neural network processors is the convolutional neural network model. However, the convolutional neural network model suffers from problems such as slow operation speed and high power consumption.

Method used



Examples


Embodiment 1

[0056] Referring to Figure 1 and Figure 2, shown is a schematic diagram of a neural network processor provided by an embodiment of the present invention.

[0057] In this embodiment, the neural network processor 1 may include a storage circuit 10 and at least one calculation circuit 20, where each calculation circuit 20 is connected to the storage circuit 10. The neural network processor 1 may be a programmable logic device, such as a field-programmable gate array (FPGA), or a dedicated neural network processor implemented as an application-specific integrated circuit (ASIC).

[0058] The number of calculation circuits 20 can be set according to the actual situation; the required number can be determined by weighing the total amount of computation against the amount each calculation circuit can handle. For example, Figure 1 shows two calculation circuits 20 operating in parallel.
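The structure described in paragraphs [0057]-[0058] can be sketched in a few lines of Python. This is an illustrative model only, not the patent's implementation; all class and attribute names (`StorageCircuit`, `CalculationCircuit`, `data_buffer`, `weight_buffer`) are assumptions chosen to mirror the patent's terminology:

```python
# Hypothetical sketch: one shared storage circuit connected to N parallel
# calculation circuits, each with local data and weight buffers.
class StorageCircuit:
    """Holds the initial input data and weight values for the convolution."""
    def __init__(self, input_data, weights):
        self.input_data = input_data
        self.weights = weights

class CalculationCircuit:
    """Connected to the storage circuit; caches data and weights locally."""
    def __init__(self, storage):
        self.storage = storage    # connection to the shared storage circuit
        self.data_buffer = None   # caches the initial input data
        self.weight_buffer = None # caches the weight values

    def load(self):
        # Fetch once from shared storage; later reuse comes from the local
        # buffers, reducing the number of storage accesses.
        self.data_buffer = self.storage.input_data
        self.weight_buffer = self.storage.weights

# Two calculation circuits in parallel, as in Figure 1.
storage = StorageCircuit(input_data=[1, 2, 3], weights=[0.5, -0.5])
circuits = [CalculationCircuit(storage) for _ in range(2)]
for c in circuits:
    c.load()
```

Because both circuits hold a reference to the same `StorageCircuit`, the input data and weights exist in one place and are merely cached per circuit, which is the sharing arrangement the embodiment describes.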

[0059] In this emb...

Embodiment 2

[0107] Figure 4 is a flow chart of the convolutional neural network data multiplexing method provided by Embodiment 2 of the present invention.

[0108] The convolutional neural network data multiplexing method can be applied to mobile or fixed electronic devices, including but not limited to personal computers, smart phones, tablet computers, desktop computers, and all-in-one computers equipped with cameras. The electronic device stores the initial input data and the weight values configured by the user for convolution operations in the storage circuit 10, and controls at least one calculation circuit 20 to read the initial input data and weight values from the storage circuit 10 and perform the convolution operation based on them. Since the initial input data and weight values required for the convolution operation are stored uniformly in the storage circuit 10, when there are multi...
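The multiplexing idea in paragraph [0108] can be illustrated with a small counting sketch. This is an assumed toy model (the `CountingStorage` class and all names are hypothetical): data is written once to shared storage, read a single time into a calculation circuit's local buffers, and every convolution window thereafter reuses the buffered copies, so storage accesses do not grow with the amount of computation:

```python
# Hypothetical sketch of data multiplexing via a shared storage circuit.
class CountingStorage:
    """Shared storage that counts how many times it is read."""
    def __init__(self):
        self.data = {}
        self.reads = 0

    def write(self, key, value):
        self.data[key] = value

    def read(self, key):
        self.reads += 1
        return self.data[key]

storage = CountingStorage()
storage.write("input", [1.0, 2.0, 3.0, 4.0, 5.0])
storage.write("weights", [0.5, 0.5])

# One storage read per tensor, cached in the circuit's local buffers.
data_buffer = storage.read("input")
weight_buffer = storage.read("weights")

# Four sliding-window 1-D convolutions, all served from the local buffers:
# no further storage.read() calls are needed.
results = [sum(d * w for d, w in zip(data_buffer[i:i + 2], weight_buffer))
           for i in range(4)]

print(storage.reads)  # still 2: the computation reused the buffered data
```

The read counter stays at 2 regardless of how many windows are convolved, which is the access-count reduction the method claims as its power-saving effect.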

Embodiment 3

[0167] Referring to Figure 5, shown is a functional block diagram of a preferred embodiment of the convolutional neural network data multiplexing device of the present invention.

[0168] In some embodiments, the convolutional neural network data multiplexing device 50 runs in an electronic device. The convolutional neural network data multiplexing device 50 may include a plurality of functional modules composed of program code segments. The program code of each segment can be stored in the memory of the electronic device and executed by at least one processor to perform the convolutional neural network data multiplexing (described in detail with reference to Figure 4).

[0169] In this embodiment, the convolutional neural network data multiplexing device 50 can be divided into multiple functional modules according to the functions it performs. The functional modules may include: a storage module 501, a convolut...



Abstract

A neural network processor includes a storage circuit that stores the initial input data and weight values required for a convolution operation, and at least one calculation circuit comprising: a data buffer that caches the initial input data; a weight buffer that caches the weight values; a convolution arithmetic unit that performs the convolution operation in the current layer of the convolutional neural network according to the initial input data and the weight values to obtain a plurality of first convolution results, accumulates the corresponding first convolution results to obtain a plurality of second convolution results, and then deletes the first convolution results; and a result buffer that caches the plurality of second convolution results as the initial input data of the next layer of the convolutional neural network. The invention further provides a convolutional neural network data multiplexing method and device, electronic equipment, and a storage medium. Through multi-level data multiplexing, the invention improves the operation speed of the neural network processor and reduces its power consumption.
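The abstract's accumulation scheme can be sketched as follows. This is a minimal assumed model, not the patented hardware: per-kernel "first convolution results" are accumulated element-wise into "second convolution results", the first results are then discarded to free buffer space, and the second results become the next layer's input (the 1-D convolution and all function names are illustrative simplifications):

```python
# Minimal sketch of: first results -> accumulate -> second results -> delete
# first results -> cache second results as the next layer's input.
def conv1d(data, kernel):
    """Valid 1-D convolution: yields one 'first convolution result' stream."""
    k = len(kernel)
    return [sum(data[i + j] * kernel[j] for j in range(k))
            for i in range(len(data) - k + 1)]

def layer(input_data, kernels):
    # First convolution results: one sequence per kernel.
    first_results = [conv1d(input_data, k) for k in kernels]
    # Second convolution results: accumulate corresponding first results.
    second_results = [sum(vals) for vals in zip(*first_results)]
    # First results are deleted once accumulated, freeing buffer space.
    del first_results
    return second_results  # cached as the next layer's initial input data

out1 = layer([1, 2, 3, 4], [[1, 0], [0, 1]])  # -> [3, 5, 7]
out2 = layer(out1, [[1, -1]])                 # next layer reuses out1 directly
```

Keeping the accumulated layer output in a result buffer, rather than writing it back to main storage and re-reading it, is the multi-level reuse the abstract credits for the speed and power improvements.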

Description

Technical Field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a neural network processor, a convolutional neural network data multiplexing method, a convolutional neural network data multiplexing device, electronic equipment, and a storage medium.

Background

[0002] One of the most commonly used models on neural network processors is the convolutional neural network model. However, the convolutional neural network model suffers from problems such as slow operation speed and high power consumption. Therefore, how to improve the operation speed of the convolutional neural network model on the neural network processor while reducing power consumption has become an urgent technical problem.

Contents of the Invention

[0003] In view of the above, it is necessary to propose a neural network processor, a method for multiplexing convolutional neural network data, a device for multiplexing c...

Claims


Application Information

IPC(8): G06N3/04, G06N3/063
CPC: G06N3/04, G06N3/063
Inventor: 李炜, 曹庆新
Owner SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD