Deep neural network computing system and method

A deep neural network computing system technology, applied to biological neural network models and their physical implementation, which can solve problems such as high power consumption and a low energy-efficiency ratio

Inactive Publication Date: 2017-11-10
ZHENGZHOU YUNHAI INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] At present, deep neural networks are usually accelerated with GPUs. The highly optimized cuDNN computing library and the high-performance parallel processing architecture of the GPU give deep neural networks excellent acceleration performance on the GPU platform; however, the GPU's high power consumption means its energy-efficiency ratio is very low, which is a significant disadvantage.

Embodiment Construction

[0029] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of those embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0030] An embodiment of the present invention discloses a deep neural network computing system. As shown in Figure 1, the system includes:

[0031] a CPU 11, configured to receive target data, process it with the deep neural network, and obtain the input layer data of the deep neural network.

[0032] Specifically, the CPU 11 receives the target data input by a user. The target data may take the form of a piece of code or a computing task. The CPU 11 converts the tar...
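To make the division of work in paragraphs [0030]-[0032] concrete, here is a minimal, self-contained C++ sketch. It is an illustration only: fpga_gemm_node, the layer sizes, and the sample data are hypothetical stand-ins, and a plain CPU loop plays the role of the GEMM calculation node that the patent places on the FPGA, so the sketch compiles and runs anywhere.

// Illustrative sketch only, not the patented implementation. The CPU prepares
// the input-layer matrix from the target data and hands the GEMM off to a
// stand-in for the FPGA-resident GEMM calculation node.
#include <cstddef>
#include <iostream>
#include <vector>

// Stand-in for the FPGA GEMM node: C(MxN) = A(MxK) * B(KxN).
// In the described system this computation runs on the FPGA.
void fpga_gemm_node(const std::vector<float>& A, const std::vector<float>& B,
                    std::vector<float>& C,
                    std::size_t M, std::size_t K, std::size_t N) {
  for (std::size_t i = 0; i < M; ++i)
    for (std::size_t j = 0; j < N; ++j) {
      float acc = 0.0f;
      for (std::size_t k = 0; k < K; ++k)
        acc += A[i * K + k] * B[k * N + j];
      C[i * N + j] = acc;
    }
}

int main() {
  // Hypothetical target data: one sample with four features, which the CPU
  // converts into the input-layer matrix A (1 x 4) of the network.
  const std::size_t M = 1, K = 4, N = 3;
  std::vector<float> A = {0.5f, -1.0f, 2.0f, 0.25f};

  // Hypothetical weights of the first layer, B (4 x 3).
  std::vector<float> B = {0.1f, 0.2f, 0.3f,
                          0.4f, 0.5f, 0.6f,
                          0.7f, 0.8f, 0.9f,
                          1.0f, 1.1f, 1.2f};

  // The CPU dispatches the GEMM (the heavy part of the layer) to the node
  // and receives the result C (1 x 3) back for further processing.
  std::vector<float> C(M * N, 0.0f);
  fpga_gemm_node(A, B, C, M, K, N);

  for (float v : C) std::cout << v << ' ';
  std::cout << '\n';
  return 0;
}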

Abstract

This application discloses a deep neural network computing system and method. The system consists of a CPU, which receives target data and processes it with a deep neural network to obtain the network's input layer data, and an FPGA, which performs the calculation step using the GEMM calculation node and a parallel operation program of the deep neural network, computing the input layer data to obtain the calculation result. In the invention, the GEMM computing node of the deep neural network is transplanted to the FPGA. The CPU receives the target data input by a user, transforms it into the input layer data of the deep neural network, and sends the transformed data to the FPGA; the FPGA then performs the calculation step using the GEMM calculation node and the parallel operation program, computing the input layer data to obtain the calculation result and thereby completing the computation. Because the calculation step is carried out on the FPGA and exploits its hardware characteristics, the energy consumed by the computation is reduced substantially and the operating cost is lowered.
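The application does not specify the programming interface used to drive the FPGA. As an illustration only, the sketch below shows one common way to express such a GEMM offload with standard OpenCL (discussed in the Background section): the host prepares the input-layer data and weights, and the device runs a parallel kernel that computes C = A x B with one work-item per output element. All sizes and data are hypothetical, error checking is omitted, and a real FPGA flow would normally load an offline-compiled kernel binary rather than build from source at run time.

// Hedged OpenCL sketch of offloading the GEMM calculation node to an
// accelerator (GPU or FPGA). Illustration only, not the patented design.
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Kernel: one work-item per element of C; C(MxN) = A(MxK) * B(KxN).
static const char* kGemmKernel = R"CLC(
__kernel void gemm(__global const float* A, __global const float* B,
                   __global float* C, const int M, const int K, const int N) {
  int i = get_global_id(0);   // row of C
  int j = get_global_id(1);   // column of C
  if (i < M && j < N) {
    float acc = 0.0f;
    for (int k = 0; k < K; ++k)
      acc += A[i * K + k] * B[k * N + j];
    C[i * N + j] = acc;
  }
}
)CLC";

int main() {
  const int M = 2, K = 4, N = 3;
  std::vector<float> A(M * K, 1.0f);   // input-layer data prepared by the CPU
  std::vector<float> B(K * N, 0.5f);   // layer weights
  std::vector<float> C(M * N, 0.0f);   // result computed on the device

  cl_platform_id platform; cl_device_id device;
  clGetPlatformIDs(1, &platform, nullptr);
  clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, nullptr);
  cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
  cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

  // FPGA toolchains typically replace this source build with a prebuilt binary.
  cl_program prog = clCreateProgramWithSource(ctx, 1, &kGemmKernel, nullptr, nullptr);
  clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
  cl_kernel kernel = clCreateKernel(prog, "gemm", nullptr);

  // Device buffers: inputs are copied in, C is read back after the run.
  cl_mem dA = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                             A.size() * sizeof(float), A.data(), nullptr);
  cl_mem dB = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                             B.size() * sizeof(float), B.data(), nullptr);
  cl_mem dC = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY,
                             C.size() * sizeof(float), nullptr, nullptr);

  clSetKernelArg(kernel, 0, sizeof(cl_mem), &dA);
  clSetKernelArg(kernel, 1, sizeof(cl_mem), &dB);
  clSetKernelArg(kernel, 2, sizeof(cl_mem), &dC);
  clSetKernelArg(kernel, 3, sizeof(int), &M);
  clSetKernelArg(kernel, 4, sizeof(int), &K);
  clSetKernelArg(kernel, 5, sizeof(int), &N);

  // Launch one work-item per output element and read the result back.
  size_t global[2] = {static_cast<size_t>(M), static_cast<size_t>(N)};
  clEnqueueNDRangeKernel(queue, kernel, 2, nullptr, global, nullptr, 0, nullptr, nullptr);
  clEnqueueReadBuffer(queue, dC, CL_TRUE, 0, C.size() * sizeof(float), C.data(),
                      0, nullptr, nullptr);
  clFinish(queue);

  for (float v : C) std::printf("%g ", v);
  std::printf("\n");

  clReleaseMemObject(dA); clReleaseMemObject(dB); clReleaseMemObject(dC);
  clReleaseKernel(kernel); clReleaseProgram(prog);
  clReleaseCommandQueue(queue); clReleaseContext(ctx);
  return 0;
}

On an actual FPGA, such a kernel would usually be restructured with vendor-specific pragmas (pipelining, tiling, on-chip buffering) to reach good throughput; the portable source above only shows the offload pattern.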

Description

technical field

[0001] The invention relates to the field of accelerated computing for deep neural networks, and in particular to a deep neural network computing system and method.

Background technique

[0002] The development of GPU general-purpose computing technology has attracted considerable attention in the industry. In practice, for workloads such as floating-point operations and parallel computing, GPUs can deliver tens or even hundreds of times the performance of CPUs. Current standards for GPU general-purpose computing include OpenCL (Open Computing Language), CUDA (Compute Unified Device Architecture), and ATI STREAM. Among them, OpenCL is the first open, free standard for general-purpose parallel programming of heterogeneous systems, and a unified programming environment that enables software developers to write efficient, portable code for high-performance computing servers, desktop computing systems, and handh...

Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/063
CPC: G06N3/063
Inventor: 李磊
Owner: ZHENGZHOU YUNHAI INFORMATION TECH CO LTD