
Device, board card and method for executing neural network calculation and readable storage medium

A neural network and execution file technology, applied in the field of neural networks, which addresses problems such as high resource consumption and delayed computing time, and achieves the effect of reducing input/output overhead.

Pending Publication Date: 2022-04-15
CAMBRICON TECH CO LTD

AI Technical Summary

Problems solved by technology

[0005] Both the large number of layers and the large number of parameters require many on-chip and off-chip I/O accesses, which consumes considerable resources and delays computing time.
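
As a rough, hedged illustration of this overhead (all layer sizes and the FP32 assumption below are chosen for illustration and are not taken from the disclosure), the following sketch estimates how much weight data a modest stack of convolution layers must move between off-chip memory and the chip:

# Hedged back-of-the-envelope sketch (not from the disclosure): estimates the
# volume of weight data a stack of convolution layers occupies, and therefore
# how much must be fetched from off-chip memory when it does not fit on chip.

BYTES_PER_WEIGHT = 4  # assume FP32 parameters

# (in_channels, out_channels, kernel_h, kernel_w) for each assumed layer
layers = [(3, 64, 3, 3)] + [(64, 64, 3, 3)] * 15  # a hypothetical 16-layer stack

total_weights = sum(cin * cout * kh * kw for cin, cout, kh, kw in layers)
total_bytes = total_weights * BYTES_PER_WEIGHT

print(f"weights: {total_weights:,}  (~{total_bytes / 2**20:.1f} MiB)")
# If on-chip memory holds only a few MiB, these weights must be re-fetched
# from off-chip memory for every inference; that repeated traffic is the
# input/output overhead the disclosure aims to reduce.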




Embodiment Construction

[0049] The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings in those embodiments. The described embodiments are only some of the embodiments of the present disclosure, not all of them. Based on the embodiments in the present disclosure, all other embodiments obtained by those skilled in the art without creative effort fall within the protection scope of the present disclosure.

[0050] It should be understood that the terms "first", "second", "third" and "fourth" in the claims, specification and drawings of the present disclosure are used to distinguish different objects rather than to describe a specific order. The terms "comprising" and "comprises" used in the specification and claims of this disclosure indicate the presence of the described features, integers, steps, operations, elements and/or components, but do not exclude one or more other features...



Abstract

The present disclosure relates to a device, a board card, a method and a readable storage medium for performing neural network computing. The computing device of the present disclosure is included in an integrated circuit device that also includes a universal interconnection interface and other processing devices. The computing device interacts with the other processing devices to jointly complete the computing operation specified by the user. The integrated circuit device may further comprise a storage device, which is connected to the computing device and the other processing devices and is used for data storage of the computing device and the other processing devices.
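
To make the division of roles concrete, below is a minimal structural sketch of the arrangement described above. All class and method names (StorageDevice, ComputingDevice, OtherProcessingDevice, submit, run) are hypothetical illustrations rather than the API of the disclosure, and the universal interconnection interface is modeled simply as direct method calls.

# Minimal structural sketch of the integrated circuit device described in the
# abstract. Names are hypothetical; the interconnect is modeled as plain calls.

class StorageDevice:
    """Shared storage used by both the computing device and other processors."""
    def __init__(self):
        self.data = {}

    def write(self, key, value):
        self.data[key] = value

    def read(self, key):
        return self.data[key]


class ComputingDevice:
    """Accelerator that executes the neural network computation."""
    def __init__(self, storage: StorageDevice):
        self.storage = storage

    def run(self, task_key):
        inputs = self.storage.read(task_key)
        result = [x * 2 for x in inputs]        # stand-in for the NN computation
        self.storage.write(task_key + "_out", result)


class OtherProcessingDevice:
    """Host processor that dispatches the user-specified operation to the
    computing device over the interconnection interface (direct calls here)."""
    def __init__(self, computing_device: ComputingDevice, storage: StorageDevice):
        self.computing_device = computing_device
        self.storage = storage

    def submit(self, key, inputs):
        self.storage.write(key, inputs)          # stage inputs in shared storage
        self.computing_device.run(key)           # jointly complete the operation
        return self.storage.read(key + "_out")


storage = StorageDevice()
host = OtherProcessingDevice(ComputingDevice(storage), storage)
print(host.submit("task0", [1, 2, 3]))           # -> [2, 4, 6]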

Description

Technical field

[0001] The present disclosure relates generally to the field of neural networks. More specifically, the present disclosure relates to a device, a board card, a method, and a readable storage medium for performing neural network calculations.

Background technique

[0002] A neural network is a system of multiple neurons connected according to certain rules. It is roughly composed of four kinds of layers: an input layer, convolution layers, pooling layers, and a fully connected layer.

[0003] The input layer intercepts part of the information from the input data and converts it into a feature matrix that contains the features corresponding to that part of the information. The convolution layer is configured to receive the feature matrix from the input layer and to perform feature extraction on the input data through a convolution operation. In practice, multiple convolution layers can be stacked. The pooling ...
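
As a brief illustration of the convolution and pooling steps described above (a generic sketch, not code from the disclosure; the 8x8 input and 3x3 kernel are arbitrary choices), a single-channel 2-D convolution followed by 2x2 max pooling can be written as:

import numpy as np

# Illustrative sketch: convolution-based feature extraction as performed by a
# convolution layer, followed by the downsampling performed by a pooling layer.

def conv2d(feature_map: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    kh, kw = kernel.shape
    h, w = feature_map.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(feature_map[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2x2(x: np.ndarray) -> np.ndarray:
    h, w = x.shape
    x = x[:h - h % 2, :w - w % 2]                # trim to even dimensions
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

image = np.random.rand(8, 8)                     # feature matrix from the input layer
edge_kernel = np.array([[1., 0., -1.],
                        [1., 0., -1.],
                        [1., 0., -1.]])          # one assumed 3x3 filter
features = conv2d(image, edge_kernel)            # convolution layer: feature extraction
pooled = max_pool2x2(features)                   # pooling layer: downsampling
print(features.shape, pooled.shape)              # (6, 6) (3, 3)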


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/063; G06N3/04; G06N3/08; G06K9/62; G06V10/80; G06V10/82
Inventor: Not disclosed
Owner: CAMBRICON TECH CO LTD