
Network processor and network operation method

An operation method and processor technology, applied in the field of artificial intelligence, which can solve problems such as complex processor structure, wasted hardware resources, and large remaining room for optimization, and achieve the effects of reducing power consumption, simplifying multiplication operations, and speeding up operations.

Pending Publication Date: 2019-02-19
SHANGHAI CAMBRICON INFORMATION TECH CO LTD


Problems solved by technology

However, because a single-core neural network processor must be compatible with most neural network models, it has to support the existing neural networks of different types and neural network operations of different scales. This makes the structure of existing single-core neural network processors complex and expensive, and for small-scale, simply structured neural network operations and for simple neural network models such as spiking neural networks (SNN), it also wastes hardware resources and consumes excessive power. At the same time, a single-core neural network processor does not exploit the parallelism between different layers during neural network operation, leaving considerable room for optimization.



Examples


Specific example 1

[0187] A specific example is shown in Figure 1N. The weight data is 16-bit floating-point data with sign bit 0, power (exponent) bits 10101, and effective (fraction) bits 0110100000, so the actual value it represents is 1.40625 × 2^6. The power neuron data has a 1-bit sign and a 5-bit power field, i.e. m is 5. According to the coding table, when the power bits are 11111 the corresponding power neuron value is 0; for any other value, the power bits are read as the corresponding two's-complement number. If the power neuron is 000110, the actual value it represents is 2^6, i.e. 64. Adding the power bits of the weight to the power bits of the power neuron gives 11011, and the actual value of the result is 1.40625 × 2^12, which is the product of the neuron and the weight. Through this operation the multiplication becomes an addition, which reduces the amount of computation required.
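
To make the exponent-addition step concrete, the following is a minimal Python sketch of the arithmetic in this example. It is an illustration only: the function names and structure are assumptions, not the patent's implementation; the bit widths (1 sign bit and 5 power bits for the power neuron; 5 exponent bits, 10 fraction bits and bias 15 for the 16-bit floating-point weight) follow the numbers in paragraph [0187].

```python
# Illustrative sketch only: names and structure are assumptions, not the patent's design.

def decode_power_neuron(bits, m=5):
    """Decode a power neuron: 1 sign bit followed by an m-bit power field.
    Per the coding table, an all-ones power field encodes 0; any other power
    field is an m-bit two's-complement exponent, giving a value of +/- 2**power."""
    sign = (bits >> m) & 1
    power = bits & ((1 << m) - 1)
    if power == (1 << m) - 1:            # 11111 -> the value 0
        return None, 0.0
    if power & (1 << (m - 1)):           # negative two's-complement exponent
        power -= 1 << m
    return power, (-1.0 if sign else 1.0) * 2.0 ** power

def multiply_weight_by_power_neuron(w_sign, w_exp_bits, w_frac, neuron_bits,
                                    exp_bias=15, frac_bits=10, m=5):
    """Multiply a decomposed floating-point weight by a power neuron: the neuron's
    power is simply *added* to the weight's biased exponent field; the sign and
    fraction pass through unchanged."""
    n_power, _ = decode_power_neuron(neuron_bits, m)
    if n_power is None:                  # neuron encodes 0
        return 0.0
    n_sign = (neuron_bits >> m) & 1
    new_exp = w_exp_bits + n_power       # the multiplication becomes an addition
    mantissa = 1.0 + w_frac / 2.0 ** frac_bits
    sign = -1.0 if (w_sign ^ n_sign) else 1.0
    return sign * mantissa * 2.0 ** (new_exp - exp_bias)

# Example 1: float16 weight (sign 0, exponent 10101, fraction 0110100000),
# power neuron 000110 (= 2**6).
print(multiply_weight_by_power_neuron(0, 0b10101, 0b0110100000, 0b000110))
# -> 5760.0, i.e. 1.40625 * 2**12
```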

Specific example 2

[0188] Specific example 2 is shown in Figure 1O. The weight data is 32-bit floating-point data with sign bit 1, power (exponent) bits 10000011, and effective (fraction) bits 10000000000000000, so the actual value it represents is -1.5703125 × 2^4. The power neuron data has a 1-bit sign and a 5-bit power field, i.e. m is 5. According to the coding table, when the power bits are 11111 the corresponding power neuron value is 0; for any other value, the power bits are read as the corresponding two's-complement number. If the power neuron is 111100, the actual value it represents is -2^-4. Adding the power bits of the weight to the power bits of the power neuron gives 01111111, and the actual value of the result is 1.5703125 × 2^0, which is the product of the neuron and the weight.
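
As a quick standalone check of the arithmetic in this second example, using the decoded decimal values given in the text:

```python
# Standalone sanity check of the numbers in specific example 2.
weight = -1.5703125 * 2 ** 4     # float32 weight value decoded from its bit fields
neuron = -(2 ** -4)              # power neuron 111100

# Exponent-field addition: 10000011 (131) + (-4) = 01111111 (127), unbiased exponent 0.
assert 0b10000011 + (-4) == 0b01111111

product = weight * neuron
assert product == 1.5703125 * 2 ** 0   # signs cancel; the mantissa is carried through
print(product)                         # 1.5703125
```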

[0189] In step S1-3, the first power conversion unit converts the neuron data after the neural network operation into power neuron data.
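
The text does not spell out the conversion rule at this point; purely as an illustration, one plausible conversion rounds each neuron value to the nearest power of two and packs it into the sign-plus-m-bit two's-complement format used in the examples above (the function name, the rounding rule, and the clamping choices are all assumptions):

```python
import math

def to_power_neuron(x, m=5):
    """Illustrative conversion of an ordinary neuron value to power form:
    1 sign bit + m-bit two's-complement power, with the all-ones power field
    reserved for the value 0 (as in the coding table above)."""
    if x == 0.0:
        return (1 << m) - 1                       # power field 11111 -> 0
    sign = 1 if x < 0 else 0
    power = round(math.log2(abs(x)))              # nearest power of two (assumed rounding)
    power = max(-(1 << (m - 1)), min((1 << (m - 1)) - 1, power))   # clamp to m bits
    if power == -1:                               # -1 (11111) collides with the zero code;
        power = -2                                # nudging it is an arbitrary choice here
    return (sign << m) | (power & ((1 << m) - 1))

print(bin(to_power_neuron(64.0)))      # 0b110    -> sign 0, power 00110 (2**6)
print(bin(to_power_neuron(-0.0625)))   # 0b111100 -> sign 1, power 11100 (-2**-4)
```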

[0190] ...

Example 1

[0366] Suppose the data to be screened is the vector (1 0 101 34 243) and the components to be kept are those less than 100. The input position information is then also a vector, namely (1 1 0 1 0). The screened data can still retain the vector structure, and the vector length of the screened data can be output at the same time.

[0367] The position information vector can be input externally or generated internally. Optionally, the device of the present disclosure may further include a position information generation module for generating the position information vector, and this module is connected to the data screening unit. Specifically, the position information generation module may generate the position information vector through a vector operation, which may be a vector comparison operation, that is, the vector is obtained by comparing the components of the vector to be screened ...
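
A minimal sketch of this screening flow, assuming the "less than 100" comparison from the example above (all names are illustrative):

```python
# Illustrative sketch of the data-screening step in example 1.

def generate_position_info(data, threshold=100):
    """Vector comparison: 1 where the component satisfies the condition, else 0."""
    return [1 if x < threshold else 0 for x in data]

def screen(data, position_info):
    """Keep the components whose position flag is 1; also report the new length."""
    filtered = [x for x, keep in zip(data, position_info) if keep]
    return filtered, len(filtered)

data = [1, 0, 101, 34, 243]
mask = generate_position_info(data)      # [1, 1, 0, 1, 0]
print(screen(data, mask))                # ([1, 0, 34], 3)
```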



Abstract

The invention provides a network processor and a network operation method. The network processor comprises a memory, a high-speed temporary memory, and a heterogeneous core. The memory is used for storing the data and instructions of the neural network operation; the high-speed temporary memory is connected to the memory through a memory bus; the heterogeneous core is connected to the high-speed temporary memory through a high-speed temporary memory bus, reads the data and instructions of the neural network operation from the high-speed temporary memory, completes the neural network operation, sends the operation result back to the high-speed temporary memory, and controls the high-speed temporary memory to write the operation result back to the memory. The network processor and network operation method disclosed by the invention can reduce the power consumption overhead of network computation and can fully exploit the parallelism of the network, thereby reducing the cost of the network operation and improving its efficiency.
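
Purely as a reading aid, the data flow described in this abstract can be sketched as follows. Every class and method name here is an assumption made for illustration, and the "operation" is only a placeholder, not an actual neural network computation:

```python
# Simplified model of the abstract's data flow:
# memory -> high-speed temporary (scratchpad) memory -> heterogeneous core,
# with the result written back through the scratchpad to memory.

class Memory:
    def __init__(self):
        self.store = {}
    def read(self, key):
        return self.store[key]
    def write(self, key, value):
        self.store[key] = value

class Scratchpad:
    """Buffers data and instructions between the memory and the cores."""
    def __init__(self, memory):
        self.memory = memory
        self.buffer = {}
    def load(self, key):
        self.buffer[key] = self.memory.read(key)
        return self.buffer[key]
    def write_back(self, key):
        self.memory.write(key, self.buffer[key])

class HeterogeneousCore:
    """Stand-in for a set of cores of different types/scales."""
    def run(self, scratchpad, in_key, out_key):
        data = scratchpad.load(in_key)          # read data via the scratchpad
        scratchpad.buffer[out_key] = sum(data)  # placeholder computation
        scratchpad.write_back(out_key)          # write the result back to memory

mem = Memory()
mem.write("layer0_input", [1.0, 2.0, 3.0])
HeterogeneousCore().run(Scratchpad(mem), "layer0_input", "layer0_output")
print(mem.read("layer0_output"))                # 6.0
```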

Description

[0001] This disclosure is a divisional application of Chinese patent application No. 201880001242.9, the contents of which are incorporated herein by reference.

Technical field

[0002] The present disclosure relates to the technical field of artificial intelligence, and more specifically to a network processor and a network operation method.

Background

[0003] An artificial neural network (ANN) abstracts the neuron network of the human brain from the perspective of information processing, establishes a simple model, and forms different networks according to different connection methods. At present, artificial neural networks have made great progress in many fields and are widely used to solve practical problems in pattern recognition, intelligent robotics, automatic control, predictive estimation, biology, medicine, economics and other fields.

[0004] As a new type of special-purpose processor, the single-core neural network processor uses special instructions...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/063
CPC: G06N3/063; G06F17/16; G06T1/20; G06T3/4046; G06N3/048; Y02D10/00; G06F16/162; G06F9/30025; G06F9/30083; G06F9/3802
Inventor: Not announced (不公告发明人)
Owner: SHANGHAI CAMBRICON INFORMATION TECH CO LTD