
Data processing module, data processing system and data processing method

A technology relating to data processing modules and processing facilities, applied in the field of mapping neural units and their synapses, which can solve problems such as increased power consumption and wasted resources.

Pending Publication Date: 2021-07-30
格雷玛特实验室股份有限公司

AI Technical Summary

Problems solved by technology

In either case, this results in wasted resources and increased power consumption

Method used


Examples


Embodiment approach

[0069] The input synapse ID is represented by the address index in the memory unit itself (no memory bits are used to store this information). Each addressable entry in this memory unit corresponds to a particular synapse. The depth of the memory unit is a1.

[0070] In the example shown, the field Neural Unit ID contains an identifier for the neuron. The required size b1 of this field is the base-2 logarithm of the number of neurons (e.g., for a data processing module with 256 neurons this field would be 8 bits).

[0071] The second field contains a value representing the synapse weight assigned to the synapse. The number of bits b2 of this field can be smaller or larger depending on the desired granularity with which synaptic weights are specified. In this example, the field is 32 bits wide.
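To make the memory layout of paragraphs [0069]-[0071] concrete, the following is a minimal C sketch of one addressable entry of the input synapse memory unit. The identifiers (input_synapse_entry_t, NUM_NEURONS, WEIGHT_BITS, field_width_bits) are illustrative assumptions, not taken from the patent; only the field sizes follow the text (b1 = base-2 logarithm of the neuron count, b2 = 32 bits in the example), and the synapse ID itself costs no bits because it is the entry's address.

```c
/* Sketch of the input synapse memory layout from [0069]-[0071].
 * All identifiers here are assumptions made for illustration. */
#include <stdint.h>
#include <stdio.h>

#define NUM_NEURONS 256u   /* example from [0070]: 256 neurons   */
#define WEIGHT_BITS 32u    /* example from [0071]: b2 = 32 bits  */

/* b1 = ceil(log2(count)); for 256 neurons this yields 8 bits. */
static unsigned field_width_bits(unsigned count)
{
    unsigned bits = 0;
    while ((1u << bits) < count)
        bits++;
    return bits;
}

/* One addressable entry of the input synapse memory unit. The input
 * synapse ID is the address of the entry, so it occupies no bits. */
typedef struct {
    uint32_t neural_unit_id;  /* b1 bits used (8 bits for 256 neurons) */
    uint32_t synapse_weight;  /* b2 bits, 32 in the example            */
} input_synapse_entry_t;

int main(void)
{
    unsigned b1 = field_width_bits(NUM_NEURONS);
    printf("neural unit ID field b1 = %u bits\n", b1);
    printf("synapse weight field b2 = %u bits\n", WEIGHT_BITS);
    printf("bits per entry          = %u\n", b1 + WEIGHT_BITS);
    return 0;
}
```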

[0072] Mapping example

[0073] The table below shows the contents of this memory unit 14 for the exemplary network shown in Figure 3. The exemplary network has three neurons (N0, ...

Example 3

[0119] As another example, the contents of memory units 12, 13 and 14 are described for unsigned memories, as shown in Figure 2D.

[0120] Table 11: Exemplary Input Synaptic Storage Units

[0121]
    Input Synapse ID    Neuron ID    Synaptic Weight
    D0                  N1           we
    D1                  N2           0.5we
    D2                  N1           wi
    D3                  N3           wacc
    D4                  N3           -wacc
    D5                  N5           we
    D6                  N3           wacc
    D7                  N5           we

[0122] Table 12: Exemplary Output Synaptic Storage Units

[0123]
    Output Synapse ID    Synaptic Delay    Destination ID    Input Synapse ID
    A0                   Tsyn              NEX               D0
    A1                   Tsyn              NEX               D1
    A2                   Tsyn              NEX               D2
    A3                   Tsyn+Tmin         NEX               D3
    A4                   Tsyn              NEX               D4
    A5                   Tsyn              NEX               D5
    A6                   Tsyn              NEX               D6
    A7                   2*Tsyn+Tneu       NEX               D7
    -                    -                 -                 -
    -                    -                 -                 -

[0124] Table 13: Output Synaptic Slice Storage Units

[0125]
    Neuron ID    Offset    Number of Output Synapses
    N0           0         2
    N1           2         2
    N2           4         1
    N3           5         1
    N4           6         2
    N5           -         -
    ...
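The relationship between Table 13 and Table 12 can be illustrated with a short C sketch, assuming plain in-memory arrays: each neuron's (offset, number of output synapses) pair in the slice memory marks a contiguous range of entries in the output synapse memory. The struct and array names are assumptions; the delay and destination values are carried over from the example tables as plain strings.

```c
/* Sketch of how a slice entry (Table 13) selects a contiguous range of
 * rows in the output synapse memory (Table 12). Names are assumptions. */
#include <stdio.h>

typedef struct { int offset; int count; } slice_entry_t;   /* one Table 13 row */
typedef struct {
    const char *delay;          /* synaptic delay   */
    const char *dest;           /* destination ID   */
    const char *input_synapse;  /* input synapse ID */
} out_synapse_t;                /* one Table 12 row */

/* Offsets and counts copied from the Table 13 example (the N5 row is empty). */
static const slice_entry_t slice_mem[] = {
    {0, 2}, {2, 2}, {4, 1}, {5, 1}, {6, 2}
};

/* Output synapse memory A0..A7, copied from the Table 12 example. */
static const out_synapse_t out_mem[] = {
    {"Tsyn", "NEX", "D0"}, {"Tsyn", "NEX", "D1"},
    {"Tsyn", "NEX", "D2"}, {"Tsyn+Tmin", "NEX", "D3"},
    {"Tsyn", "NEX", "D4"}, {"Tsyn", "NEX", "D5"},
    {"Tsyn", "NEX", "D6"}, {"2*Tsyn+Tneu", "NEX", "D7"}
};

/* When neuron n fires, walk its slice [offset, offset+count) and emit
 * one firing event message per output synapse in that range. */
static void fan_out(int n)
{
    slice_entry_t s = slice_mem[n];
    for (int i = s.offset; i < s.offset + s.count; i++)
        printf("N%d fires -> A%d: delay=%s, destination=%s, input synapse=%s\n",
               n, i, out_mem[i].delay, out_mem[i].dest, out_mem[i].input_synapse);
}

int main(void)
{
    fan_out(0);   /* N0: output synapses A0 and A1 */
    fan_out(4);   /* N4: output synapses A6 and A7 */
    return 0;
}
```

Running it lists A0 and A1 for N0 and A6 and A7 for N4, matching the offsets and counts in the Table 13 example.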



Abstract

A neuromorphic processing module (1) for time-multiplexed execution of a spiking neural network is provided that comprises a plurality of neural units. Each neural unit is capable of assuming a neural state and has a respective addressable memory entry in a neuron state memory unit (11) for storing state information specifying its neural state. The state information for each neural unit is computed and updated in a time-multiplexed manner by a processing facility (10, neural controller) in the processing module, depending on event messages destined for said neural unit. When the processing facility computes that an updated neural unit assumes a firing state, it resets the updated neural unit to an initial state, accesses a respective entry for the updated neural unit in an output synapse slice memory unit, and retrieves from said respective entry an indication of a respective range of synapse indices. For each synapse index in the respective range, the processing facility accesses a respective entry in a synapse memory unit, retrieves synapse property data from the synapse memory unit, and transmits a firing event message to each neural unit associated with said synapse property data.
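The event-handling cycle summarized in the abstract can be sketched in C as below. This is a simplified illustration under stated assumptions: the accumulate-and-threshold state rule, the threshold value, and all identifiers are placeholders invented for the sketch, while the sequence of steps (update the stored neural state on an incoming event message, detect the firing state, reset to the initial state, read the output synapse slice entry, and emit one firing event message per synapse index in the indicated range) follows the abstract.

```c
/* Simplified sketch of the time-multiplexed event handling described in
 * the abstract. The state rule, threshold and all names are assumptions. */
#include <stdint.h>
#include <stdio.h>

#define NUM_NEURONS 8

typedef struct { int32_t potential; } neuron_state_t;            /* state memory (11)      */
typedef struct { int offset; int count; } slice_entry_t;         /* output synapse slices  */
typedef struct { int dest_neuron; int32_t weight; } synapse_t;   /* synapse memory entries */

static neuron_state_t state_mem[NUM_NEURONS];
static slice_entry_t  slice_mem[NUM_NEURONS];
static synapse_t      synapse_mem[32];
static const int32_t  FIRE_THRESHOLD = 100;  /* assumed firing threshold */

/* Handle one event message destined for `neuron`, carrying `weight`. */
static void handle_event(int neuron, int32_t weight)
{
    neuron_state_t *s = &state_mem[neuron];
    s->potential += weight;                   /* update the stored state     */
    if (s->potential < FIRE_THRESHOLD)
        return;                               /* no firing state reached     */

    s->potential = 0;                         /* reset to the initial state  */
    slice_entry_t slice = slice_mem[neuron];  /* range of synapse indices    */
    for (int i = slice.offset; i < slice.offset + slice.count; i++) {
        synapse_t syn = synapse_mem[i];       /* retrieve synapse properties */
        printf("firing event: neuron %d -> neuron %d (weight %d)\n",
               neuron, syn.dest_neuron, (int)syn.weight);
    }
}

int main(void)
{
    /* Placeholder contents: neuron 3 fans out to neurons 5 and 6. */
    slice_mem[3]   = (slice_entry_t){0, 2};
    synapse_mem[0] = (synapse_t){5, 40};
    synapse_mem[1] = (synapse_t){6, -20};

    handle_event(3, 60);   /* below threshold, state is only updated */
    handle_event(3, 60);   /* crosses the threshold and fires        */
    return 0;
}
```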

Description

Background technique

[0001] The advent of cognitive computing has put forward neural computing as an alternative computing paradigm modeled on the working of the human brain. Neural computing devices can alleviate the von Neumann memory bottleneck due to their inherently parallel architecture. Inspired by biological principles, neural computing devices are designed as neural units that interact with each other through synaptic connections. IC implementations of these artificial neural computing devices are typically digital in nature, in contrast to their similarly operating biological counterparts.

[0002] This, on the one hand, facilitates the implementation of artificial neural computing devices as ICs in silicon and, on the other hand, provides the opportunity to take advantage of the enormous technological advances made over decades of digital integrated circuit design. In contrast to currently available digital processing elements, biological neurons o...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06N3/04; G06N3/063
CPC: G06N3/049; G06N3/063
Inventor: 赛义德·扎希德·艾哈迈德, 丹尼尔·波尔托洛蒂, 朱利安·雷纳尔德
Owner: 格雷玛特实验室股份有限公司