
Temporal coding in leaky spiking neural networks

A spiking neural network and spiking technique, applied in the field of neural networks, addressing problems such as the inability of prior methods to train multi-layer networks

Pending Publication Date: 2021-03-02
GOOGLE LLC

AI Technical Summary

Problems solved by technology

Therefore, these methods cannot be used to train multi-layer networks.
[0011] In particular, training spiking networks with multiple layers of learning (e.g., deep spiking neural networks) remains a challenge.

Method used



Examples


Embodiment Construction

[0041] Overview

[0042] In general, the present disclosure is directed to spiking neural networks that perform temporal encoding for phase-coherent neural computation. In particular, according to an aspect of the present disclosure, a spiking neural network may include one or more spiking neurons with an activation layer that applies a double exponential function (which may also be referred to as an "alpha function") to model the leaky input that an incoming neuron spike provides to the membrane potential of the spiking neuron. Using the double exponential function in the neuron's temporal transfer function creates a better defined maximum in time. This allows very clear and unambiguous state transitions between "now" and a "future step" to occur without loss of phase coherence.
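To make the role of the double exponential concrete, here is a minimal sketch (my own illustration, not code from the patent) of a membrane potential driven by incoming spikes through an alpha-shaped kernel; the time constant tau, the weights, and the firing threshold are all hypothetical values chosen for the example.

```python
import numpy as np

def alpha_kernel(t, tau=1.0):
    # Double exponential ("alpha") kernel: t * exp(-t / tau) for t >= 0, else 0.
    t = np.asarray(t, dtype=float)
    return np.where(t >= 0.0, t * np.exp(-t / tau), 0.0)

def membrane_potential(t, spike_times, weights, tau=1.0):
    # Each incoming spike at time t_i adds a leaky, alpha-shaped bump
    # w_i * (t - t_i) * exp(-(t - t_i) / tau) to the membrane potential.
    t = np.asarray(t, dtype=float)
    v = np.zeros_like(t)
    for t_i, w_i in zip(spike_times, weights):
        v += w_i * alpha_kernel(t - t_i, tau)
    return v

# Hypothetical inputs: two presynaptic spikes with different weights.
ts = np.linspace(0.0, 10.0, 1001)
v = membrane_potential(ts, spike_times=[1.0, 2.5], weights=[0.8, 0.5], tau=1.0)

# A spiking neuron would emit its output spike the first time v crosses a threshold.
threshold = 0.3
crossing = np.argmax(v >= threshold) if np.any(v >= threshold) else None
print("first threshold crossing at t =", None if crossing is None else ts[crossing])
```

Because each spike's contribution rises before it decays, the summed potential has a clear peak in time rather than jumping instantaneously, which is what makes the "now" versus "future step" distinction well defined.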

[0043] More specifically, the present disclosure provides biologically realistic synaptic transfer functions, e.g., of the form t·e^(−t), which are generated by integrating over an exponentially d...
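Under one standard reading of the truncated passage (an assumption on my part: an exponentially decaying synaptic current integrated by a leaky membrane with the same time constant), the t·e^(−t) form falls out as follows:

```latex
% Assumed derivation (not reproduced from the patent): an exponentially
% decaying synaptic current filtered by a leaky membrane with the same
% time constant \tau yields the alpha-shaped response t e^{-t/\tau}.
\[
  I_{\mathrm{syn}}(s) = w\, e^{-s/\tau}, \qquad s \ge 0,
\]
\[
  V(t) = \int_{0}^{t} e^{-(t-s)/\tau}\, I_{\mathrm{syn}}(s)\, \mathrm{d}s
       = w\, e^{-t/\tau} \int_{0}^{t} \mathrm{d}s
       = w\, t\, e^{-t/\tau}.
\]
```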



Abstract

Spiking neural networks that perform temporal encoding for phase-coherent neural computing are provided. In particular, according to an aspect of the present disclosure, a spiking neural network can include one or more spiking neurons that have an activation layer that uses a double exponential function to model a leaky input that an incoming neuron spike provides to a membrane potential of the spiking neuron. The use of the double exponential function in the neuron's temporal transfer function creates a better defined maximum in time. This allows very clearly defined state transitions between "now" and the "future step" to happen without loss of phase coherence.
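As a brief illustrative check of the "better defined maximum in time" claim (my own working, not text from the patent):

```latex
% Illustrative check: the alpha kernel peaks at a finite, well-defined time.
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\bigl( t\, e^{-t/\tau} \bigr)
  = \Bigl( 1 - \frac{t}{\tau} \Bigr) e^{-t/\tau} = 0
  \quad\Longrightarrow\quad t = \tau,
\]
% whereas a plain exponential kernel e^{-t/\tau} is largest immediately at
% t = 0, giving no comparably sharp maximum in time.
```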

Description

[0001] Related Application

[0002] This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/744,150, filed October 11, 2018. U.S. Provisional Patent Application No. 62/744,150 is hereby incorporated by reference in its entirety.

Technical Field

[0003] The present disclosure generally relates to neural networks. More specifically, the present disclosure relates to leaky spiking neural networks that perform temporal encoding.

Background

[0004] Traditionally, artificial neural networks have been constructed primarily from idealized neurons that generate continuous activation values based on a set of weighted inputs, using non-linear activation layers. Some neural networks have multiple sequential layers of such neurons, in which case these neural networks may be referred to as "deep" neural networks.

[0005] Non-spiking neural networks typically use non-linear activation layers that produce continuous-valued outputs to t...
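For contrast with the temporally coded spiking neurons described above, the following is a minimal sketch (my illustration, not from the patent) of the conventional non-spiking neuron of paragraph [0004]: a continuous activation value computed from a weighted sum of inputs passed through a non-linear activation layer; ReLU is just one common choice.

```python
import numpy as np

def relu(x):
    # One common non-linear activation; the patent text does not fix a particular choice.
    return np.maximum(x, 0.0)

def nonspiking_neuron(inputs, weights, bias=0.0):
    # Conventional idealized neuron: a continuous activation value computed
    # from a weighted sum of inputs followed by a non-linearity (cf. [0004]).
    return relu(np.dot(weights, inputs) + bias)

# Hypothetical example: the output is a real number, not the timing of a spike.
x = np.array([0.2, -1.0, 0.7])
w = np.array([0.5, 0.3, -0.4])
print(nonspiking_neuron(x, w, bias=0.1))
```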

Claims


Application Information

IPC(8): G06N3/04; G06N3/08; G06N3/00; G06N3/063
CPC: G06N3/049; G06N3/088; G06N3/084; G06N3/082; G06N3/006; G06N20/00; G06N3/063; G06N3/048
Inventors: J. Alakuijala; I-M. Comsa; K. Potempa
Owner: GOOGLE LLC