
Artificial neural network hardware implementation device based on probability calculation

An artificial neural network and probability calculation technology, applied to biological neural network models and their physical realization; it addresses problems such as heavy use of connection resources, increased power consumption, and the very large scale of neural-network hardware circuits.

Active Publication Date: 2016-08-31
SHANGHAI UNIV

AI Technical Summary

Problems solved by technology

In an integrated circuit, each logic gate occupies a certain hardware area, and a large-scale network structure also consumes a large amount of connection (wiring) resources, resulting in a very large hardware circuit for the neural network.



Examples


Embodiment 1

[0041] Referring to figure 1, an artificial neural network hardware implementation device based on probability calculation in a preferred embodiment of the present invention comprises an input module, an intermediate module, and an output module. The input module comprises I input neurons (11), the intermediate module comprises J intermediate neurons (12), and the output module comprises K output neurons (13), where I, J, and K are all integers greater than or equal to 1. An input neuron (11) receives the first data (71) and outputs the first random data sequence (81). An intermediate neuron (12) receives the first random data sequence (81) and the first random parameter sequence (51), and outputs the second random data sequence (82). An output neuron (13) receives the second random data sequence (82) and the second random parameter sequence (52), and outputs the second data (72). The first random data sequence (81), the second random data sequence (82), the first random parameter sequence (51), and the second random parameter sequence (52) all represent numerical values by the probability of 0 or 1 appearing in the data sequence over a period of time.
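As a rough behavioral illustration of the data flow in Embodiment 1, the Python sketch below shows how an input neuron might convert first data into a probability-coded bit stream and how the represented value can be recovered downstream. The function names, stream length, and random seed are hypothetical choices for illustration, not details given in the patent, which concerns a hardware realization rather than software.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_stochastic(value, length=1024):
    """Input-neuron style encoding (hypothetical): emit a bit stream whose
    probability of 1 equals `value`, assumed to lie in [0, 1]."""
    return (rng.random(length) < value).astype(np.uint8)

def decode_stochastic(stream):
    """Recover the represented value as the fraction of 1s in the stream."""
    return stream.mean()

# First data (71) -> first random data sequence (81) -> recovered estimate.
x = 0.37
stream_81 = encode_stochastic(x)
print(decode_stochastic(stream_81))   # ~0.37; accuracy improves with stream length
```

In this coding, a longer observation window gives a more precise estimate of the value carried by the stream, which is the trade-off between sequence length and precision implicit in probability calculation.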

Embodiment 2

[0043] This embodiment is basically the same as Embodiment 1; its particular features are as follows:

[0044] Each intermediate neuron (12) can take the first random data sequence (81) as the input variable and the first random parameter sequence (51) as the function parameter to perform a radial basis function operation. The operation uses probability numbers, in which the probability of 0 or 1 appearing in a data sequence over a period of time represents a numerical value. After the operation, the second random data sequence (82) is output as the output data of the intermediate neuron (12). The types of radial basis functions include, but are not limited to, Gaussian functions, multiquadric functions, inverse multiquadric functions, thin-plate spline functions, cubic functions, and linear functions.
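A minimal Python sketch of what an intermediate neuron in Embodiment 2 computes, using a Gaussian radial basis function as one of the listed RBF types. This is only a behavioral model: it decodes the probability-coded sequences into values, applies the RBF, and re-encodes the result, whereas the patented device would perform the operation directly in hardware on the streams. The RBF width, stream length, and helper names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def encode(value, length=2048):
    """Probability coding: the fraction of 1s in the stream represents the value."""
    return (rng.random(length) < value).astype(np.uint8)

def gaussian_rbf(x, center, width=0.5):
    """Gaussian radial basis function (one of the RBF types listed above)."""
    return np.exp(-((x - center) ** 2) / (2 * width ** 2))

# Behavioral model of an intermediate neuron (12):
# first random data sequence (81) and first random parameter sequence (51) in,
# second random data sequence (82) out.
stream_81 = encode(0.8)          # input variable, probability-coded
stream_51 = encode(0.3)          # function parameter (RBF center), probability-coded

x_hat = stream_81.mean()         # values recovered from the streams
c_hat = stream_51.mean()
y = gaussian_rbf(x_hat, c_hat)   # radial basis function operation

stream_82 = encode(y)            # re-encode the result as sequence (82)
print(round(y, 3), round(stream_82.mean(), 3))
```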

Embodiment 3

[0046] This embodiment is basically the same as Embodiment 1; its particular features are as follows:

[0047] The first random data sequence (81), the second random data sequence (82), the first random parameter sequence (51), and the second random parameter sequence (52) can all be pseudo-random or true random number sequences, with either a single-bit or a multi-bit data width. Usually the data in these sequences is one bit wide, so each data item needs only one wire, which greatly reduces the interconnection wiring inside the network. To improve calculation speed, however, these sequences can also use a multi-bit data width to perform calculations in parallel.
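The sketch below illustrates the single-bit versus multi-bit trade-off described in paragraph [0047]: with a multi-bit width, several bit lanes are produced per cycle, so the same total number of sample bits (and hence roughly the same precision) is obtained in fewer cycles at the cost of more wires. The lane counts, bit totals, and function name are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def encode_parallel(value, total_bits=4096, width=1):
    """Probability coding with a configurable data width (hypothetical model):
    `width` parallel bit lanes per clock cycle, so the stream takes
    total_bits // width cycles to deliver the same number of sample bits."""
    cycles = total_bits // width
    return (rng.random((cycles, width)) < value).astype(np.uint8)

value = 0.42
serial = encode_parallel(value, width=1)    # one wire, 4096 cycles
parallel = encode_parallel(value, width=8)  # eight wires, 512 cycles
print(serial.shape, round(serial.mean(), 3))       # (4096, 1)  ~0.42
print(parallel.shape, round(parallel.mean(), 3))   # (512, 8)   ~0.42 in far fewer cycles
```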

[0048] Both the first random parameter sequence (51) and the second random parameter sequence (52) can be a sequence formed from a scalar parameter or a sequence formed from a set of vector parameters. A scalar parameter means that the sequence represents a single numerical value.


Abstract

The invention relates to an artificial neural network hardware implementation device based on probability calculation. The device comprises an input module, an intermediate module, and an output module. The input module is formed by I input neurons, which receive first data and output a first random data sequence; the intermediate module is formed by J intermediate neurons, which receive the first random data sequence and a first random parameter sequence and output a second random data sequence; and the output module is formed by K output neurons, which receive the second random data sequence and a second random parameter sequence and output second data, where I, J, and K are integers greater than or equal to 1. The outputs of the input neurons are connected to the inputs of the intermediate neurons, and the outputs of the intermediate neurons are connected to the inputs of the output neurons, in either a fully or a partially connected manner. The first and second random data sequences and the first and second random parameter sequences represent values by the probability of 0 or 1 appearing in the data sequence over a period of time. The neural network device greatly reduces hardware logic and wiring resources, lowering circuit cost and power consumption and making it possible to implement a very-large-scale neural network with small and medium-sized circuits.
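The abstract's claim that probability coding reduces hardware logic is commonly illustrated, in stochastic computing generally, by the fact that multiplying two independent unipolar streams requires only a bitwise AND (since P(a AND b) = P(a)·P(b)), replacing a full binary multiplier. The short sketch below demonstrates this standard fact in Python; it is offered as background intuition and is not a specific circuit described by this patent.

```python
import numpy as np

rng = np.random.default_rng(3)

def encode(value, length=8192):
    """Unipolar probability coding: fraction of 1s represents the value."""
    return (rng.random(length) < value).astype(np.uint8)

a, b = 0.6, 0.5
# For independent streams, the AND of the bits has probability a * b,
# so one AND gate per bit stands in for a multi-bit binary multiplier.
product_stream = encode(a) & encode(b)
print(round(product_stream.mean(), 3))   # ~0.30 = a * b
```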

Description

technical field [0001] The invention relates to the field of artificial neural networks, in particular to an artificial neural network hardware implementation device based on probability calculation. Background technique [0002] An artificial neural network (ANN, referred to as a neural network in this text) is an information processing system that simulates, to a certain extent, some functions of the human brain by drawing on the structure of the biological neural network (BNN) and the working mechanism of biological neurons. Its basic approach is to build artificial neurons with independent processing capability, train a neural network composed of a large number of such artificial neurons, and adjust the interconnections between the artificial neurons so that the network learns the mapping relationship between input and output, thereby achieving the purpose of information processing. [0...

Claims


Application Information

IPC(8): G06N3/06
Inventor: 季渊, 陈文栋, 冉峰, 王雪纯, 王成其
Owner: SHANGHAI UNIV