
Design and training of binary neurons and binary neural networks using error correction codes

A neural-network and error-correction technology in the field of deep neural networks, addressing the problem that no training method has previously been proposed for purely binary networks.

Pending Publication Date: 2022-05-06
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

Furthermore, no training method has been proposed for such networks.




Embodiment Construction

[0061] The present invention proposes a purely binary neural network by introducing an architecture based on binary domain operations, and transforms the training problem into a communication channel decoding problem.

[0062] An artificial neuron has many inputs and weights and returns one output. It is usually viewed as a function, parameterized by its weights, that takes the neuron's inputs as variables and returns an output. In the context of BNNs, of particular interest is the design of binary neurons whose inputs, weights, and outputs are all binary. Notably, such a binary neuron corresponds to a Boolean function.
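A minimal sketch of such a binary neuron, assuming for illustration the affine form f(x) = (w·x mod 2) XOR b; the patent does not fix this exact node function, and the function name is hypothetical:

```python
# Sketch of a binary neuron over F2: inputs, weights, bias, and output
# are all bits, and multiply-accumulate becomes AND followed by XOR.
def binary_neuron(x, w, b):
    """Binary neuron: all values in {0, 1}; arithmetic over F2."""
    assert all(v in (0, 1) for v in x + w) and b in (0, 1)
    s = 0
    for xi, wi in zip(x, w):
        s ^= xi & wi          # product in F2 is AND; sum in F2 is XOR
    return s ^ b              # output is a single bit

print(binary_neuron([1, 0, 1], [1, 1, 1], 0))  # -> 0  (1 XOR 0 XOR 1)
```

Because every value is a bit, such a neuron needs no floating-point hardware at all.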

[0063] Let m be the number of variables, and let x = (x_1, x_2, ..., x_m) be a binary m-tuple. A Boolean function f is any function taking values in F_2, f(x) = f(x_1, x_2, ..., x_m). It can be specified by a truth table giving the value of f at all 2^m input combinations. Table 1 gives an example of a truth table for the binary sum function (...
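The truth-table construction in paragraph [0063] can be sketched directly; the binary sum function here is XOR (addition over F_2), and the helper names are illustrative:

```python
from itertools import product

def truth_table(f, m):
    """Tabulate f at all 2**m binary m-tuples."""
    return {x: f(x) for x in product((0, 1), repeat=m)}

def binary_sum(x):
    """Binary sum of the inputs, i.e. addition over F2 (XOR)."""
    return sum(x) % 2

for x, y in truth_table(binary_sum, 2).items():
    print(x, '->', y)
# (0, 0) -> 0
# (0, 1) -> 1
# (1, 0) -> 1
# (1, 1) -> 0
```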



Abstract

A data processing system having a binary neural network architecture receives a binary network input and propagates signals through a plurality of binary processing nodes, in accordance with respective binary weights, to form a network output. The system is configured to train each binary processing node by implementing the node function as an error correction code (e.g., an order-r Reed-Muller code, such as an order-1 Reed-Muller code or a coset of the order-1 Reed-Muller code), so that training becomes channel decoding. For a given input of the node, a set of binary weights is identified that minimizes the error between the output formed from the node's current binary weights and the preferred output of the node, and the node's weights are updated to the identified set. Such training is performed without storing and/or using any weights or other elements of higher arithmetic precision.
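A minimal sketch of the decoding step the abstract describes, under the assumption of an order-1 Reed-Muller node: RM(1, m) codewords are exactly the affine Boolean functions w·x XOR b, so finding binary weights that minimize the error to a preferred output is nearest-codeword decoding of that output. The brute-force correlation search and function name below are illustrative only; a fast Walsh-Hadamard transform would perform the same search in O(m·2^m):

```python
from itertools import product

def decode_rm1(target, m):
    """Find binary weights w and bias b whose affine function w.x XOR b
    is closest in Hamming distance to the target truth table of length 2**m.
    Returns (w, b, hamming_distance)."""
    inputs = list(product((0, 1), repeat=m))
    best = None
    for w in product((0, 1), repeat=m):
        # Correlation between (-1)^target and (-1)^(w.x); its sign fixes b.
        corr = sum((-1) ** (t ^ (sum(wi & xi for wi, xi in zip(w, x)) % 2))
                   for x, t in zip(inputs, target))
        b = 0 if corr >= 0 else 1
        if best is None or abs(corr) > best[0]:
            best = (abs(corr), list(w), b)
    score, w, b = best
    dist = (2 ** m - score) // 2   # agreements A satisfy score = 2A - 2**m
    return w, b, dist

# Preferred output x1 XOR x3 with one bit flipped: decoding recovers the
# weights and reports a single residual error.
print(decode_rm1([1, 1, 0, 1, 1, 0, 1, 0], 3))  # -> ([1, 0, 1], 0, 1)
```

This is the sense in which the weight update needs no higher-precision values: the search space and the error metric are both purely binary.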

Description

Technical Field
[0001] The present invention relates to deep neural networks, in particular to the design and training of binary neurons and binary neural networks.
Background
[0002] A deep neural network (DNN) is a computing system inspired by the biological neural networks that make up the biological brain. DNNs "learn" to perform tasks by considering examples, typically without being programmed with any task-specific rules. In image recognition, for example, they can learn to recognize images containing cars by analyzing example images that have been manually labeled "car" or "no car" and using the results to identify cars in other images. They do this without any prior knowledge about cars. Instead, they automatically generate identifying features from the learning material they process. Figure 1 shows a general scheme of this process.
[0003] DNNs are based on collections of connected units, or nodes, called artificial neurons, which loosely ...

Claims


Application Information

IPC(8): H03M13/00; G06N3/02; G06N3/08; H03M13/13; H03M13/21
CPC: H03M13/6597; H03M13/6577; G06N3/08; H03M13/136; H03M13/21; G06N3/063; G06N3/048; H03M13/611
Inventor: Jean-Claude Belfiore, Georgios Paschos, Dimitrios Tsilimantos, Apostolos Destounis, Spyridon Vassilaras, Marina Costantini, Nikolaos Liakopoulos, Van Minh Nguyen, Merouane Debbah
Owner HUAWEI TECH CO LTD