
Expandable and real-time reconfigurable hardware for neural networks and logic reasoning

A technology for learning networks and reconfigurable circuits, applied in the field of neural network hardware systems, which addresses problems such as joint processing power being limited by the size and processing speed of a single FPGA.

Status: Inactive | Publication Date: 2019-07-16
LIANG PING
Cites: 0 | Cited by: 3

AI Technical Summary

Problems solved by technology

For a single CPU, the joint processing power is limited by the size and processing speed of the single FPGA mounted on that CPU.




Detailed Description of Embodiments

[0010] Reference can be made to the drawings, wherein like reference numerals refer to like parts throughout. The illustrated embodiments of the invention are provided to illustrate various aspects of the invention and should not be construed as limiting its scope. Where embodiments are described with reference to block diagrams or flowcharts, each block may represent a method step, an apparatus element for performing that step, or both. Depending on the implementation, the corresponding apparatus elements may be configured as hardware, software, firmware, or a combination thereof. In this disclosure, the terms "neural network" and "learning network" are used interchangeably to denote an information processing structure, which may be characterized by a graph of layers or clusters of processing nodes and the interconnections between processing nodes, including but not limited to feed-forward neural networks, deep learning networks, convolutional neural networks…
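To make the graph characterization above concrete, here is a minimal Python sketch (illustrative only, not taken from the patent; all class and method names are assumptions) of a learning network represented as layers of processing nodes plus the interconnections between them:

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    """A layer (or cluster) of processing nodes in the learning network."""
    name: str
    num_nodes: int

@dataclass
class LearningNetwork:
    """A learning network characterized as a graph: layers (or clusters)
    of processing nodes plus the interconnections between them."""
    layers: list = field(default_factory=list)
    connections: list = field(default_factory=list)  # (src_name, dst_name) edges

    def add_layer(self, name, num_nodes):
        self.layers.append(Layer(name, num_nodes))

    def connect(self, src, dst):
        self.connections.append((src, dst))

# A small feed-forward example; the same structure can describe deep
# learning or convolutional networks by adding more layers and edges.
net = LearningNetwork()
net.add_layer("input", 784)
net.add_layer("hidden", 128)
net.add_layer("output", 10)
net.connect("input", "hidden")
net.connect("hidden", "output")
print([(l.name, l.num_nodes) for l in net.layers])
```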



Abstract

This invention presents a scalable, field-reconfigurable learning network and machine intelligence system that is reconfigured to match the architecture or processing flow of a selected deep learning neural network, and is well suited to combining neural network learning with logic reasoning. It partitions the N layers, clusters, or stages of the selected learning network into multiple parts, mapping them onto a plurality of field-reconfigurable processing modules. The inter-part connections are configured into a field-reconfigurable processing and interconnection module. Multiple field-reconfigurable learning networks can be interconnected to produce a larger-scale field-reconfigurable learning network, and can be connected to the Internet to provide a field-reconfigurable learning network cloud service.
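As a rough illustration of the partitioning described in the abstract (a sketch under assumed names, not the patent's actual mapping algorithm), the N layers of a selected network could be split into contiguous parts, one part per field-reconfigurable processing module, with the edges that cross part boundaries collected for the processing-and-interconnection module:

```python
def partition_layers(layers, num_modules):
    """Split an ordered list of N layers into contiguous parts,
    one part per field-reconfigurable processing module."""
    n = len(layers)
    base, extra = divmod(n, num_modules)
    parts, start = [], 0
    for i in range(num_modules):
        size = base + (1 if i < extra else 0)
        parts.append(layers[start:start + size])
        start += size
    return parts

def inter_part_connections(parts):
    """Collect the layer-to-layer edges that cross part boundaries; these
    would be routed through the processing-and-interconnection module
    rather than staying inside any single processing module."""
    edges = []
    for a, b in zip(parts, parts[1:]):
        edges.append((a[-1], b[0]))  # last layer of one part feeds the next
    return edges

layers = [f"layer{i}" for i in range(7)]  # N = 7 layers
parts = partition_layers(layers, 3)       # 3 processing modules
print(parts)                              # [['layer0', 'layer1', 'layer2'], ...]
print(inter_part_connections(parts))      # the cross-module edges
```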

Description

Technical Field

[0001] The present invention relates to a field-reconfigurable neural network and machine intelligence system, and in particular to a scalable, field-reconfigurable hardware system for neural networks that matches the structure and processing flow of neural networks and logical reasoning.

Background

[0002] There are two phases in using neural networks for machine learning: training and inference. During the training phase, the computing engine needs to process a large number of training examples quickly, so both fast processing and fast I/O are required. In the inference phase, the computing engine needs to receive input data and produce inference results in real time in many applications. In both phases, the computing engine needs to be configured to execute the different neural network architectures that are most appropriate for the learning task, for example…
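For intuition only, the following hypothetical sketch (every name here is an assumption, not the patent's API) records how such a reconfigurable computing engine might be configured differently for the two phases, fast I/O for training versus real-time handling for inference:

```python
from enum import Enum

class Phase(Enum):
    TRAINING = "training"    # many examples fast: fast processing + fast I/O
    INFERENCE = "inference"  # real-time input and results

def configure_engine(architecture, phase):
    """Hypothetical helper: build a configuration record for a
    reconfigurable computing engine, matched to the selected
    network architecture and learning phase."""
    return {
        "architecture": architecture,  # e.g. "feed-forward", "convolutional"
        "phase": phase.value,
        "needs_fast_io": phase is Phase.TRAINING,
        "needs_real_time": phase is Phase.INFERENCE,
    }

print(configure_engine("convolutional", Phase.TRAINING))
print(configure_engine("convolutional", Phase.INFERENCE))
```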

Claims


Application Information

IPC(8): G06N3/06
CPC: G06N3/063; G06N3/08; G06N3/042; G06N3/044; G06N3/045; G06N3/04
Inventor: Liang Ping; B. Liang
Owner: LIANG PING