
Resistive processing unit architecture with separate weight update and inference circuitry

A weight-update and processing-unit technology, applied in neural architectures, physical implementations of neural networks, biological neural network models, and related fields.

Active Publication Date: 2020-11-27
IBM CORP

AI Technical Summary

Problems solved by technology

However, despite these requirements, tunable resistive devices may exhibit limited dynamic range, limited resolution, and variability in their tuning/programming characteristics, making hardware implementation of the RPU architecture non-trivial.



Examples


Embodiment Construction

[0019] Embodiments of the present invention will now be discussed in more detail with respect to RPU cell architectures and methods in which separate matrices are utilized to independently perform weight update accumulation and inference (weight read) operations. It should be noted that the same or similar reference numerals are used throughout the drawings to denote the same or similar features, elements, or structures; therefore, detailed descriptions of the same or similar features, elements, or structures will not be repeated for each drawing.
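As an illustrative reading of this separation of the update and read paths, the following minimal Python sketch models an RPU cell with a distinct update accumulator and a distinct weight-read path. The class and method names (RPUCell, accumulate_update, read_weight) and the parameter values are assumptions for illustration only; the patent describes analog circuitry, not software.

```python
# Behavioral sketch of an RPU cell with separate weight-update accumulation
# and inference (weight read) paths. Names and values are illustrative.

class RPUCell:
    def __init__(self, unit_weight=0.01, threshold=4):
        self.conductance_steps = 0      # state of the tunable resistive device
        self.accumulator = 0            # separate weight-update accumulation value
        self.unit_weight = unit_weight  # weight change per unit conductance step
        self.threshold = threshold      # accumulated unit updates per conductance step

    def accumulate_update(self, coincidence: bool, sign: int = +1) -> None:
        """Update path: bump the accumulator on each detected coincidence of the
        stochastic bit streams on the update row/column lines; transfer to the
        resistive device only when the threshold is reached."""
        if coincidence:
            self.accumulator += sign
        if abs(self.accumulator) >= self.threshold:
            self.conductance_steps += 1 if self.accumulator > 0 else -1
            self.accumulator = 0

    def read_weight(self) -> float:
        """Inference path: read the stored weight from the device conductance
        without disturbing the update accumulator."""
        return self.conductance_steps * self.unit_weight
```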

[0020] Figure 1 schematically shows an RPU system 100 that can be implemented with an RPU cell architecture according to an embodiment of the present invention. The RPU system 100 includes a two-dimensional (2D) crossbar array of RPU cells 110 arranged in a plurality of rows R1, R2, R3, . . . , Rm and a plurality of columns C1, C2, C3, . . . , Cn. The RPU cells 110 in each row R1, R2, R3, ..., Rm a...
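For intuition, the behavioral sketch below shows how such an m-row by n-column array of stored weights yields a vector-matrix product during an inference (read) pass, with row inputs driving the array and each column summing the contributions. The function name crossbar_forward and the use of NumPy are assumptions for illustration and are not part of the disclosure.

```python
import numpy as np

def crossbar_forward(weights: np.ndarray, row_inputs: np.ndarray) -> np.ndarray:
    """weights: (m, n) matrix of per-cell weights read from device conductances.
    row_inputs: length-m vector applied to the row lines.
    Returns the length-n vector of column outputs (the vector-matrix product)."""
    return row_inputs @ weights

# Example: a 3-row by 4-column array.
W = np.random.uniform(-1.0, 1.0, size=(3, 4))
x = np.array([0.5, -0.2, 1.0])
print(crossbar_forward(W, x))
```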



Abstract

Systems and methods are provided to perform weight update operations in a resistive processing unit (RPU) system to update weight values of RPU devices comprising tunable resistive devices. A weight update operation for a given RPU device includes maintaining a weight update accumulation value for the RPU device, adjusting the weight update accumulation value by one unit update value in response to a detected coincidence of stochastic bit streams of input vectors applied on update row and update column control lines connected to the RPU device, generating a weight update control signal in response to the accumulated weight update value reaching a predefined threshold value, and adjusting a conductance level of the tunable resistive device by one unit conductance value in response to the weight update control signal, wherein the one unit conductance value corresponds to one unit weight value of the RPU device.
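The update procedure in the abstract can be summarized in a short behavioral sketch: stochastic bit streams encode the inputs on the update row and column lines, coincidences (logical AND of the two streams) increment an accumulator by one unit update value, and the device conductance is adjusted by one unit only when the accumulator crosses a threshold. The stream length, threshold value, and all identifiers below are assumptions for illustration, not values from the patent.

```python
import numpy as np

def stochastic_stream(p: float, length: int, rng) -> np.ndarray:
    """Bernoulli bit stream whose duty cycle encodes a value p in [0, 1]."""
    return rng.random(length) < p

def update_cell(x: float, delta: float, accumulator: int, conductance: float,
                unit_conductance: float = 0.01, threshold: int = 4,
                stream_len: int = 16, rng=np.random.default_rng()):
    row_bits = stochastic_stream(x, stream_len, rng)       # update row line
    col_bits = stochastic_stream(delta, stream_len, rng)   # update column line
    coincidences = int(np.sum(row_bits & col_bits))        # coincidence detection
    accumulator += coincidences                            # one unit update per coincidence
    if accumulator >= threshold:                           # update control signal fires
        steps = accumulator // threshold
        conductance += steps * unit_conductance            # one unit conductance per step
        accumulator -= steps * threshold
    return accumulator, conductance

acc, g = update_cell(x=0.8, delta=0.6, accumulator=0, conductance=0.5)
print(acc, g)
```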

Description

Technical Field

[0001] The present disclosure relates generally to resistive processing unit (RPU) architectures, and in particular to techniques for updating and reading weight values stored in RPU memory cells.

Background

[0002] In recent years, deep neural network (DNN) based models have achieved remarkable progress due to the availability of large labeled datasets and continuous improvements in computing resources. DNNs are used in different applications including, for example, object/speech recognition, language translation, pattern extraction, and image processing. The quality of DNN models depends on the processing of large amounts of training data and increasingly complex neural networks. In this regard, training complex DNN models is a time-consuming and computationally intensive task that may require many days or weeks to perform using a parallel and distributed computing architecture with many computing nodes (e.g., data center-scale computing resou...


Application Information

IPC(8): G06N3/08
CPC: G06N3/084; G06N3/065; G06N3/045; G06N3/08; G06N3/04
Inventors: Seyoung Kim, T. Gokmen
Owner: IBM CORP