
An in-memory computing bit unit and an in-memory computing device

A technology relating to bit cells and storage cells, applied in the field of in-memory computing, that addresses the problems of wasted computation time and power, leakage power consumption, and the lack of any relative advantage in computing throughput.

Active Publication Date: 2021-05-18
中科南京智能技术研究院
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] The traditional approach of multiplying a single-bit input by a single-bit weight is inefficient and offers no relative advantage in computational throughput; using a 6T structure for weight storage increases process cost; and during computation the traditional method performs the multiplication regardless of whether the input and weight are 1 or 0, so the redundant multiplications by 0 waste considerable computation time and power, while leakage on the output bit line during computation causes additional leakage power consumption. These problems urgently need to be solved.



Examples


Embodiment 1

[0039] As shown in Figure 1, the present invention discloses an in-memory computing bit unit, which includes:

[0040] A four-transistor (4T) storage unit and a four-transistor (4T) calculation unit, the 4T calculation unit being connected to the 4T storage unit. The bit line input terminal of the 4T storage unit is connected to the bit line BL, the complementary bit line input terminal is connected to the complementary bit line BLB, and the word line input terminal is connected to the word line WL. The 4T storage unit is used for reading, writing, and storing weight values; the 4T calculation unit is used for multiplying the input data by the weight value, where the input data is determined by the calculation word line CWL and the complementary calculation word line CWLB.

[0041] The 4T calculation unit includes transistor T5, transistor T6, transistor...
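To make the bit-unit behaviour concrete, below is a minimal behavioural sketch in Python. It is not the patent's circuit: the class name `BitCell`, the {0,1} encoding of input and weight, and the method names are illustrative assumptions. It models the 4T storage unit as a stored weight bit and the 4T calculation unit as a single-bit multiply of the input encoded on CWL/CWLB, with the cell simply holding its state when the stored weight is 0, as the excerpt describes.

```python
class BitCell:
    """Behavioural sketch of one in-memory computing bit unit (illustrative only)."""

    def __init__(self) -> None:
        # Weight bit held by the 4T storage unit; QB would be its complement.
        self.q = 0

    def write(self, wl: int, bl: int) -> None:
        """Write a weight bit: WL selects the cell, BL carries the value (BLB its complement)."""
        if wl == 1:
            self.q = bl

    def multiply(self, cwl: int, cwlb: int) -> int:
        """Multiply the input, encoded on CWL/CWLB, by the stored weight bit.

        A stored weight of 0 leaves the cell in its hold state, so the
        redundant multiply-by-zero operation is skipped entirely.
        """
        if self.q == 0:
            return 0                                  # hold state: no switching, no discharge
        return 1 if (cwl, cwlb) == (1, 0) else 0      # weight 1: result follows the input
```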

Embodiment 2

[0059] As shown in Figure 4, the present invention also provides an in-memory computing device, which includes: a bit line / pre-store decoding driver ①, a word line decoding driver ③, a calculation word line decoding driver ②, an in-memory computing array, and n analog-to-digital converters ⑤. The in-memory computing array includes m×n of the above-described in-memory computing bit units ④ arranged in an array.

[0060] The n bit line output terminals of the bit line / pre-store decoding driver ① are respectively connected to the n bit lines BL, its 2n pre-store line output terminals are respectively connected to the n pre-store lines A and the n pre-store lines B, and its n complementary bit line output terminals are respectively connected to the n complementary bit lines BLB; the m word line output terminals of the word line decoding driver ③ are r...
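As a rough end-to-end sketch of how such an array might operate (again illustrative, not the patent's circuit): it assumes each of the n read bit lines accumulates its column's single-bit products and each ADC simply clips the sum to its full-scale range. The function name `simulate_mac` and these modelling choices are assumptions.

```python
from typing import List

def simulate_mac(weights: List[List[int]], inputs: List[int], adc_bits: int = 4) -> List[int]:
    """Sketch of an m x n in-memory computing array with n column ADCs.

    weights[i][j] is the bit stored in row i, column j; inputs[i] is the
    single-bit activation driven on row i's calculation word lines.
    Each column's products accumulate on its read bit line and are then
    digitized by an analog-to-digital converter (modelled here as clipping
    the sum to the ADC's full-scale range).
    """
    m, n = len(weights), len(weights[0])
    full_scale = (1 << adc_bits) - 1
    outputs = []
    for j in range(n):                                   # one read bit line / ADC per column
        column_sum = sum(inputs[i] & weights[i][j] for i in range(m))
        outputs.append(min(column_sum, full_scale))      # crude ADC model
    return outputs

# Example: a 3 x 2 array of stored weight bits and a 3-bit input vector.
print(simulate_mac([[1, 0], [1, 1], [0, 1]], [1, 1, 0]))  # -> [2, 1]
```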



Abstract

The invention relates to an in-memory computing bit unit and an in-memory computing device, comprising a four-transistor (4T) storage unit and a four-transistor (4T) calculation unit. The 4T calculation unit includes transistors T5, T6, T7, and T8. The drain of transistor T7 is connected to pre-store line A, the gate of T7 is connected to the calculation word line, and the source of T7 is connected to the drain of T5; the gate of T5 is connected to the 4T storage unit, and the source of T5 is connected to the source of T6; the gate of T6 is connected to the 4T storage unit, and the drain of T6 is connected to the drain of T8; the gate of T8 is connected to the complementary calculation word line, and the source of T8 is connected to pre-store line B; the sources of T5 and T6 are both connected to the read bit line RBL. By directly adopting the hold state when the stored weight value is 0, the design accelerates the calculation process.
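Reading those connections as two series switch paths gives the following hedged sketch: pre-store line A reaches RBL through T7 (gated by CWL) and T5 (gated by one storage node), and pre-store line B reaches RBL through T8 (gated by CWLB) and T6 (gated by the other storage node). Which of Q/QB drives which gate is not stated in this excerpt, so the mapping in the code is an assumption, as is the function name `rbl_level`.

```python
def rbl_level(cwl: int, cwlb: int, q: int, qb: int,
              prestore_a: int, prestore_b: int, rbl_precharge: int) -> int:
    """Resolve the read bit line level from the two series switch paths.

    Path A: pre-store line A -> T7 (gate = CWL)  -> T5 (gate = Q)  -> RBL
    Path B: pre-store line B -> T8 (gate = CWLB) -> T6 (gate = QB) -> RBL
    (The Q/QB-to-gate assignment is an assumption for illustration.)
    """
    if cwl and q:          # both series transistors of path A conduct
        return prestore_a
    if cwlb and qb:        # both series transistors of path B conduct
        return prestore_b
    return rbl_precharge   # neither path conducts: RBL holds its state
```

Under this assumed mapping at most one path conducts at a time; exactly which input/weight combination leaves RBL in its hold state depends on how Q and QB drive T5 and T6, which the excerpt does not specify.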

Description

Technical Field

[0001] The invention relates to the technical field of in-memory computing, and in particular to an in-memory computing bit unit and an in-memory computing device.

Background

[0002] Deep neural networks (DNNs) and convolutional neural networks (CNNs) have achieved unprecedented improvements in the accuracy of large-scale recognition tasks. To address algorithmic complexity and memory access limitations, recent algorithms binarize the weights and neuron activations to +1 or −1, so that the multiplication between a weight and an input activation becomes a simple binary multiplication.

[0003] The traditional approach of multiplying a single-bit input by a single-bit weight is inefficient and offers no relative advantage in computational throughput; using a 6T structure for weight storage increases process cost; and during computation the traditional method performs the multiplication regardless of whether the input and weight are 1 or ...
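For the binarization mentioned in [0002], multiplying two values constrained to +1/−1 is equivalent to an XNOR of their single-bit encodings (−1 encoded as 0, +1 as 1); the snippet below merely illustrates that standard equivalence and is not taken from the patent.

```python
def bin_mul(a: int, b: int) -> int:
    """Multiply two binarized values encoded as 0 (for -1) or 1 (for +1)."""
    return 1 - (a ^ b)  # XNOR: equal bits -> product +1 (1), unequal -> -1 (0)

# +1 * -1 = -1  <=>  XNOR(1, 0) = 0
assert bin_mul(1, 0) == 0
# -1 * -1 = +1  <=>  XNOR(0, 0) = 1
assert bin_mul(0, 0) == 1
```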

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F7/523; G06N3/063
CPC: G06F7/523; G06N3/063
Inventors: 乔树山, 史万武, 尚德龙, 周玉梅
Owner: 中科南京智能技术研究院