
In-memory computing circuit chip based on magnetic cache and computing device

An in-memory computing circuit based on a magnetic cache, applied in the computer field, achieving increased data storage capacity, high density, and large capacity.

Pending Publication Date: 2021-10-22
NANJING HOUMO TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] Although reading of the weight data can be avoided, the input and output data (input/output activations) still need to access the on-chip buffer.

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more



Embodiment Construction

[0028] Hereinafter, exemplary embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. The described embodiments are only some, rather than all, of the embodiments of the present disclosure, and it should be understood that the present disclosure is not limited by the exemplary embodiments described here.

[0029] It should be noted that the relative arrangements of components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.

[0030] Those skilled in the art can understand that terms such as "first" and "second" in the embodiments of the present disclosure are only used to distinguish different steps, devices or modules, and denote neither any particular technical meaning nor a necessary logical sequence among them.

[0031] It should also be understood that in the embodiments of the present disclosure, "...


Abstract

The embodiment of the invention discloses an in-memory computing circuit based on a magnetic cache. The circuit comprises at least one magnetic cache unit, at least one in-memory computing unit, and a timer. Each magnetic cache unit caches data output by the corresponding in-memory computing unit as to-be-processed data within its data retention time; the timer sets a data retention time for each of the magnetic cache units; and each in-memory computing unit extracts the to-be-processed data from the corresponding magnetic cache unit, performs calculation, and outputs the computed data to other magnetic cache units. The embodiment thereby enables flexible adjustment of the data retention time of the magnetic cache units across various in-memory computing scenarios, and provides a high-capacity cache for the data needed by in-memory computing at lower power consumption.
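The abstract describes three cooperating parts: cache units whose entries expire after a per-unit retention time, a timer that assigns those retention times, and compute units that read pending data from one cache and write results to another. The following is a minimal behavioral sketch of that data flow; all class and method names are illustrative assumptions, not taken from the patent.

```python
class Timer:
    """Hypothetical global clock; each cache's retention is measured in its ticks."""
    def __init__(self):
        self.now = 0

    def tick(self, n=1):
        self.now += n


class MagneticCache:
    """Caches values; an entry is readable only within this unit's retention time."""
    def __init__(self, timer, retention_ticks):
        self.timer = timer
        self.retention = retention_ticks
        self.entries = []  # list of (write_time, value)

    def write(self, value):
        self.entries.append((self.timer.now, value))

    def read_valid(self):
        """Drop expired entries, then return the still-valid values."""
        self.entries = [(t, v) for (t, v) in self.entries
                        if self.timer.now - t < self.retention]
        return [v for (_, v) in self.entries]


class ComputeUnit:
    """In-memory compute unit: the weight stays resident; only activations
    move from the source cache to the destination cache."""
    def __init__(self, weight, src_cache, dst_cache):
        self.weight = weight
        self.src = src_cache
        self.dst = dst_cache

    def step(self):
        for x in self.src.read_valid():
            self.dst.write(self.weight * x)
        self.src.entries.clear()  # consume the processed data


# Two-stage pipeline: cache_a -> unit1 -> cache_b -> unit2 -> cache_c,
# with a different (illustrative) retention time per cache unit.
timer = Timer()
cache_a = MagneticCache(timer, retention_ticks=4)
cache_b = MagneticCache(timer, retention_ticks=2)  # short-lived intermediate
cache_c = MagneticCache(timer, retention_ticks=8)
unit1 = ComputeUnit(3, cache_a, cache_b)
unit2 = ComputeUnit(2, cache_b, cache_c)

cache_a.write(5)
unit1.step()                  # 5 * 3 = 15 written into cache_b
unit2.step()                  # 15 * 2 = 30 written into cache_c
print(cache_c.read_valid())   # [30] while still within retention
timer.tick(8)
print(cache_c.read_valid())   # [] after the retention time elapses
```

The per-cache `retention_ticks` parameter models the "flexible adjustment of the data retention time" claimed in the abstract: a short-lived intermediate activation can be given a short retention, trading refresh cost for power.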

Description

Technical Field

[0001] The disclosure relates to the technical field of computers, and in particular to an in-memory computing circuit, chip and computing device based on a magnetic cache.

Background Technique

[0002] With the rapid development of artificial intelligence (AI) and Internet of Things (IoT) applications, frequent and massive data transfers between the central processing unit (CPU) and the memory circuit are required over limited bus bandwidth. This data transmission is widely recognized as the biggest bottleneck of the traditional von Neumann architecture. The deep neural network, one of the most successful algorithms applied to image recognition in the field of artificial intelligence, requires a large number of read, write, multiply and add operations on input data and weight data. This in turn means more data transfers and higher power consumption. It is worth noting that under different AI tasks, the energy consum...
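The background contrasts a von Neumann flow, where both weights and activations cross the bus for every multiply-add, with an in-memory computing flow, where weights stay resident in the memory array. A rough sketch of the transfer counts for a single dot product (function names and the unit cost of one transfer are illustrative assumptions, not from the patent):

```python
# Count bus transfers for y = sum(w_i * x_i) under two (simplified) flows.

def von_neumann_dot(weights, inputs):
    """Both a weight and an input cross the bus for every multiply-add."""
    transfers = 0
    acc = 0
    for w, x in zip(weights, inputs):
        transfers += 2  # fetch one weight and one input over the bus
        acc += w * x
    transfers += 1      # write the result back
    return acc, transfers

def in_memory_dot(weights, inputs):
    """Weights stay resident in the array; only activations cross the bus."""
    transfers = 0
    acc = 0
    for w, x in zip(weights, inputs):
        transfers += 1  # only the input activation crosses the bus
        acc += w * x    # multiply-accumulate happens inside the array
    transfers += 1      # read out the result
    return acc, transfers

w = [1, 2, 3, 4]
x = [5, 6, 7, 8]
print(von_neumann_dot(w, x))  # (70, 9): same result, 9 transfers
print(in_memory_dot(w, x))    # (70, 5): same result, 5 transfers
```

Both flows compute the same result; the in-memory flow halves the weight traffic, which is exactly the saving that paragraph [0004] notes still leaves the activation traffic to the on-chip buffer unaddressed.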

Claims


Application Information

IPC(8): G11C11/16; G06N3/04; G06N3/063
CPC: G11C11/1675; G06N3/063; G06N3/045
Inventors: 吴强, 常亮, 司鑫, 陈亮, 沈朝晖
Owner NANJING HOUMO TECH CO LTD