
Cache data processing method and Cache

A data processing technology applied in the field of Cache and Cache data processing, which solves the problem of reduced CPU operating efficiency and achieves the effects of avoiding side-channel attacks and a simple hardware implementation.

Active Publication Date: 2021-06-15
BEIJING SMARTCHIP MICROELECTRONICS TECH COMPANY +3

AI Technical Summary

Problems solved by technology

This method is simple to implement, but it significantly degrades CPU operating efficiency.

Method used



Examples


Detailed Description of the Embodiments

[0030] Specific embodiments of the present invention are described in detail below in conjunction with the accompanying drawings, but it should be understood that the protection scope of the present invention is not limited to these specific embodiments.

[0031] Unless expressly stated otherwise, throughout the specification and claims the term "comprise", or variations thereof such as "comprises" or "comprising", will be understood to include the stated elements or components without excluding other elements or components.

[0032] In order to overcome the shortcomings of the prior art, the present invention provides a Cache data processing method and a Cache that support random mapping from Main memory to Cache Memory. Unlike an existing Cache, the correspondence between Main memory and Cache lines in the present invention is not fixed. This can effectively avoid side-channel attacks based on Cache access time, and the...
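To make the random mapping concrete, here is a minimal software sketch, not the patented circuit: a Lookup table holding one mapping offset value per entry, indexed by the Main memory Tag. The table size, the use of the C standard rand(), and indexing the table by the Tag modulo the table size are illustrative assumptions.

#include <stdint.h>
#include <stdlib.h>

#define NUM_SETS        256u   /* assumed number of Cache sets           */
#define LOOKUP_ENTRIES  64u    /* assumed number of Lookup table entries */

static uint8_t lookup_table[LOOKUP_ENTRIES];   /* mapping offset per entry */

/* Fill the Lookup table with random offsets; re-randomizing it changes
   where each Main memory block lands in the Cache, so the mapping between
   Main memory and Cache lines is not fixed. */
void init_lookup_table(unsigned seed)
{
    srand(seed);
    for (unsigned i = 0; i < LOOKUP_ENTRIES; i++)
        lookup_table[i] = (uint8_t)(rand() % NUM_SETS);
}

/* Obtain the mapping offset value for a given Main memory Tag. */
uint8_t mapping_offset(uint32_t main_memory_tag)
{
    return lookup_table[main_memory_tag % LOOKUP_ENTRIES];
}

In hardware such a table would presumably be a small RAM or register file; the software model above only illustrates the indexing scheme.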



Abstract

The invention discloses a Cache data processing method and a Cache. The Cache data processing method includes: the Cache receives the Main memory address information sent by the CPU, where the Main memory address includes a Main memory Tag field and a Cache set index field; the Cache searches a Lookup table according to the Main memory Tag field to obtain a mapping offset value; the Cache adds the mapping offset value to the value of the Cache set index field to obtain the Cache line address; the Cache reads the Cache memory Tag stored at the line address and compares the Main memory Tag field with the Cache memory Tag; if the two are consistent, the Cache outputs the data stored at the line address to the CPU. The Cache data processing method and the Cache can effectively avoid side-channel attacks based on Cache access time, the hardware implementation is simple, and the operating efficiency of the CPU is not greatly affected.
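As an illustration of the read path described in the abstract, the sketch below reuses the hypothetical mapping_offset() from the sketch under paragraph [0032] above; the field widths, line size, the wrap-around of the line address, and the cache_line_t structure are assumptions, not details taken from the patent.

#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define NUM_SETS    256u   /* assumed number of Cache lines (same as above) */
#define LINE_BYTES  64u    /* assumed Cache line size in bytes              */
#define INDEX_BITS  8u     /* log2(NUM_SETS)                                */

typedef struct {
    bool     valid;
    uint32_t tag;                  /* Cache memory Tag stored with the line */
    uint8_t  data[LINE_BYTES];
} cache_line_t;

static cache_line_t cache_mem[NUM_SETS];

extern uint8_t mapping_offset(uint32_t main_memory_tag);  /* see the sketch above */

/* Returns true on a hit and copies the line's data into 'out'. */
bool cache_read(uint32_t main_memory_addr, uint8_t out[LINE_BYTES])
{
    /* Split the Main memory address into the Cache set index field and the
       Main memory Tag field (block offset bits are omitted for brevity). */
    uint32_t set_index = main_memory_addr & (NUM_SETS - 1u);
    uint32_t mm_tag    = main_memory_addr >> INDEX_BITS;

    /* Cache line address = mapping offset value + Cache set index (kept in range). */
    uint32_t line_addr = (mapping_offset(mm_tag) + set_index) % NUM_SETS;

    /* Compare the Main memory Tag field with the stored Cache memory Tag. */
    if (cache_mem[line_addr].valid && cache_mem[line_addr].tag == mm_tag) {
        memcpy(out, cache_mem[line_addr].data, LINE_BYTES);
        return true;    /* hit: the stored data is returned to the CPU */
    }
    return false;       /* miss: the data must be fetched from Main memory */
}

Because the line address depends on the Tag through a randomized table, the Main memory to Cache line correspondence is not fixed, which is how the method described above frustrates attacks based on Cache access time.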

Description

Technical field

[0001] The invention relates to the technical field of chips, in particular to a Cache data processing method and a Cache.

Background technique

[0002] At present, the operating speed of a processor can reach 5 GHz, while the access speed of commonly used large-capacity memory is generally between tens and hundreds of megahertz. Because of this huge speed gap, the processor often has to wait when accessing main memory, which reduces the operating efficiency of the computer.

[0003] Therefore, in order to improve the operating efficiency of the computer, a cache memory (Cache) is usually provided between the central processing unit (CPU) and the large-capacity memory (Main memory). The Cache generally runs at the same speed as, or close to, the CPU, and part of the storage space in the main memory is mapped into the Cache. If the CPU access address matches (hits) an address mapped in the Cache, the CPU accesses data from the Cache at a very fast ...
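For contrast with the random mapping sketched earlier, a conventional Cache derives the set index from fixed address bits, so a given Main memory address always maps to the same Cache set; the bit widths below are illustrative assumptions, not values from the patent.

#include <stdint.h>

#define OFFSET_BITS  6u   /* assumed 64-byte Cache lines */
#define INDEX_BITS   8u   /* assumed 256 Cache sets      */

/* Conventional fixed mapping: the set index comes directly from fixed
   address bits, so the Main memory -> Cache line correspondence is
   predictable. That predictability is what access-time side-channel
   attacks exploit and what the random mapping sketched earlier removes. */
uint32_t conventional_set_index(uint32_t addr)
{
    return (addr >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1u);
}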

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F7/58, G06F12/0806, G06F12/0877, G06F12/0897
CPC: G06F7/588, G06F12/0806, G06F12/0877, G06F12/0897
Inventors: 褚军舰, 刘亮, 李伟立, 张海峰, 窦圣霞, 严绍奎
Owner: BEIJING SMARTCHIP MICROELECTRONICS TECH COMPANY