
A memory data buffering method and device

A memory data buffering method and device, applied in the field of computers, that solves problems such as slow memory read speed and the resulting impact on cache-request response time, achieving the effects of enlarging the data cache, reducing the number of memory accesses, and lowering latency.

Active Publication Date: 2016-09-28
XFUSION DIGITAL TECH CO LTD

AI Technical Summary

Problems solved by technology

However, if the directory cache hits but the state of the indicated cache line is invalid, then although no snoop request needs to be sent to the other cache agents, the data must still be read from memory and then returned to the requesting cache agent. This memory read is often very slow, degrading the response time of the entire cache request.
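The baseline behavior described above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the class and method names (`BaselineHomeAgent`, `handle_read`) and the state encoding are assumptions made for the sketch.

```python
INVALID, SHARED, MODIFIED = "I", "S", "M"

class BaselineHomeAgent:
    """Illustrative home agent with only a directory cache (no data cache)."""

    def __init__(self, memory):
        self.memory = memory      # address -> data (slow DRAM, assumed dict)
        self.directory = {}       # address -> (state, set of sharer agents)
        self.memory_reads = 0     # counts slow memory accesses

    def handle_read(self, address):
        state, sharers = self.directory.get(address, (INVALID, set()))
        if state in (SHARED, MODIFIED) and sharers:
            # Directory hit with a valid copy: snoop an owning cache agent.
            return f"snoop:{next(iter(sharers))}"
        # Directory hit, but the line state is invalid: no snoop is needed,
        # yet the data must still come from slow memory -- the problem case.
        self.memory_reads += 1
        return self.memory[address]
```

Even on a directory hit, the invalid-state path always pays a memory read, which is the latency the invention targets.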




Embodiment Construction

[0077] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0078] Referring to FIG. 1, which is a schematic flowchart of a first memory data buffering method according to an embodiment of the present invention, the process includes:

[0079] Step 101: Open up a combined cache area, including at least a directory cache and a data cache, in the memory of the home agent.

[0080] Specifically, the directory cache is a copy, loaded into the home agent, of the directory-information memory lines in memory. The state informati...
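Step 101 can be sketched as a data structure in which each directory-cache line is paired one-to-one with a data-cache line, so a single lookup yields both the coherence state and the data. The direct-mapped organization, line count, and all names below are assumptions for illustration, not the patent's actual design.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CombinedCacheLine:
    """One line of the combined region: a directory entry plus the
    corresponding data entry, in one-to-one correspondence."""
    tag: int = 0
    dir_state: str = "I"                     # coherence state from the directory copy
    sharers: set = field(default_factory=set)
    data: Optional[bytes] = None             # cached memory line, avoids a DRAM read
    valid: bool = False

class CombinedCache:
    """Direct-mapped sketch of the combined cache area opened in the
    home agent's memory (mapping policy and size are assumed)."""

    def __init__(self, num_lines=1024):
        self.num_lines = num_lines
        self.lines = [CombinedCacheLine() for _ in range(num_lines)]

    def lookup(self, address):
        line = self.lines[address % self.num_lines]
        if line.valid and line.tag == address // self.num_lines:
            return line   # hit: directory info and data in one access
        return None

    def fill(self, address, dir_state, sharers, data):
        line = self.lines[address % self.num_lines]
        line.tag = address // self.num_lines
        line.dir_state, line.sharers = dir_state, set(sharers)
        line.data, line.valid = data, True
```

Because the directory and data lines share an index, a hit returns the line's state and its contents together, with no separate memory read for the data.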



Abstract

The embodiment of the invention discloses a memory data buffering method. The method comprises the following steps of: forming a combined cache region which at least comprises a directory cache and a data cache in a memory of a home agent, wherein cache lines in the directory cache and the data cache are in one-to-one correspondence; receiving an operation address which is sent by a cache agent, and judging whether the combined cache region is formed and effective according to the operation address; and if so, directly performing corresponding operation on the combined cache region. By the invention, the frequency of memory access is reduced, and data acquisition delay is reduced.
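The request-handling flow in the abstract (receive an operation address, check whether the combined cache region is formed and valid, and if so operate on it directly) can be sketched as below. All identifiers are illustrative assumptions; the region is modeled as a simple dictionary rather than the structure the patent actually uses.

```python
class HomeAgentWithCombinedCache:
    """Illustrative home agent that serves requests from a combined
    cache region when it is formed and valid for the address."""

    def __init__(self, memory):
        self.memory = memory      # address -> data (slow path, assumed dict)
        self.region = {}          # combined region: address -> entry
        self.memory_reads = 0     # counts slow memory accesses

    def handle(self, address):
        entry = self.region.get(address)
        if entry is not None and entry["valid"]:
            # Combined region formed and valid for this address:
            # operate on it directly, with no memory access.
            return entry["data"]
        # Otherwise fall back to memory, then populate the combined
        # region so later requests for this line avoid the slow read.
        self.memory_reads += 1
        data = self.memory[address]
        self.region[address] = {"valid": True, "data": data}
        return data
```

Repeated requests to the same address then hit the combined region, which is how the method reduces memory-access frequency and data-acquisition delay.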

Description

Technical Field

[0001] The present invention relates to the field of computers, and in particular to a memory data buffering method and device.

Background Technique

[0002] Modern advanced computer systems are composed of multiple CPUs. To increase the CPUs' data-access rate and reduce memory-access latency, a Cache is implemented on each CPU; its access speed is generally more than an order of magnitude faster than direct access to memory. Since each CPU implements a Cache, memory at the same address may be cached in different CPUs' Caches. If one CPU writes to that address, the copies in the other CPUs' Caches must be invalidated, and any dirty data in those Caches must be written back to memory. To accomplish this, modern multiprocessor computers implement a cache coherence protocol (Cache Coherence Protocol) on the memory bus, such as the QPI bus on the Intel CPU platform and the HT bus on the AMD CPU platform. Common typical cache c...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F3/06
Inventors: 徐建荣, 姚策, 陈昊
Owner: XFUSION DIGITAL TECH CO LTD