Method for accessing data cache and processor

A processor and cache technology, applied in the field of computing, that addresses problems such as failure to account for data shared between threads, excessive cache space consumption, and unsuitable cache isolation.

Active Publication Date: 2014-12-31
HUAWEI TECH CO LTD +1
Cites: 5, Cited by: 31

AI Technical Summary

Problems solved by technology

However, this division method does not take into account the possibility of data shared between threads. If threads share data but there is no shared cache, the shared data will have multiple copies in the private caches, which consumes more cache space and also requires maintaining coherence among those copies. Another approach achieves cache partitioning through page coloring, but this method limits the physical memory space each thread can use; it achieves good cache isolation for multiple independent processes, yet in multi-threaded programs for streaming-data applications there is a large amount of data shared between threads, so complete isolation is not appropriate.
That is to say, page coloring is suitable for cache isolation between processes, but not for cache isolation between multiple threads of the same process.
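
To illustrate why page coloring constrains each thread's physical memory, here is a minimal sketch (my own illustration, not taken from the patent) of how a page's "color" is derived from the physical address, under assumed cache parameters. Pages of different colors map to disjoint cache sets, so partitioning colors among processes partitions the cache, but it also restricts which physical pages each process may be allocated.

    #include <stdint.h>
    #include <stdio.h>

    /* Assumed, illustrative parameters - not taken from the patent:
     * a 2 MB, 16-way shared cache with 64-byte lines and 4 KB pages. */
    #define CACHE_SIZE  (2u * 1024u * 1024u)
    #define WAYS        16u
    #define LINE_SIZE   64u
    #define PAGE_SHIFT  12u                                  /* 4 KB pages */
    #define NUM_SETS    (CACHE_SIZE / (WAYS * LINE_SIZE))    /* 2048 sets  */
    #define NUM_COLORS  (NUM_SETS * LINE_SIZE >> PAGE_SHIFT) /* 32 colors  */

    /* The page "color" is the overlap between the physical page number and
     * the cache set index; with these parameters it is address bits 12..16. */
    static unsigned page_color(uint64_t paddr)
    {
        return (unsigned)((paddr >> PAGE_SHIFT) % NUM_COLORS);
    }

    int main(void)
    {
        /* Pages of different colors occupy disjoint cache sets, so giving
         * each process its own colors partitions the cache - but it also
         * restricts which physical pages that process may be given. */
        printf("color of page 0x00000000: %u\n", page_color(0x00000000ull));
        printf("color of page 0x00001000: %u\n", page_color(0x00001000ull));
        return 0;
    }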

Method used

Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0029] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are merely some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0030] An embodiment of the present invention provides a processor 01 which, as shown in Figure 1, includes a program counter 011, a register file 012, an instruction prefetching unit 013, an instruction decoding unit 014, an instruction issuing unit 015, an address generation unit 016, an arithmetic logic unit 017, a shared floating point unit 018, a shared instruction cache 019 and an internal bus, and further includes:

[0031] Data cache 021, the data cache ...
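
To make the organization above concrete, here is a minimal sketch (my own illustration, with assumed sizes and names, not figures from the patent) of a first-level data cache split into per-thread private data caches plus one shared data cache, attached to the other processor units listed above.

    #include <stdint.h>

    /* Assumed, illustrative sizes and names - not figures from the patent. */
    #define NUM_THREADS           4
    #define LINE_SIZE             64
    #define PRIVATE_DCACHE_LINES  512    /* per-thread private data cache    */
    #define SHARED_DCACHE_LINES   1024   /* cache for data shared by threads */

    typedef struct {
        uint64_t tag;
        uint8_t  valid;
        uint8_t  data[LINE_SIZE];
    } cache_line_t;

    typedef struct { cache_line_t lines[PRIVATE_DCACHE_LINES]; } private_dcache_t;
    typedef struct { cache_line_t lines[SHARED_DCACHE_LINES];  } shared_dcache_t;

    typedef struct {
        /* program counter, register file, prefetch/decode/issue units, address
         * generation unit, ALU, shared FPU, shared instruction cache, bus ... */
        private_dcache_t private_dcache[NUM_THREADS]; /* private data of each thread */
        shared_dcache_t  shared_dcache;               /* data shared between threads */
    } processor_t;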

Abstract

The embodiment of the invention provides a method for accessing a data cache, and a processor, and relates to the field of computers. The method and the processor narrow the range of data searches, reduce access delay and improve system performance. The data cache of the processor is a first-level cache comprising a private data cache and a shared data cache; the private data cache comprises a plurality of private caches and is used for storing the private data of threads, while the shared data cache is used for storing data shared among the threads. When data in the data cache of the processor is accessed, the data type of the data is determined according to an additional flag, corresponding to the data, in the physical address, the data types comprising private data and shared data; the thread corresponding to the data is determined according to the accessed data, and the data cache corresponding to that thread and data type is then accessed, so that the data in that data cache is obtained. The embodiment of the invention is used for distinguishing between data caches and accessing them.
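
Below is a minimal sketch of the access path the abstract describes, under assumptions of my own: the "additional flag" is modelled as one extra bit carried with the physical address, and the helper functions (private_dcache_of, shared_dcache_of_processor, cache_lookup) are hypothetical names for illustration only, not the patent's.

    #include <stdbool.h>
    #include <stdint.h>

    /* The "additional flag" in the physical address is modelled here as one
     * extra bit; its position and the helpers below are assumptions. */
    #define SHARED_FLAG (1ull << 63)

    typedef struct cache cache_t;                       /* opaque cache handle */
    extern cache_t *private_dcache_of(int thread_id);   /* hypothetical helper */
    extern cache_t *shared_dcache_of_processor(void);   /* hypothetical helper */
    extern bool cache_lookup(cache_t *c, uint64_t paddr, void *line_out);

    /* Route the access: shared data goes to the shared data cache, private
     * data goes to the private data cache of the thread that issued the
     * access, so only one cache has to be searched. */
    bool dcache_access(int thread_id, uint64_t tagged_paddr, void *line_out)
    {
        uint64_t paddr  = tagged_paddr & ~SHARED_FLAG;
        cache_t *target = (tagged_paddr & SHARED_FLAG)
                            ? shared_dcache_of_processor()
                            : private_dcache_of(thread_id);
        return cache_lookup(target, paddr, line_out);
    }

Because the flag resolves the data type before any lookup, only the relevant cache is searched, which is how the search range is narrowed and access delay reduced.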

Description

Technical field
[0001] The invention relates to the field of computers, and in particular to a method for accessing a data cache, and a processor.
Background technique
[0002] Since processors entered the multi-core era, memory access has been the bottleneck of system performance: the growth of memory system performance lags seriously behind the growth of processor performance, and memory access speed severely limits computing speed. Current multi-core caches (buffer memories) usually adopt a multi-level hierarchy in which the L1 cache is a private cache and the other levels are shared caches.
[0003] Multi-core processors provide greater parallel computing capability and can run multiple program loads at the same time, but programs running simultaneously interfere with each other's performance through the shared cache, mainly because the cores contend for program data held in the shared cache. Program performance is affected because...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50, G06F12/08, G06F12/0842
CPC: G06F12/0842
Inventor: 徐远超, 范东睿, 张浩, 叶笑春
Owner: HUAWEI TECH CO LTD