
Method and device of pre-fetching data of compiler

A compiler data prefetching technology, applied in the field of data processing, which addresses problems such as the lack of coordinated software-hardware support, software-algorithm overhead that degrades overall performance, and an SPM that cannot provide the basic functions of a Cache, so as to ensure flexibility while remaining easy to implement.

Active Publication Date: 2013-03-20
JIANGNAN INST OF COMPUTING TECH

AI Technical Summary

Problems solved by technology

[0004] However, the above-mentioned prior art lacks coordinated support from both software and hardware, and the overhead of its software algorithm has a large impact on overall performance. On the hardware side, the SPM cannot provide some of the basic functions of a Cache, which places a heavy burden on programmers.

Method used




Embodiment Construction

[0021] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings of those embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention shall fall within the protection scope of the present invention.

[0022] Please refer to Figure 1, which is a schematic flowchart of the compiler data prefetching method provided by an embodiment of the present invention. The method mainly includes step S101, step S102, and step S103, specifically:

[0023] S101. Provide a hardware instruction to query the local storage space allocated in the software-managed on-chip memory (SPM), where the hardware instruction includes the main-memory address of the data.

[0024] On the software-managed on-chip memory (Scratch Pad Memory, ...
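The text available here does not give the concrete encoding of the query instruction in step S101 or the layout of the SPM local storage space. The following C sketch is only an illustrative model of that lookup, under the assumption of a direct-mapped, tag-checked table of SPM lines; the names spm_query, spm_line_t, SPM_LINES, and SPM_LINE_SIZE are hypothetical and not taken from the patent.

#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical model of the SPM local storage space: a small,
 * software-managed table of lines tagged with the main-memory
 * address they hold. The real hardware query instruction would
 * perform this lookup directly. */
#define SPM_LINES     64
#define SPM_LINE_SIZE 64   /* bytes per line (assumed) */

typedef struct {
    bool     valid;
    uint64_t tag;                      /* main-memory address of the line */
    uint8_t  data[SPM_LINE_SIZE];
} spm_line_t;

static spm_line_t spm_local[SPM_LINES];

/* Sketch of the query step (S101): given a main-memory address, report
 * whether the corresponding data already resides in the SPM local space. */
static bool spm_query(uint64_t main_addr, spm_line_t **hit_line)
{
    uint64_t line_addr = main_addr & ~(uint64_t)(SPM_LINE_SIZE - 1);
    size_t   idx       = (size_t)((line_addr / SPM_LINE_SIZE) % SPM_LINES);  /* direct-mapped (assumed) */

    if (spm_local[idx].valid && spm_local[idx].tag == line_addr) {
        *hit_line = &spm_local[idx];
        return true;    /* hit: data can be read from the SPM */
    }
    return false;       /* miss: caller triggers the prefetch into the SPM */
}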



Abstract

The invention provides a method and a device for compiler data prefetching, so as to achieve efficient memory-access optimization. The method includes: providing a hardware instruction to query the local storage space allocated in the software-managed on-chip scratch pad memory (SPM), where the hardware instruction includes the main-memory address of the data; if the data corresponding to the main-memory address can be found in the SPM local storage space, reading that data; and if the data corresponding to the main-memory address cannot be found in the SPM local storage space, performing miss handling according to the hardware instruction so as to prefetch the data into the SPM local storage space. The method makes full use of the efficiency of hardware and the flexibility of software, achieving efficient memory-access optimization at a small hardware cost. On this basis, a compiler data prefetching method is realized that combines hardware support with a simplified prefetching algorithm and rests on a software-hardware cooperative management mechanism. It outperforms a pure software algorithm in performance while preserving the flexibility of a software algorithm.
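As a rough illustration of the three steps summarized above (query, read on a hit, miss handling with prefetch), the following C sketch models the control flow a compiler could emit before a load. The helpers spm_query, spm_read, and spm_prefetch are toy stand-ins for the hardware instructions described in the abstract, and the one-entry "SPM" exists purely so the example runs; none of these names come from the patent.

#include <stdint.h>
#include <stdio.h>

/* Toy stand-ins for the hardware operations named in the abstract.
 * A real system would use the dedicated instructions; these stubs
 * only model the hit/miss control flow with a single cached word. */
static uint64_t spm_tag = UINT64_MAX;     /* main-memory address of the cached word */
static uint64_t spm_word;                 /* its copy in the SPM local storage space */

static int      spm_query(uint64_t addr) { return spm_tag == addr; }
static uint64_t spm_read(void)           { return spm_word; }
static void     spm_prefetch(uint64_t addr, const uint64_t *main_mem, uint64_t base)
{
    spm_tag  = addr;
    spm_word = main_mem[(addr - base) / sizeof(uint64_t)];
}

/* Shape of the access sequence a compiler might emit before a load. */
static uint64_t load_via_spm(uint64_t addr, const uint64_t *main_mem, uint64_t base)
{
    if (!spm_query(addr))                       /* miss: bring the data into the SPM */
        spm_prefetch(addr, main_mem, base);
    return spm_read();                          /* hit (or just filled): read from the SPM */
}

int main(void)
{
    uint64_t main_mem[4] = {10, 20, 30, 40};
    uint64_t base = 0x1000;
    printf("%llu\n", (unsigned long long)load_via_spm(base + 8, main_mem, base));  /* miss, then read: 20 */
    printf("%llu\n", (unsigned long long)load_via_spm(base + 8, main_mem, base));  /* hit: 20 */
    return 0;
}

In this toy run the first access misses and fills the simulated SPM, and the second access hits; both print 20.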

Description

Technical field

[0001] The invention relates to the field of data processing, and in particular to a compiler data prefetching method and device.

Background technique

[0002] Because of the complex hardware logic of a cache (Cache), the Cache structure cannot be accommodated in some multi-core architectures, and it is replaced by a software-managed on-chip memory (Scratch Pad Memory, SPM) with a simple hardware structure, a small area, and lower power consumption. While the SPM brings high performance and low power consumption, it also places higher requirements and challenges on the programming model and compiler optimization. A Cache maintains data consistency automatically in hardware, whereas an SPM requires software to maintain consistency and to manage data movement between the different levels of the storage system. On the premise of ensuring program correctness, one must make full use of data locality and the local memory space, and develop computation and memory access in p...
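The background contrasts a hardware-managed Cache with an SPM in which software must move data explicitly. The following C sketch shows the copy-in / compute / copy-out pattern that such explicit SPM management typically implies; spm_buf and the use of memcpy are stand-ins for whatever DMA or local-store copy mechanism a real SPM system provides, and are assumptions for illustration rather than anything specified in the patent.

#include <stddef.h>
#include <string.h>

#define TILE 256

/* With a Cache, the loop body would simply access src/dst and the hardware
 * would keep copies consistent. With an SPM, software stages each tile into
 * a local buffer, computes on it, and writes the result back explicitly. */
static float spm_buf[TILE];   /* assumed to live in the SPM address range */

void scale_array(float *dst, const float *src, size_t n, float k)
{
    for (size_t off = 0; off < n; off += TILE) {
        size_t len = (n - off < TILE) ? (n - off) : TILE;

        memcpy(spm_buf, src + off, len * sizeof(float));   /* copy in (prefetch) */
        for (size_t i = 0; i < len; i++)                   /* compute on the local copy */
            spm_buf[i] *= k;
        memcpy(dst + off, spm_buf, len * sizeof(float));   /* copy out (write back) */
    }
}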

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/45; G06F12/08; G06F12/0862
Inventor: 漆锋滨, 肖谦, 沈莉, 姜军, 王超
Owner: JIANGNAN INST OF COMPUTING TECH