
Cache memory management device and dynamic image system and method using the cache memory management device

A cache memory and dynamic image technology, applied in the field of cache management techniques aimed at reducing cache misses, which addresses the problem of a high cache miss rate.

Active Publication Date: 2018-03-09
MEDIATEK INC

AI Technical Summary

Problems solved by technology

Simulation experiments have shown that, with the current cache image configuration, a high cache miss rate frequently occurs when the decoder side performs the motion compensation procedure over multiple reference pictures.




Embodiment Construction

[0024] First, the term "present invention" used below refers to the inventive concept presented by these embodiments; its scope, however, is not limited to the embodiments themselves.

[0025] One embodiment of the present invention is a cache memory management device; its associated cache memory temporarily stores reference data required for processing a piece of data. The cache memory management device includes an analysis module and a control module. The analysis module generates cache miss analysis information related to the cache memory while the data is processed. The control module determines an index content allocation method for the cache memory according to the cache miss analysis information. A functional block diagram of an application example of the cache memory management device is shown in Figure 3A.

[0026] In the example presented in Figure 3A, the cache memory management device 36 includes an...
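The paragraphs above only name the two modules, and the text is truncated before any internals are given. The C sketch below is therefore purely illustrative: it assumes the cache miss analysis information is a set of per-set miss counters produced by the analysis module, and shows one hypothetical way a control module could choose a different index content allocation (which address bits form the cache index) when misses concentrate in a few sets. All names, sizes, and the imbalance heuristic are assumptions, not the patent's method.

#include <stdint.h>

#define NUM_SETS 256

/* Cache miss analysis information (assumed form): per-set miss counters
 * collected by the analysis module while the data is processed. */
typedef struct {
    uint32_t miss_per_set[NUM_SETS];
    uint32_t total_misses;
} miss_analysis_t;

/* Index content allocation (assumed form): which address bit is the
 * lowest bit of the cache index field. */
typedef struct {
    unsigned index_shift;
} index_alloc_t;

/* Control module (hypothetical policy): if misses pile up in a few sets,
 * move the index field to a different group of address bits so that hot
 * reference data spreads over more sets. */
index_alloc_t control_decide(const miss_analysis_t *a, index_alloc_t cur)
{
    uint32_t max_set = 0;
    for (int i = 0; i < NUM_SETS; i++)
        if (a->miss_per_set[i] > max_set)
            max_set = a->miss_per_set[i];

    uint32_t avg = a->total_misses / NUM_SETS;
    if (avg > 0 && max_set > 4 * avg)   /* strongly unbalanced sets */
        cur.index_shift += 2;           /* try higher address bits  */
    return cur;
}

In an actual device this decision would be made in hardware; the sketch only conveys the analysis-to-control feedback described in paragraph [0025].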



Abstract

The invention provides a signal processing system for dynamic images, which includes a signal processing module, a cache memory, an analysis module, and a control module. The signal processing module performs a signal processing procedure on dynamic image data. The cache memory temporarily stores reference data required by the signal processing procedure when processing the dynamic image data. The analysis module generates cache miss analysis information related to the signal processing procedure and the cache memory. The control module determines an index content allocation method for the cache memory according to the cache miss analysis information.

Description

Technical Field

[0001] The present invention relates to cache management techniques, and in particular to management techniques aimed at reducing cache misses.

Background

[0002] In computer systems, a cache memory temporarily stores a small amount of data that the processor has just used or will use in the near future. Compared with the larger-capacity main memory, the cache memory accesses data faster but is more expensive hardware. Typically, the main memory is implemented with dynamic random access memory (DRAM) and the cache memory with static random access memory (SRAM). When a specific piece of data is needed, the processor first looks for it in the cache memory; if the data cannot be found there, it then looks for it in the main memory.

[0003] The cache memory includes multiple cache lines for storing data fetched from the main memory. Each cache line has its own tag, index...
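To make the background concrete, here is a minimal C sketch of the lookup just described: the address is split into tag, index, and offset fields, the processor first checks the cache line selected by the index, and fetches the line from main memory on a miss. The sizes and names below are illustrative assumptions, not taken from the patent.

#include <stdint.h>
#include <stdbool.h>
#include <string.h>

#define LINE_SIZE   64    /* bytes per cache line          */
#define NUM_LINES   256   /* number of cache lines         */
#define OFFSET_BITS 6     /* log2(LINE_SIZE)               */
#define INDEX_BITS  8     /* log2(NUM_LINES)               */

typedef struct {
    bool     valid;
    uint32_t tag;
    uint8_t  data[LINE_SIZE];
} cache_line_t;

static cache_line_t cache[NUM_LINES];
static uint8_t main_memory[1u << 20];   /* simplified 1 MiB backing store */

/* Read one byte; on a miss, fill the selected cache line from main memory. */
uint8_t cache_read(uint32_t addr, bool *hit)
{
    uint32_t offset = addr & (LINE_SIZE - 1);
    uint32_t index  = (addr >> OFFSET_BITS) & (NUM_LINES - 1);
    uint32_t tag    = addr >> (OFFSET_BITS + INDEX_BITS);
    cache_line_t *line = &cache[index];

    *hit = line->valid && line->tag == tag;
    if (!*hit) {
        /* Miss: fetch the whole aligned line from main memory. */
        memcpy(line->data,
               &main_memory[addr & ~(uint32_t)(LINE_SIZE - 1)],
               LINE_SIZE);
        line->tag   = tag;
        line->valid = true;
    }
    return line->data[offset];
}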

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): G06F3/06, G06F12/0871, G06F12/0866
Inventor: 林和源
Owner: MEDIATEK INC