429 results about "Cache access" patented technology

On-chip network system supporting cache coherence and data request method

The invention discloses an on-chip network system supporting cache coherence. The system comprises a network interface and a router, wherein the network interface is connected to the router, a multi-core processor and a second-level cache. A coherence state cache connected to the multi-core processor is added to the network interface and is used to store and maintain the coherence state of data blocks held in the first-level caches of the multi-core processor. An active directory cache connected to the second-level cache is also added to the network interface and is used to cache and maintain the directory information of data blocks frequently accessed by the first-level caches. Coherence maintenance is thereby offloaded from the processor, directory maintenance is offloaded from the second-level cache, and the directory structure in the second-level cache is eliminated, which simplifies the design and verification of the multi-core processor, reduces on-chip storage cost, and improves the performance of the multi-core processor. The invention also discloses a data request method for the system.
Owner:TSINGHUA UNIV
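
To make the architecture concrete, below is a minimal C++ sketch of the two structures the abstract attributes to the network interface: a coherence-state cache tracking the state of L1 data blocks and an active directory cache recording sharers of frequently accessed blocks. The class and member names, the MESI-like state set, and the request-handling flow are illustrative assumptions, not the patent's actual design.

```cpp
// Illustrative sketch only: names, state set and flow are assumptions.
#include <cstdint>
#include <unordered_map>

enum class CoherenceState { Invalid, Shared, Exclusive, Modified };  // assumed MESI-like states

struct NetworkInterface {
    // Coherence-state cache: per-block state for the attached core's L1.
    std::unordered_map<uint64_t, CoherenceState> coherenceStateCache;
    // Active directory cache: per-block bitmask of cores holding the block.
    std::unordered_map<uint64_t, uint32_t> activeDirectoryCache;

    // Handle a read request from the local core for block `addr`.
    // Returns true if the request can be satisfied without involving the L2.
    bool handleReadRequest(uint64_t addr, unsigned coreId) {
        auto it = coherenceStateCache.find(addr);
        if (it != coherenceStateCache.end() && it->second != CoherenceState::Invalid) {
            return true;  // L1 already holds a valid copy; no network traffic needed
        }
        // Record the requester in the directory and mark the block Shared;
        // the actual data would be fetched from the L2 over the router.
        activeDirectoryCache[addr] |= (1u << coreId);
        coherenceStateCache[addr] = CoherenceState::Shared;
        return false;
    }
};

int main() {
    NetworkInterface nif;
    nif.handleReadRequest(0x1000, 0);  // miss: directory updated, L2 consulted
    nif.handleReadRequest(0x1000, 0);  // hit in the coherence-state cache
}
```

The key design point the abstract claims is that both maps live in the network interface rather than in the processor or the L2, so coherence and directory bookkeeping happen off the critical paths of those components.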

Handling of hard errors in a cache of a data processing apparatus

A data processing apparatus and method are provided for handling hard errors occurring in a cache of the data processing apparatus. The cache storage comprises data storage having a plurality of cache lines for storing data values, and address storage having a plurality of entries, each entry identifying an address indication value for an associated cache line and having associated error data. In response to an access request, a lookup procedure determines, with reference to the address indication value held in at least one entry of the address storage, whether a hit condition exists in one of the cache lines. Error detection circuitry further determines, with reference to the error data associated with that entry, whether an error condition exists for the entry. Additionally, cache location avoid storage is provided having at least one record, each record being used to store a cache line identifier identifying a specific cache line. On detection of the error condition, one of the records in the cache location avoid storage is allocated to store the cache line identifier of the specific cache line associated with the entry for which the error condition was detected. The error detection circuitry then causes a clean and invalidate operation to be performed on that cache line, and the access request is re-performed. The cache access circuitry excludes any cache line identified in the cache location avoid storage from the lookup procedure. This provides a simple and effective mechanism for handling hard errors that manifest themselves within a cache during use, ensuring correct operation of the cache in their presence. Moreover, the technique can be employed not only with write-through caches but also with write-back caches, providing a very flexible solution.
Owner:ARM LTD
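
The error-handling flow lends itself to a short illustration. The C++ sketch below models the avoid-storage mechanism under stated assumptions: a direct-mapped tag array whose entries carry a simple error flag standing in for the per-entry error data, an avoid set holding faulty line identifiers, and a retry after clean-and-invalidate. All names and the simplified error check are hypothetical, not ARM's actual circuitry.

```cpp
// Illustrative sketch only: direct-mapped layout and error flag are assumptions.
#include <cstdint>
#include <unordered_set>
#include <vector>

struct TagEntry {
    uint64_t tag = 0;
    bool valid = false;
    bool errorDetected = false;  // stands in for the entry's ECC/parity error data
};

struct Cache {
    std::vector<TagEntry> tagArray = std::vector<TagEntry>(1024);  // address storage
    std::unordered_set<size_t> avoidStorage;  // cache location avoid storage (line IDs)

    size_t indexOf(uint64_t addr) const { return (addr >> 6) % tagArray.size(); }

    // Install a block, e.g. after fetching it from a lower cache level.
    void install(uint64_t addr) {
        size_t line = indexOf(addr);
        if (avoidStorage.count(line)) return;  // never refill a known-bad line
        tagArray[line] = {addr >> 6, true, false};
    }

    // Returns true on a hit; on a detected hard error the line is recorded in
    // the avoid storage, cleaned and invalidated, and the access is retried.
    bool access(uint64_t addr) {
        size_t line = indexOf(addr);
        if (avoidStorage.count(line)) {
            return false;                // excluded from the lookup procedure entirely
        }
        TagEntry& e = tagArray[line];
        if (e.errorDetected) {
            avoidStorage.insert(line);   // allocate a record for the faulty line
            e.valid = false;             // clean-and-invalidate (write-back elided)
            e.errorDetected = false;
            return access(addr);         // re-perform the access request
        }
        return e.valid && e.tag == (addr >> 6);
    }
};

int main() {
    Cache c;
    c.install(0x2000);
    c.access(0x2000);                                   // normal hit
    c.tagArray[c.indexOf(0x2000)].errorDetected = true; // inject a hard error
    c.access(0x2000);                                   // line avoided, access retried as a miss
}
```

After the faulty line is recorded, every later lookup skips it, so a permanent fault in one line degrades capacity slightly instead of corrupting data.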

C-RAN based internet content caching and preloading method and system

The invention discloses a C-RAN based internet content caching and preloading method. The method comprises the following steps: according to the classification of the requested resource, the corresponding cache directory is queried; if the requested content is stored in the local cache directory, the content is returned directly to the user; if it is not, the cache directory of a cloud platform server is queried; if that query also fails, the resource is obtained from the internet and returned to the user. Popularity statistics are collected separately for cached and un-cached accessed content; from the user's access records, the user's preferences are derived and the user tag composition ratio is analyzed; when the popularity value reaches its upper limit or the user tag composition ratio exceeds a threshold, a caching and preloading unit is triggered and a preloading cache request is issued to a background server. Aimed at future mobile communication development, the disclosed method achieves optimal content cache placement across the internet, increases storage utilization, effectively saves inter-network settlement costs, and improves the user experience.
Owner:CHONGQING UNIV OF POSTS & TELECOMM
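
The request flow described above can be summarized in a small sketch. The C++ code below, with assumed class names, thresholds, and stub fetch functions, walks the local cache directory, the cloud platform directory, and the internet origin in that order, keeps popularity counters for every access, and triggers preloading when a counter reaches its limit; the user-tag-ratio trigger is omitted for brevity.

```cpp
// Illustrative sketch only: names, thresholds and stubs are assumptions.
#include <string>
#include <unordered_map>
#include <unordered_set>

struct EdgeCache {
    std::unordered_set<std::string> localDirectory;   // content cached at the edge (C-RAN side)
    std::unordered_set<std::string> cloudDirectory;   // content cached at the cloud platform
    std::unordered_map<std::string, int> popularity;  // counts for cached and un-cached content
    int popularityLimit = 100;                        // upper limit that triggers preloading

    std::string readLocal(const std::string& url)        { return "local:"  + url; }
    std::string readCloud(const std::string& url)        { return "cloud:"  + url; }
    std::string fetchFromInternet(const std::string& url){ return "origin:" + url; }

    void preload(const std::string& url) {
        // In the patent this is a request to a background server; here we
        // simply install the content in the local directory.
        localDirectory.insert(url);
    }

    std::string handleRequest(const std::string& url) {
        ++popularity[url];  // popularity statistics for cached and un-cached content alike
        std::string content;
        if (localDirectory.count(url)) {
            content = readLocal(url);           // hit in the local cache directory
        } else if (cloudDirectory.count(url)) {
            content = readCloud(url);           // hit in the cloud platform's cache directory
        } else {
            content = fetchFromInternet(url);   // both directories missed: fetch from origin
        }
        // When the popularity value reaches its upper limit, trigger the
        // caching-and-preloading unit (the user-tag-ratio trigger is omitted here).
        if (popularity[url] >= popularityLimit && !localDirectory.count(url)) {
            preload(url);
        }
        return content;
    }
};

int main() {
    EdgeCache cache;
    cache.popularityLimit = 2;          // low limit so the example triggers preloading
    cache.handleRequest("news/top");    // origin fetch, popularity 1
    cache.handleRequest("news/top");    // popularity 2: preloading triggered
    cache.handleRequest("news/top");    // now served from the local directory
}
```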