
405 results about "File caching" patented technology

File Caching. By default, Windows caches file data that is read from disks and written to disks. This implies that read operations read file data from an area in system memory known as the system file cache, rather than from the physical disk.
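The behavior described above can be observed from user space: once a file has been read, later reads are normally served from the system file cache rather than the physical disk. The following Python sketch (the file name is a placeholder; substitute any large local file) times two consecutive reads to illustrate the difference.

```python
import time

def timed_read(path):
    """Read a file fully and return the elapsed wall-clock time."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(1024 * 1024):  # read in 1 MiB chunks until EOF
            pass
    return time.perf_counter() - start

if __name__ == "__main__":
    path = "large_sample.bin"  # hypothetical file; substitute any large local file
    first = timed_read(path)   # may hit the disk if the file is not yet cached
    second = timed_read(path)  # typically served from the system file cache
    print(f"first read:  {first:.3f} s")
    print(f"second read: {second:.3f} s (usually faster thanks to the file cache)")
```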

Selective loading of client operating system in a computer network

A client station on a computer network uses an operating system such as JavaOS that is permanently stored at the server rather than on storage media at the client location; JavaOS is loaded and installed at the client upon bootup. Once the basic system is booted using local firmware and the base file systems on the network are enabled, an application can begin running. When the application needs a particular class file, a request is made through the file system and routed to a generic file system driver on the client, which determines from a set of configured information where that class exists; it uses the file systems available to that booted client, whether NFS or TFTP, to determine where the server is and how to retrieve the class file. The driver forces that operation to occur, and the class file is retrieved and cached locally on the client for use by the application. To avoid loading unneeded or lesser-used parts of JavaOS from the server into client memory at boot time, groups of classes are broken out of the monolithic JavaOS image as part of the Java service loader (JSL) model. JSL-provided packages allow a URL prefix to be supplied as part of a package's configuration information. When a class method or data item is requested by the loader via the filesystem and is not already present, the URL prefix is used to lazily retrieve that file and cache it in memory (a sketch of this miss path follows below). This allows classes and data files to be delivered as needed and significantly reduces the amount of data the client must retrieve by TFTP at boot time.
Owner:IBM CORP
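The lazy retrieval path in the abstract above boils down to a per-package URL prefix consulted on a cache miss. Below is a minimal Python sketch of that flow; the class names, the package-to-prefix mapping, and the fetch callable (standing in for the NFS or TFTP transport) are illustrative assumptions, not the patent's actual implementation.

```python
class LazyClassCache:
    """Illustrative sketch: resolve a class file through a configured URL
    prefix on a cache miss, then keep it in client memory for later requests."""

    def __init__(self, package_prefixes, fetch):
        # package_prefixes maps package name -> URL prefix, e.g.
        # {"com.example.app": "tftp://bootserver/javaos/app/"}  (hypothetical)
        # fetch is a callable url -> bytes; NFS/TFTP details are abstracted away.
        self.package_prefixes = package_prefixes
        self.fetch = fetch
        self.cache = {}  # fully qualified class name -> class file bytes

    def load(self, class_name):
        if class_name in self.cache:               # already retrieved and cached locally
            return self.cache[class_name]
        package, _, simple_name = class_name.rpartition(".")
        prefix = self.package_prefixes[package]    # configured per-package URL prefix
        url = prefix + simple_name + ".class"
        data = self.fetch(url)                     # lazily retrieve only on first use
        self.cache[class_name] = data              # keep in memory for the application
        return data
```

A loader built this way only pulls a class across the network the first time the application touches it, which is the boot-time reduction the abstract claims.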

Spatial data double cache method and mechanism based on key value structure

The invention discloses a spatial data double-cache method and mechanism based on a key-value structure, and belongs to the technical field of spatial data storage and management. The method provides a two-level cache consisting of a memory cache and a file cache. The memory cache is the first level: it organizes data with a B+ tree and is written to the file cache asynchronously through a write-back mechanism. The file cache is the second level: it is built on large files, maintains a B+ tree cache index to accelerate lookups, and manages its free space with a B+ tree based free-space manager (see the sketch below). The method offers a flexible key-value storage format, fast lookups, and high concurrency, improving the storage and access efficiency of cached spatial data in a network environment. It can be used to cache generic spatial data such as remote-sensing images, vector data, and digital elevation models (DEM) in a network geographic information system (GIS).
Owner:WUHAN UNIV
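A much simplified Python sketch of the two-level idea follows; the B+ tree index and free-space management are replaced by an LRU dict and one file per key, so the class name, parameters, and structure are illustrative only.

```python
import os
import threading
from collections import OrderedDict

class TwoLevelSpatialCache:
    """Illustrative two-level cache: a bounded in-memory first level with
    asynchronous write-back into a file-based second level."""

    def __init__(self, cache_dir, mem_capacity=128):
        self.cache_dir = cache_dir
        os.makedirs(cache_dir, exist_ok=True)
        self.mem = OrderedDict()        # first-level cache, kept in LRU order
        self.mem_capacity = mem_capacity
        self.lock = threading.Lock()

    def _path(self, key):
        return os.path.join(self.cache_dir, f"{key}.bin")

    def put(self, key, value):
        # value is expected to be bytes (e.g. a serialized tile or feature block)
        evicted = []
        with self.lock:
            self.mem[key] = value
            self.mem.move_to_end(key)
            while len(self.mem) > self.mem_capacity:
                evicted.append(self.mem.popitem(last=False))
        if evicted:
            # write-back on eviction, off the caller's path (asynchronous mode)
            threading.Thread(target=self._write_back, args=(evicted,), daemon=True).start()

    def _write_back(self, items):
        for k, v in items:
            with open(self._path(k), "wb") as f:
                f.write(v)              # second-level file cache

    def get(self, key):
        with self.lock:
            if key in self.mem:         # first-level hit
                self.mem.move_to_end(key)
                return self.mem[key]
        path = self._path(key)
        if os.path.exists(path):        # second-level hit
            with open(path, "rb") as f:
                return f.read()
        return None                     # miss: caller must fetch from the origin
```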

Base station caching method based on minimized user delay in edge caching network

The invention discloses a base station caching method that minimizes user delay in an edge caching network, and belongs to the field of wireless mobile communication. In an edge caching network scenario, a connection matrix between users and base stations is first established, and a strategy by which the base stations cache files is generated, giving a relationship matrix between base stations and files. The average hit rate at which all users in the network obtain files from the base stations is then computed. A constraint is set so that the average network user delay is minimized while a reasonable threshold on the average user hit rate is satisfied, and a traversal (exhaustive search) method is used to find the optimal base station content placement strategy (a sketch of this search appears below). Small base stations are deployed according to that strategy, and users finally connect to the small base stations to obtain files. The method explicitly considers the trade-off between average delay and average hit rate when designing the file caching strategy of small base stations in an edge caching network, thereby minimizing the average user download delay while satisfying a given average user hit rate.
Owner:BEIJING UNIV OF POSTS & TELECOMM
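The optimization itself can be written down compactly. The following Python sketch enumerates cache placements by brute force and keeps the one with the lowest average delay among those meeting a hit-rate threshold; all parameter names and the simple two-value delay model are assumptions for illustration, and the traversal is exponential, so it is only workable for tiny instances.

```python
import itertools

def best_placement(conn, popularity, bs_delay, backhaul_delay, cache_size, hit_rate_min):
    """Exhaustive (traversal) search sketch of the strategy above.
    Illustrative inputs:
      conn[u][b]      -- 1 if user u can reach base station b, else 0
      popularity[f]   -- request probability of file f (sums to 1)
      bs_delay        -- delay when a file is served from a reachable base station
      backhaul_delay  -- delay when the file must come from the core network
      cache_size      -- number of files each base station can hold
      hit_rate_min    -- required average user hit rate
    Returns (average delay, average hit rate, placement) of the best feasible plan."""
    n_users, n_bs = len(conn), len(conn[0])
    n_files = len(popularity)
    per_bs_choices = list(itertools.combinations(range(n_files), cache_size))
    best = None
    # enumerate every possible cache content for every base station
    for placement in itertools.product(per_bs_choices, repeat=n_bs):
        hit = delay = 0.0
        for u in range(n_users):
            for f in range(n_files):
                served = any(conn[u][b] and f in placement[b] for b in range(n_bs))
                hit += popularity[f] * served
                delay += popularity[f] * (bs_delay if served else backhaul_delay)
        hit /= n_users
        delay /= n_users
        # keep the lowest-delay placement that still satisfies the hit-rate threshold
        if hit >= hit_rate_min and (best is None or delay < best[0]):
            best = (delay, hit, placement)
    return best
```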

Transmission control method and player of online streaming media

The invention relates to a transmission control method for online streaming media, comprising the following steps: check whether the data cached in a local buffer after the current playing point of the streaming media file fully occupies the non-speed-limited download area; if so, download the streaming media file from the network side in a low-speed CDN (content distribution network) mode while playing it; otherwise, continue downloading the streaming media file from the network side in the non-speed-limited CDN mode (a sketch of this switching rule appears below). The non-speed-limited download area is the data storage area between the starting storage position and a preset non-speed-limit critical line, and the low-speed mode is a download mode whose speed is no higher than the bit rate of the streaming media file. The invention further relates to a player for online streaming media. The method and player can switch the transmission speed and mode according to how the terminal is receiving streaming media data, saving system resources and network bandwidth on both the terminal and server sides while ensuring smooth transmission and playback of the streaming media file.
Owner:CHINA TELECOM CORP LTD
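The switching rule in the abstract reduces to a comparison between the amount of data buffered ahead of the playing point and the size of the non-speed-limited download area. A minimal Python sketch follows; the byte-offset parameters and mode names are illustrative assumptions.

```python
def choose_download_mode(buffered_end, play_point, start_pos, no_limit_line):
    """All arguments are byte offsets within the streaming media file.
    If the data buffered after the current playing point already fills the
    non-speed-limited download area (start_pos .. no_limit_line), switch to
    the low-speed CDN mode, whose rate is capped at the media bit rate;
    otherwise keep downloading in the non-speed-limited CDN mode."""
    data_ahead = buffered_end - play_point
    no_limit_area = no_limit_line - start_pos
    return "cdn_low_speed" if data_ahead >= no_limit_area else "cdn_no_limit"

# Example: 8 MB buffered beyond the playing point against a 5 MB area
# -> the player drops to the low-speed mode to save bandwidth.
print(choose_download_mode(buffered_end=18_000_000, play_point=10_000_000,
                           start_pos=0, no_limit_line=5_000_000))
```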