
Page cache method and device

A page caching technology, applied in the field of page caching methods and devices, which solves problems such as the unpredictable destruction time of cached pages, limited cache resources, and impaired page switching speed, and achieves efficient utilization of cache resources, a higher average page loading speed, faster page switching, and an improved user experience.

Active Publication Date: 2016-04-20
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

However, this method cannot predict when a cached page will be destroyed; when the user needs to switch to a certain page, that page may already have been destroyed by the operating system.
The second method caches pages in a first-in-first-out (FIFO) queue. Because cache resources are limited, this method still takes a long time to render complex pages, which seriously affects page switching speed.



Examples


Embodiment 1

[0022] Figure 1 is a flowchart of a page caching method provided in Embodiment 1 of the present invention. This embodiment is applicable to caching application pages on a mobile terminal. The method may be executed by a page caching device, which may be implemented in hardware and/or software. The method specifically includes the following operations:

[0023] S110: Acquire historical access status data of the page currently accessed by the terminal.

[0024] The historical access status data of the page currently accessed by the terminal reflects the user's page access habits, and also reflects differences between users' habits. Applications on existing terminals offer more and more functions, but a given user may only use a few of them. For example, in a map application, users who travel frequently may use hotel and route pages more often; users who own cars may use navigation and route pages more often; and some users may often open public transportation pages. ...
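As a rough sketch of how such historical access status data could be recorded on the terminal (a Java illustration; the names PageAccessHistory, Record and recordAccess are assumptions for this example, not identifiers from the patent), each page can keep an access count and a running average rendering time:

```java
import java.util.HashMap;
import java.util.Map;

/** Hypothetical store of per-page historical access status data (sketch only). */
class PageAccessHistory {
    static class Record {
        int accessCount;        // access frequency of the page (An in Embodiment 3)
        double avgRenderMillis; // average rendering time of the page (Tn); milliseconds assumed

        void update(double renderMillis) {
            // Fold the new measurement into the running average, then count the access.
            avgRenderMillis = (avgRenderMillis * accessCount + renderMillis) / (accessCount + 1);
            accessCount++;
        }
    }

    private final Map<String, Record> records = new HashMap<>();

    /** Called each time a page finishes rendering on the terminal. */
    void recordAccess(String pageId, double renderMillis) {
        records.computeIfAbsent(pageId, id -> new Record()).update(renderMillis);
    }

    Record get(String pageId) {
        return records.get(pageId);
    }
}
```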

Embodiment 2

[0037] Figure 2 is a flowchart of a page caching method provided in Embodiment 2 of the present invention. As shown in Figure 2, the method includes:

[0038] S210: Acquire historical access status data of the page currently accessed by the terminal.

[0039] S220. Calculate the priority value of the currently accessed page according to the historical access state data of the currently accessed page.

[0040] S230. When the free cache resources of the cache area are smaller than the resources required by the currently accessed page, compare the priority value of the currently accessed page with the priority values of the cached pages in the cache area, and release cached pages according to the comparison result. Preferably, at least one cached page whose priority value is smaller than that of the currently accessed page is released from the cache area, and the currently accessed page is added to the cache area.

[0041] When the free cache resources of the cache area a...
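Below is a minimal sketch of the eviction step described in S230, under the assumption that each cached page carries its priority value and the cache resources (here, bytes) it occupies; lower-priority pages are released first until the currently accessed page fits. The names PageCache and CachedPage are invented for this illustration and are not the patent's implementation.

```java
import java.util.Comparator;
import java.util.PriorityQueue;

/** Sketch of the S230 eviction logic; assumed names, not the patent's actual code. */
class PageCache {
    static class CachedPage {
        final String pageId;
        final double priority;  // priority value of the cached page
        final long sizeBytes;   // cache resources the page occupies (bytes assumed)

        CachedPage(String pageId, double priority, long sizeBytes) {
            this.pageId = pageId;
            this.priority = priority;
            this.sizeBytes = sizeBytes;
        }
    }

    private long freeBytes;
    // Lowest-priority cached pages come out of the queue first, so they are released first.
    private final PriorityQueue<CachedPage> pages =
            new PriorityQueue<>(Comparator.comparingDouble((CachedPage p) -> p.priority));

    PageCache(long capacityBytes) {
        this.freeBytes = capacityBytes;
    }

    /** Tries to cache the currently accessed page, evicting lower-priority pages if needed. */
    boolean cache(CachedPage current) {
        // Release cached pages whose priority is lower than the current page's priority
        // until enough free cache resources are available (or no such page remains).
        while (freeBytes < current.sizeBytes
                && !pages.isEmpty()
                && pages.peek().priority < current.priority) {
            freeBytes += pages.poll().sizeBytes;
        }
        if (freeBytes < current.sizeBytes) {
            return false; // remaining cached pages all have higher or equal priority; do not cache
        }
        pages.add(current);
        freeBytes -= current.sizeBytes;
        return true;
    }
}
```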

Embodiment 3

[0049] Figure 3 is a schematic diagram of the mechanism of a page caching method provided in Embodiment 3 of the present invention. As shown in Figure 3, when a user accesses pages on a terminal, historical access status data for each accessed page, such as the page access frequency An and the average rendering time Tn of each page, is recorded and stored locally. The priority value of each page is then calculated as Qn = An * Tn from the frequency An with which the user uses the page and the page's average rendering time Tn, where n is a positive integer. The caching strategy preferentially caches pages with high priority values; when cache resources are insufficient, pages with low priority values are released first. When the user switches pages, the cache area is queried, and if the corresponding cached page is hit, the cached page is accessed directly.
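The priority formula Qn = An * Tn can be expressed in one line; the snippet below is only a worked illustration, and the class and parameter names are assumptions:

```java
/** Worked illustration of the priority formula Qn = An * Tn from Embodiment 3. */
class PriorityCalculator {
    /** Priority value Qn = page access frequency An * average rendering time Tn. */
    static double priority(int accessFrequencyAn, double avgRenderTimeTn) {
        return accessFrequencyAn * avgRenderTimeTn;
    }
}
```

For example, a page the user has opened 20 times with an average rendering time of 300 ms would get Q = 20 * 300 = 6000, so it would be retained ahead of a page opened 5 times that renders in 100 ms (Q = 500).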

[0050] In the embodiment of the present invention, the cached pages in the c...



Abstract

The invention discloses a page caching method and device. The method comprises the following steps: obtaining historical access status data of the page currently accessed by a terminal; calculating the priority value of the currently accessed page according to its historical access status data; and determining which pages to cache in the cache area according to the priority value of the currently accessed page and the free cache resources of the cache area. The method can cache pages according to different users' habits and make effective use of limited cache area resources, thereby improving the average page loading speed and page switching speed and improving the user experience.

Description

technical field

[0001] Embodiments of the present invention relate to the field of communications technologies, and in particular to a page caching method and apparatus.

Background technique

[0002] At present, applications on mobile terminals are becoming more and more diverse, application page layouts are becoming more and more complex, and the number of pages a user opens keeps increasing. As a result, application performance keeps degrading, which mainly manifests as growing memory usage and slower page switching.

[0003] In the prior art, there are mainly two methods to address slow page switching. The first is to push the page directly onto the system page stack and let the operating system of the mobile terminal maintain the page cache; the operating system destroys some pages to release resources when system resources are insufficient. However, t...

Claims


Application Information

IPC(8): G06F17/30
Inventor 谢海洋
Owner BEIJING BAIDU NETCOM SCI & TECH CO LTD