
Data caching and acceleration algorithm based on user relation degree

A data caching and relation-degree technology applied to database indexing, structured data retrieval, and electronic digital data processing. It addresses problems that degrade the user's access experience, and achieves high-performance fast data access, a reduced data scale, and improved overall performance.

Pending Publication Date: 2021-11-16
深圳市杰云智联科技有限公司

AI Technical Summary

Problems solved by technology

[0003] However, when the existing database is accessed by users, different cache bandwidth support is not provided according to the degree of user relationship. As a result, all users receive the same cache bandwidth support, which greatly degrades the access experience.



Examples


[0037] Example (external sort):

[0038] Assume that the friend relation-degree data in the storage database is divided into several sub-files and that all of the data must be sorted from largest to smallest (that is, the largest value is output first, then the second largest, and so on until all elements are in order).

[0039] 1) Read each file block in turn, sort the current block in memory (using a suitable internal sorting algorithm), and write the sorted result directly to external storage, each block to its own sub-file. At this point, each file block is equivalent to an ordered queue arranged from largest to smallest;

[0040] 2) Next, perform multi-way merge sorting by building a max-heap over multiple elements in the storage database (note that the requirement here is not only to obtain the top K but to output all values in order from largest to smallest, so a min-heap is not used), that is, the value of each node is greater than or ...
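
The two steps above describe a standard external merge sort in descending order. Below is a minimal Python sketch of that idea, assuming the relation degrees are plain numbers stored one per line in temporary sub-files; the block size, function names, and the use of heapq with negated keys to simulate a max-heap are illustrative choices, not details taken from the patent.

import heapq
import os
import tempfile

CHUNK_SIZE = 3  # assumed in-memory block size (elements per sub-file)

def split_and_sort(values):
    """Step 1: sort each block in memory (descending) and write it to its own sub-file."""
    paths = []
    for i in range(0, len(values), CHUNK_SIZE):
        block = sorted(values[i:i + CHUNK_SIZE], reverse=True)
        fd, path = tempfile.mkstemp(suffix=".blk")
        with os.fdopen(fd, "w") as f:
            f.write("\n".join(str(v) for v in block))
        paths.append(path)
    return paths

def merge_descending(paths):
    """Step 2: k-way merge using a max-heap (heapq is a min-heap, so keys are negated)."""
    files = [open(p) for p in paths]
    heap = []
    for idx, f in enumerate(files):
        line = f.readline()
        if line:
            heapq.heappush(heap, (-float(line), idx))  # negate to get max-heap behaviour
    while heap:
        neg_val, idx = heapq.heappop(heap)
        yield -neg_val                                 # largest remaining value
        line = files[idx].readline()                   # refill from the same sub-file
        if line:
            heapq.heappush(heap, (-float(line), idx))
    for f in files:
        f.close()

# Usage: relation degrees too numerous to sort in memory at once.
degrees = [0.3, 0.9, 0.1, 0.7, 0.5, 0.8, 0.2]
print(list(merge_descending(split_and_sort(degrees))))  # [0.9, 0.8, 0.7, 0.5, 0.3, 0.2, 0.1]

Step 1 keeps only one block in memory at a time, and step 2 keeps only one value per sub-file in the heap, so the full data set never has to fit in memory.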



Abstract

The invention discloses a data caching and acceleration algorithm based on user relation degrees, and relates in particular to the technical field of data caching and acceleration. The algorithm caches and accelerates the relevant data in the following steps: (1) perform external sorting on the friend relation degrees and build a dictionary tree; (2) give each independent individual in the friend relationship the same authority to provide a data downloading function externally; (3) use a feature matrix to store the relation degrees between the user's friends; (4) partition the friend relation-degree data in the storage database; (5) create a unique index to optimize the friend relation-degree data in the storage database. The invention not only effectively improves the performance of the storage database but also provides high-performance fast data access, so that any node other than the node providing the external data downloading function can obtain different cache bandwidth support according to its own data width.
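
The dictionary tree in step (1) is a trie. As a hedged sketch of how such a tree might hold the relation degrees after the external sort, the example below keys the trie by user identifier and stores the relation degree at the terminal node; the class names, fields, and the choice of string user IDs are assumptions made for illustration, not the patent's actual data layout.

class TrieNode:
    __slots__ = ("children", "degree")
    def __init__(self):
        self.children = {}   # next character -> TrieNode
        self.degree = None   # relation degree if a user ID ends at this node

class RelationTrie:
    """Illustrative dictionary tree mapping user IDs to friend relation degrees."""
    def __init__(self):
        self.root = TrieNode()

    def insert(self, user_id: str, degree: float) -> None:
        node = self.root
        for ch in user_id:
            node = node.children.setdefault(ch, TrieNode())
        node.degree = degree

    def lookup(self, user_id: str):
        node = self.root
        for ch in user_id:
            node = node.children.get(ch)
            if node is None:
                return None      # unknown user
        return node.degree

# Usage: degrees would come from the externally sorted data in step (1).
trie = RelationTrie()
trie.insert("u1024", 0.92)
trie.insert("u1025", 0.37)
print(trie.lookup("u1024"))  # 0.92

Lookup cost then depends only on the length of the user ID, not on the number of stored users, which is one common reason to pair a trie with pre-sorted key data.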

Description

Technical Field

[0001] The present invention relates to the technical field of data caching and acceleration, and more specifically to a data caching and acceleration algorithm based on user relation degree.

Background Technique

[0002] Configuring relatively large memory for a database can effectively improve database performance, because the database sets aside an area of memory as a data cache while it runs. Normally, when a user accesses the database, the data is first read into this data cache; the next time the user needs the same data, it is read from the cache. Since reading data from the data cache is hundreds of times faster than reading it from the hard disk, expanding the memory of the database server can effectively improve database performance, especially when operating a large database.

[0003] However, when the existing database is accessed by users, different cache bandwidth support is not provided according to the degree of user relationship, so all users receive the same cache bandwidth support, which greatly degrades the access experience.
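
As a toy illustration of the cache behaviour described in [0002], the sketch below serves repeated reads of the same key from memory and only falls back to slow storage on a miss; the class name, capacity, and least-recently-used eviction policy are assumptions made for the example, not part of the patent.

from collections import OrderedDict

class DataCache:
    """Illustrative in-memory data cache in front of a slower storage layer."""
    def __init__(self, capacity: int = 1024):
        self.capacity = capacity
        self._store = OrderedDict()   # key -> cached row

    def get(self, key, load_from_disk):
        if key in self._store:
            self._store.move_to_end(key)      # cache hit: fast path
            return self._store[key]
        value = load_from_disk(key)           # cache miss: slow path
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict the least recently used row
        return value

# Usage with a stand-in for a disk read:
cache = DataCache(capacity=2)
print(cache.get("row:1", lambda k: f"loaded {k} from disk"))
print(cache.get("row:1", lambda k: f"loaded {k} from disk"))  # served from the cache

On the first get the loader runs (the "hard disk" path); on the second get the value comes straight from the in-memory store, which is the speed-up the background section relies on.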


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/21, G06F16/22, G06F16/2455, G06F8/30
CPC: G06F16/211, G06F16/2246, G06F16/2291, G06F16/24552, G06F8/315
Inventors: 董立武, 杨进
Owner: 深圳市杰云智联科技有限公司