
Method, device and system for preventing cache breakdown

A caching and data request technology, applied in instrumentation, computing, and electrical and digital data processing. It addresses problems such as database connection pool exhaustion, heavy impact on the database, and frequent errors, and achieves the effects of reducing avalanches, reducing database access pressure, and avoiding repeated loading.

Active Publication Date: 2019-07-16
ALIBABA GRP HLDG LTD

AI Technical Summary

Problems solved by technology

[0004] The inventor found that current high-QPS applications are prone to cache breakdown. Cache breakdown refers to a large number of requests concurrently penetrating the cache and reaching the database when the cache is invalidated or expires, thereby placing a huge load on the database, exhausting the database connection pool, and causing frequent errors that can even lead to an avalanche.
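To make this failure mode concrete, below is a minimal sketch (in Java, illustrative only and not taken from the patent) of the plain cache-aside read path that allows breakdown. The NaiveCacheAside class and the Database interface are hypothetical names; the comments mark the point where concurrent requests fall through to the database once a hot entry expires.

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Hypothetical database client, only to make the sketch self-contained.
interface Database {
    String query(String key);
}

class NaiveCacheAside {
    private final ConcurrentMap<String, String> cache = new ConcurrentHashMap<>();
    private final Database database;

    NaiveCacheAside(Database database) {
        this.database = database;
    }

    String get(String key) {
        String value = cache.get(key);
        if (value != null) {
            return value;                  // cache hit: return directly
        }
        // Cache miss, e.g. because the hot entry just expired. Every thread
        // that reaches this point queries the database independently, which
        // is the concurrent penetration ("cache breakdown") described above.
        value = database.query(key);
        if (value != null) {
            cache.put(key, value);         // repopulate the cache
        }
        return value;
    }
}

Under high QPS, thousands of threads can hit the miss branch at the same instant, which is what exhausts the database connection pool.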



Examples


Embodiment Construction

[0043] The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.

[0044] In practical applications, webcast services often face highly concurrent data loading requests. A large number of such requests can concurrently penetrate the cache and reach the database, placing a huge instantaneous load on the database and potentially paralyzing the business.

[0045] For example, during the online live broadcast of the Tmall Double 11 Gala, hundreds of thousands, millions, or even tens of millions of viewers may enter the live broadcast room at the same time...



Abstract

The invention provides a method, a device and a system for preventing cache breakdown. In the present application, a live broadcast application server provides live broadcast data support for audience clients. Live broadcast data requests that request the same data are merged into a single live broadcast data request before the database server is accessed, so that the access logic of data loading requests is effectively controlled in front of the database server. This reduces invalid access to the database, avoids repeated loading of the same live broadcast data at the same time, relieves the access pressure on the database, and effectively reduces the database avalanche caused by cache breakdown.
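As an illustration of the merging idea described in this abstract (not the patent's actual implementation), the Java sketch below coalesces concurrent requests for the same live data key onto a single in-flight database load. LiveDataLoader, load and the Database interface are hypothetical names chosen for the sketch.

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Hypothetical database client, only to make the sketch self-contained.
interface Database {
    String query(String key);
}

class LiveDataLoader {
    // At most one in-flight database load per key; concurrent requests for
    // the same key are merged onto the same future.
    private final ConcurrentMap<String, CompletableFuture<String>> inFlight =
            new ConcurrentHashMap<>();
    private final Database database;

    LiveDataLoader(Database database) {
        this.database = database;
    }

    CompletableFuture<String> load(String key) {
        CompletableFuture<String> fresh = new CompletableFuture<>();
        CompletableFuture<String> existing = inFlight.putIfAbsent(key, fresh);
        if (existing != null) {
            return existing;               // join the load already in flight
        }
        // This request "won" the race and performs the single database access
        // on behalf of every merged request.
        CompletableFuture.runAsync(() -> {
            try {
                fresh.complete(database.query(key));
            } catch (RuntimeException e) {
                fresh.completeExceptionally(e);
            } finally {
                inFlight.remove(key, fresh);  // later misses may load again
            }
        });
        return fresh;
    }
}

All callers asking for the same key while a load is in flight receive the same future, so the database sees one query per key per reload. The patent's servers may implement the merging differently, but this single-flight style of coalescing matches the behavior the abstract describes.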

Description

Technical field

[0001] The present application relates to the technical field of data access, and in particular to a method, device and system for preventing cache breakdown.

Background technique

[0002] At present, most devices that support data reading use a cache, which is a buffer for data exchange. When a device reads data, it first looks for the required data in the cache; if the data exists, it is returned directly, and if it does not exist, the database is queried and the query result is then cached.

[0003] In practical applications, the database (DB) and the application's QPS (queries per second) usually do not match. To support the high-QPS business requirements of the application, the system usually adopts a caching mechanism, which relieves the DB access pressure.

Contents of the invention

[0004] The inventor found that current high-QPS applications are prone to cache breakdown. Cache breakdown refers to a large num...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/2455; G06F16/21
CPC: G06F16/217; G06F16/24552
Inventor: 吴小飞
Owner: ALIBABA GRP HLDG LTD