Method and system for processing high-concurrency data requests

A data request and database technology, applied in transmission systems, electrical digital data processing, resource allocation, etc. It addresses problems such as slow response times, degraded processing speed, and short-term system outages, and achieves the effects of reducing database operation locks and improving processing speed.

Inactive Publication Date: 2016-11-30
SHENZHEN IDREAMSKY TECH

AI Technical Summary

Problems solved by technology

[0003] To handle high-concurrency data processing, the industry mainly uses the following approaches: sending events to a message-middleware cluster and then delivering them, via a distributed computing framework, to the clients that subscribe to the messages; distributing events to different machines for processing; and feeding events into a data stream and continuously matching them against pre-configured patterns or rules. None of these effectively solves the problems above.
[0004] Customers send data requests to seckill (flash-sale) a product through the client. Too many customers send data requests fo…

Method used



Examples


Embodiment 1

[0037] As shown in figure 1, the data processing method provided by this embodiment includes: S101, distributing the data requests sent by multiple clients to multiple server units through load balancing; S102, loading the data requests from the multiple server units into the corresponding distributed memory cache units; S103, processing the data requests in the distributed memory cache units one by one using asynchronous processing; S104, writing the asynchronously processed data requests into the database.
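The four steps S101–S104 can be sketched end to end. This is a minimal illustration under stated assumptions, not the patented implementation: `accept` with a round-robin cycle stands in for the load balancer, plain `queue.Queue` objects stand in for the server units and the distributed memory cache, and an in-memory SQLite database stands in for the database; all names here are ours.

```python
import itertools
import queue
import sqlite3
import threading

# S101: load-balance incoming requests across server units (round-robin stand-in).
servers = [queue.Queue(), queue.Queue()]
rr = itertools.cycle(servers)

def accept(request):
    next(rr).put(request)

# S102: each server unit forwards its requests into a shared cache queue
# (stand-in for the distributed memory cache unit).
cache = queue.Queue()

def drain(server):
    while not server.empty():
        cache.put(server.get())

# S104 target: the database (SQLite stand-in).
db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE orders (client TEXT, item TEXT)")

# S103/S104: a single worker consumes the cache asynchronously, one request
# at a time, so the database never sees concurrent writes.
def worker():
    while True:
        req = cache.get()
        if req is None:  # sentinel: stop the worker
            break
        db.execute("INSERT INTO orders VALUES (?, ?)", req)
        db.commit()

t = threading.Thread(target=worker)
t.start()

for i in range(10):
    accept((f"client-{i}", "item-1"))
for s in servers:
    drain(s)
cache.put(None)
t.join()
```

Because only the single worker thread touches the database, writes are serialized, which is the "reduced operation locks" effect the summary claims.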

[0038] The data requests sent by multiple clients are distributed to multiple server units through load balancing; in this embodiment, load balancing can be achieved by using NGINX as a reverse proxy in front of PHP. The data requests in the multiple server units are then loaded into the corresponding distributed memory cache units. This embodiment uses a REDIS cache joined into a REDIS CLUSTER queue. The function of the queue is to allo…
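A Redis server is not available in this sketch, so the hypothetical `FakeRedisNode` below models only the RPUSH/LPOP list commands that such a request queue relies on, and `FakeCluster` routes keys to nodes by CRC hashing, loosely mirroring how Redis Cluster assigns keys to hash slots (real Redis Cluster uses CRC16 mod 16384); all class and method names are ours.

```python
import zlib
from collections import deque

class FakeRedisNode:
    """Models just the RPUSH/LPOP list commands used as a request queue."""
    def __init__(self):
        self.lists = {}

    def rpush(self, key, value):
        self.lists.setdefault(key, deque()).append(value)
        return len(self.lists[key])

    def lpop(self, key):
        q = self.lists.get(key)
        return q.popleft() if q else None

class FakeCluster:
    """Routes each key to a node by hashing, as a cluster would."""
    def __init__(self, n_nodes=3):
        self.nodes = [FakeRedisNode() for _ in range(n_nodes)]

    def node_for(self, key):
        return self.nodes[zlib.crc32(key.encode()) % len(self.nodes)]

    def enqueue(self, key, request):
        return self.node_for(key).rpush(key, request)

    def dequeue(self, key):
        return self.node_for(key).lpop(key)

cluster = FakeCluster()
for i in range(5):
    cluster.enqueue("seckill:item-1", f"req-{i}")

# FIFO order is preserved per key, which is what lets the asynchronous
# worker process the buffered requests one by one.
first = cluster.dequeue("seckill:item-1")
```

The queue decouples bursty client traffic from the database: requests accumulate in memory at the cache layer and are drained at a rate the database can sustain.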

Embodiment 2

[0042] As shown in figure 2, the data processing method provided by this embodiment includes: S201, distributing the data requests sent by multiple clients to multiple server units through load balancing; S202, loading the data requests from the multiple server units into the corresponding distributed memory cache units; S203, limiting the data capacity of the distributed memory cache units: while data requests are being loaded into a distributed memory cache unit, if the number of data requests exceeds the unit's data capacity, the load is refused and feedback is given; S204, processing the data requests in the distributed memory cache units one by one using asynchronous processing; S205, writing the asynchronously processed data requests into the database.
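The capacity limit of step S203 can be sketched as a bounded queue that refuses new requests and reports back once full. This is an illustrative stand-in, not the patented implementation; `BoundedRequestQueue` and `try_load` are hypothetical names.

```python
from collections import deque

class BoundedRequestQueue:
    """Memory-cache stand-in with a hard capacity (cf. step S203):
    when full, new requests are refused and the caller gets feedback."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = deque()

    def try_load(self, request):
        if len(self.items) >= self.capacity:
            return False, "rejected: cache at capacity, try again later"
        self.items.append(request)
        return True, "accepted"

q = BoundedRequestQueue(capacity=3)
results = [q.try_load(f"req-{i}")[0] for i in range(5)]
# First 3 requests are accepted; the last 2 are refused with feedback.
```

Refusing at the cache boundary means excess seckill traffic is shed before it can reach the database at all, instead of piling up behind database locks.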

[0043] The data requests sent by multiple clients are distributed to multiple server units through load balancing; the data requests are load-balanced through a reverse proxy, and in this embodiment load balancing can be achieved by N…

Embodiment 3

[0047] As shown in image 3, the data processing method provided by this embodiment includes: S301, distributing the data requests sent by multiple clients to multiple server units through load balancing; S302, loading the data requests from the multiple server units into the corresponding distributed memory cache units; S303, determining the number of distributed memory cache units, and determining the data capacity of each memory cache unit from the capacity of the above-mentioned database and the number of memory cache units; S304, limiting the data capacity of the distributed memory cache units: while data requests are being loaded into a distributed memory cache unit, if the number of data requests exceeds the unit's data capacity, the load is refused and feedback is given; S305, processing the data requests in the distributed memory cache units one by one using asynchronous processing; S306, writing the asynchronously processed data requests into the database.
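Step S303 derives each cache unit's capacity from the database capacity and the number of cache units. The patent text does not give an exact formula; an even split with optional headroom is one plausible reading, sketched below with hypothetical names.

```python
def cache_unit_capacity(db_capacity, n_cache_units, safety_factor=1.0):
    """Evenly split the database's capacity across cache units (cf. step
    S303). `safety_factor` < 1 leaves headroom. The exact formula is not
    specified in the patent; this even split is one possible reading."""
    if n_cache_units <= 0:
        raise ValueError("need at least one cache unit")
    return int(db_capacity * safety_factor // n_cache_units)

# E.g. a database rated for 100,000 pending requests split over 4 units:
cap = cache_unit_capacity(db_capacity=100_000, n_cache_units=4)
```

Sizing each unit this way ensures that even if every cache unit fills to its limit, the total buffered load never exceeds what the database can absorb.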

[0048] The d...



Abstract

The invention relates to a data processing program control device and control method for prediction purposes, and in particular to a method and system for reducing the system resource occupancy caused by high-concurrency data processing. The invention discloses a method for processing high-concurrency data requests, comprising the steps of: distributing the data requests sent by a plurality of clients to a plurality of server units by means of load balancing; loading the data requests in the plurality of server units into the corresponding distributed memory cache units; processing the data requests in the distributed memory cache units one by one in an asynchronous processing mode; and writing the asynchronously processed data requests into a database. According to the invention, operation locks on the database under high-concurrency requests are reduced, and the processing speed of high-concurrency request information in the database is improved.

Description

technical field

[0001] The present invention relates to a data processing program control device and control method for prediction purposes, and in particular to a method and system for reducing the system resource occupation caused by high-concurrency data processing.

Background technique

[0002] As a typical high-concurrency data processing service, the seckill (flash-sale) business is characterized by few product types and a huge number of visits in a short period, which may cause slow response times and even short-term system crashes.

[0003] To handle high-concurrency data processing, the industry mainly uses the following approaches: sending events to a message-middleware cluster and then delivering them, via a distributed computing framework, to the clients that subscribe to the messages; distributing events to different machines for processing; and feeding events into data streams and continuously matching them against pre-configu…

Claims


Application Information

IPC(8): H04L29/08, G06F9/50
CPC: G06F9/5083, H04L67/1001, H04L67/568
Inventor: 贺永华
Owner: SHENZHEN IDREAMSKY TECH