
Data batch processing method and device based on cluster computing, electronic device and medium

A data batch processing technology based on cluster computing, applied in the field of computing virtualization services. It addresses the problems of response delay, low processing efficiency, and poor processing convenience, with the effects of avoiding response delay, improving the use efficiency of the cache servers, and improving the user experience.

Pending Publication Date: 2019-02-22
Applicant: CHINA PING AN LIFE INSURANCE CO LTD

AI Technical Summary

Problems solved by technology

[0005] The purpose of this application is to solve at least one of the above-mentioned technical defects, in particular the defects of low processing efficiency, poor processing convenience, and response delay.



Examples


Embodiment 1

[0030] An embodiment of this application provides a data batch processing method based on cluster computing, as shown in Figure 1, including:

[0031] Step S100: acquire the batch of data to be processed.

[0032] Specifically, the central server obtains the batch of data to be processed sent by the client. The central server sits at the business layer and interacts directly with the client, and the client is a program that provides local services to the user, such as an application on a terminal device like a laptop or a touchscreen phone.

[0033] Step S200: send each piece of data to be processed in the obtained batch to its corresponding cache server according to a preset distribution rule, so that the cache servers process the batch of data to be processed in parallel.

[0034] Specifically, the central server at the business layer is connected to several cache servers to form a star topology server cluster, in which users can directly establish a conne...
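
As an illustration of this flow, the following is a minimal sketch, assuming a star topology in which one business-layer central server fans a batch out to several cache servers and collects their results. The class names, the ThreadPoolExecutor-based parallelism, and the round-robin distribution rule are assumptions of this example; the patent only specifies "a preset distribution rule" at this point (the CRC16-based rule appears in Embodiment 3).

```python
# Minimal sketch, not the patent's implementation: a business-layer central
# server receives a batch from the client, splits it across cache servers,
# and the cache servers process their shares in parallel.
from concurrent.futures import ThreadPoolExecutor
from typing import Any, Callable, Dict, List


class CacheServer:
    """Stands in for one cache server node in the star topology."""

    def __init__(self, server_id: str, handler: Callable[[Any], Any]):
        self.server_id = server_id
        self.handler = handler  # hypothetical per-item processing hook

    def process(self, items: List[Any]) -> List[Any]:
        return [self.handler(item) for item in items]


class CentralServer:
    """Business-layer node that talks to the client and to every cache server."""

    def __init__(self, cache_servers: List[CacheServer]):
        self.cache_servers = cache_servers

    def distribute(self, batch: List[Any]) -> Dict[str, List[Any]]:
        # Preset distribution rule: plain round-robin (an assumption here).
        buckets: Dict[str, List[Any]] = {s.server_id: [] for s in self.cache_servers}
        for i, item in enumerate(batch):
            target = self.cache_servers[i % len(self.cache_servers)]
            buckets[target.server_id].append(item)
        return buckets

    def process_batch(self, batch: List[Any]) -> List[Any]:
        buckets = self.distribute(batch)
        results: List[Any] = []
        # Each cache server works on its share of the batch concurrently.
        with ThreadPoolExecutor(max_workers=len(self.cache_servers)) as pool:
            futures = [pool.submit(s.process, buckets[s.server_id]) for s in self.cache_servers]
            for future in futures:
                results.extend(future.result())
        return results


if __name__ == "__main__":
    servers = [CacheServer(f"cache-{i}", lambda x: x * 2) for i in range(3)]
    print(CentralServer(servers).process_batch(list(range(10))))
```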

Embodiment 2

[0041] This embodiment of the present application provides another possible implementation. On the basis of the first embodiment, the method further includes the following, wherein:

[0042] Step S100 includes step S101 (not marked in the figure) and step S102 (not marked in the figure), wherein:

[0043] Step S101: Receive data processing information sent by the client.

[0044] Step S102: Obtain batches of data to be processed according to the data processing information.

[0045] The data processing information includes any one of the following: the batch of data to be processed; a data processing request carrying the batch of data to be processed; or a data processing request carrying at least one storage address of the batch of data to be processed.
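
As a rough illustration of step S102, the sketch below shows one way the central server could resolve each of the three admissible forms of data processing information into a batch of data to be processed. The dictionary layout, the "payload"/"addresses" key names, and the fetch_from_storage helper are hypothetical and not part of the patent.

```python
# Illustrative sketch only: resolving the three admissible forms of data
# processing information (step S102). Key names and helpers are assumptions.
from typing import Any, Dict, List, Union


def fetch_from_storage(address: str) -> List[Any]:
    """Hypothetical helper: load data to be processed from a storage address."""
    raise NotImplementedError


def resolve_batch(info: Union[List[Any], Dict[str, Any]]) -> List[Any]:
    # Form 1: the information itself is the batch of data to be processed.
    if isinstance(info, list):
        return info
    # Form 2: a data processing request carrying the batch as its payload.
    if "payload" in info:
        return list(info["payload"])
    # Form 3: a data processing request carrying storage addresses of the batch.
    if "addresses" in info:
        batch: List[Any] = []
        for address in info["addresses"]:
            batch.extend(fetch_from_storage(address))
        return batch
    raise ValueError("Unrecognized data processing information")
```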

[0046] Specifically, the central server at the business layer directly exchanges information with the client, such as receiving a request from the client terminal, then processing the received request, and t...

Embodiment 3

[0052] This embodiment of the present application provides another possible implementation. On the basis of the second embodiment, the method further includes the following, wherein:

[0053] Step S200 includes step S202 (not marked in the figure): when the data processing information includes the batch of data to be processed, or a data processing request carrying the batch of data to be processed, determine the key value of each piece of data to be processed in the batch; determine the cache server identification code corresponding to each key value according to the CRC16 algorithm; determine the corresponding cache server according to each cache server identification code; and send each piece of data to be processed to the cache server so found.

[0054] Before step S202, step S201 (not marked in the figure) is also included: storing the execution sequence of each piece of data to be processed in the batch of data to be processed in terms of lo...
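
The CRC16-based mapping of step S202 can be sketched as follows. This is only an illustration under stated assumptions: the CRC16/XMODEM variant (the one Redis Cluster uses for key hashing) and the modulo mapping from the checksum onto the list of server identification codes are choices of this example, since the patent only states that the CRC16 algorithm determines the identification code corresponding to each key value.

```python
# Sketch of step S202 under stated assumptions: CRC16/XMODEM plus a modulo
# mapping onto the cache server identification codes. The patent only says
# the CRC16 algorithm determines the identification code for each key value.
from typing import Any, Dict, List


def crc16_xmodem(data: bytes) -> int:
    """Bitwise CRC16/XMODEM (polynomial 0x1021, initial value 0x0000)."""
    crc = 0x0000
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc


def server_id_for_key(key: str, server_ids: List[str]) -> str:
    """Map a data item's key value to a cache server identification code."""
    return server_ids[crc16_xmodem(key.encode("utf-8")) % len(server_ids)]


def partition_batch(batch: Dict[str, Any], server_ids: List[str]) -> Dict[str, Dict[str, Any]]:
    """Group each (key, data) pair under the cache server that should process it."""
    partitions: Dict[str, Dict[str, Any]] = {sid: {} for sid in server_ids}
    for key, value in batch.items():
        partitions[server_id_for_key(key, server_ids)][key] = value
    return partitions


if __name__ == "__main__":
    servers = ["cache-01", "cache-02", "cache-03"]  # hypothetical identification codes
    sample = {f"policy:{n}": {"premium": n * 100} for n in range(6)}  # hypothetical batch
    for sid, items in partition_batch(sample, servers).items():
        print(sid, sorted(items))
```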



Abstract

The invention relates to the technical field of computer virtualization services and discloses a data batch processing method and device based on cluster computing, an electronic device, and a medium. The data batch processing method based on cluster computing comprises the following steps: obtaining a batch of data to be processed; sending each piece of data to be processed in the obtained batch to a corresponding cache server according to a preset distribution rule, so that each cache server performs parallel processing on the batch of data to be processed; and receiving the data processing results returned by each cache server. According to the method of an embodiment of the present application, because the cache servers carry out parallel processing on the batch of data to be processed, not only is the use efficiency of the cache servers improved, but the data processing response speed for large batches of data is also greatly increased, the response delay caused by the central server processing the batch of data itself is effectively avoided, and the user experience is greatly improved.

Description

Technical field

[0001] The present application relates to the technical field of computing virtualization services, and in particular to a data batch processing method, device, electronic device, and medium based on cluster computing.

Background technique

[0002] With the advent of the big data era, the batch processing of big data has attracted more and more attention and has been widely used in data processing in all aspects of enterprise operations. Enterprises' needs for big data processing are increasingly diverse and constantly changing.

[0003] Batch processing is an environment for processing data in batches, which can support a whole batch of data processing processes. It is usually a batch job submitted through an automatic control process to handle batch access to, or updates of, databases or files. With a well-designed automatic control process, the whole batch can be completed without manual intervention.

[0004] However, to achieve ba...


Application Information

IPC(8): G06F9/50
CPC: G06F9/5005
Inventor: 李贤州
Owner: CHINA PING AN LIFE INSURANCE CO LTD