
Large data quantity batch processing system and large data quantity batch processing method

A large-data-volume batch processing technology, applied in the field of computing, which solves the problems of middleware unit resources going unused, processing speed not being optimized, and high pressure on the database side, thereby improving overall concurrent processing capability, reducing database pressure, and optimizing data reading.

Active Publication Date: 2013-04-03
YONYOU NETWORK TECH

AI Technical Summary

Problems solved by technology

[0004] Existing paging techniques are implemented on the database side. The first approach pages directly with SQL statements, for example fetching records 1-50 on the first call, records 51-100 on the second call, and so on. Although this loads only a limited number of records into memory each time, the pressure on the database side remains very high, because every such SQL query scans the entire result set, and the processing speed is not optimized. The second approach implements paging in code; for example, in Java a ResultSet is traversed in a loop: the first pass traverses records 1-50 and takes them out, while the second pass traverses records 1-100 but takes out only records 51-100. This approach still has the drawback of pre-querying all records each time. The third approach first finds the primary keys (PKs) of the result set that satisfies the conditions, stores them in a temporary table with sequence numbers, then reads the PK sets out in batches by sequence number and uses each PK set to query the data from the database. Although this solves the previous problems, it still has to read data from the database temporary table in batches; under high concurrency the pressure on the database side remains very high, and there are repeated connections, queries, and network transfers of data from the middleware unit to the database, so efficiency bottlenecks remain in a narrowband environment and middleware unit resources are not used reasonably.
Finally, none of the three solutions above proposes a general way to further optimize the speed of data processing once the data has been loaded into memory, how to coordinate the two phases of query loading and data processing/persistence, or how the paging can automatically adapt to multiple databases; these problems remain open.
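To make these shortcomings concrete, here is a minimal Java/JDBC sketch of the three existing approaches. The table name biz_record, its columns id and payload, the LIMIT/OFFSET syntax, and the assumption of an already-open Connection are illustrative choices, not details taken from the patent.

```java
import java.sql.*;
import java.util.*;

// Hypothetical table "biz_record" with a numeric primary key "id" and a
// "payload" column; schema and SQL dialect are assumptions for illustration.
public class PagingApproachesSketch {

    // Approach 1: SQL-side LIMIT/OFFSET paging. Only pageSize rows reach the
    // application, but the database still scans past the first startRow rows
    // of the result set on every call.
    static List<Long> pageByOffset(Connection conn, int startRow, int pageSize) throws SQLException {
        String sql = "SELECT id FROM biz_record ORDER BY id LIMIT ? OFFSET ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, pageSize);
            ps.setInt(2, startRow);
            try (ResultSet rs = ps.executeQuery()) {
                List<Long> ids = new ArrayList<>();
                while (rs.next()) ids.add(rs.getLong(1));
                return ids;
            }
        }
    }

    // Approach 2: code-side paging over a full ResultSet. The query still
    // selects every matching record; the loop merely skips rows in Java.
    static List<Long> pageByTraversal(Connection conn, int startRow, int pageSize) throws SQLException {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT id FROM biz_record ORDER BY id")) {
            List<Long> ids = new ArrayList<>();
            int row = 0;
            while (rs.next() && ids.size() < pageSize) {
                if (++row > startRow) ids.add(rs.getLong(1));
            }
            return ids;
        }
    }

    // Approach 3: fetch full rows for one pre-computed primary-key batch
    // (the PK batch would come from a numbered temporary table). Each batch
    // still costs a round trip from the middleware unit to the database.
    static List<String> rowsForPkBatch(Connection conn, List<Long> pkBatch) throws SQLException {
        if (pkBatch.isEmpty()) return Collections.emptyList();
        String placeholders = String.join(",", Collections.nCopies(pkBatch.size(), "?"));
        String sql = "SELECT payload FROM biz_record WHERE id IN (" + placeholders + ")";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (int i = 0; i < pkBatch.size(); i++) ps.setLong(i + 1, pkBatch.get(i));
            try (ResultSet rs = ps.executeQuery()) {
                List<String> rows = new ArrayList<>();
                while (rs.next()) rows.add(rs.getString(1));
                return rows;
            }
        }
    }
}
```

All three variants either re-scan or re-query the database on every page, which is exactly the database-side pressure the invention sets out to reduce.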




Embodiment Construction

[0031] In order that the above objects, features, and advantages of the present invention may be understood more clearly, the present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0032] In the following description, many specific details are set forth in order to provide a full understanding of the present invention; however, the present invention can also be implemented in ways other than those described here, and is therefore not limited to the specific embodiments disclosed below.

[0033] Before explaining the large data volume batch processing system according to the present invention, the existing processing flow for large data volumes is briefly introduced.

[0034] As shown in Figure 1, in typical large-scale batch processing business scenarios, the overall processing logic and algorithms are roughly divided into the following steps: the middleware initiates a request to query...



Abstract

The invention provides a large data quantity batch processing system comprising a middleware unit, a first-order cache device, and a second-order cache device. The middleware unit sends an information request to the first-order cache device, receives a second-order paging primary key set from the second-order cache device, queries the database for the to-be-processed data according to that second-order paging primary key set, performs the computation on the to-be-processed data, and then sends a persistence request for the results to the database. The first-order cache device queries the database for the primary key set matching the information request, generates a first-order paging primary key set from it, and returns the first-order paging primary key set to the second-order cache device. The second-order cache device generates the second-order paging primary key set from the first-order paging primary key set and returns it to the middleware unit. The invention further provides a large data quantity batch processing method. According to the technical scheme, the processing speed of mass data can be increased greatly, the processing time of the system is shortened, and the overall performance of the system is improved.
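As a rough illustration of the paging structure described above, the following self-contained Java sketch splits a primary key set into first-order pages and then into second-order batches. The class names, page sizes, and the in-memory PK list are assumptions made for illustration only; in the patent these sets are exchanged between the middleware unit, the two cache devices, and the database rather than held in one process.

```java
import java.util.*;

// Minimal in-memory sketch of two-level primary-key paging; names and sizes
// are illustrative, not taken from the patent.
public class TwoLevelPkPagingSketch {

    // First-order cache: holds the full ordered PK set that matches the
    // request and slices it into coarse (first-order) pages.
    static class FirstOrderCache {
        private final List<Long> allPks;
        private final int firstOrderPageSize;

        FirstOrderCache(List<Long> allPks, int firstOrderPageSize) {
            this.allPks = allPks;
            this.firstOrderPageSize = firstOrderPageSize;
        }

        List<Long> firstOrderPage(int pageIndex) {
            int from = pageIndex * firstOrderPageSize;
            if (from >= allPks.size()) return Collections.emptyList();
            return allPks.subList(from, Math.min(from + firstOrderPageSize, allPks.size()));
        }

        int firstOrderPageCount() {
            return (allPks.size() + firstOrderPageSize - 1) / firstOrderPageSize;
        }
    }

    // Second-order cache: subdivides a first-order page into small batches
    // that the middleware unit can process one by one or concurrently.
    static class SecondOrderCache {
        List<List<Long>> secondOrderPages(List<Long> firstOrderPage, int secondOrderPageSize) {
            List<List<Long>> pages = new ArrayList<>();
            for (int i = 0; i < firstOrderPage.size(); i += secondOrderPageSize) {
                pages.add(firstOrderPage.subList(i,
                        Math.min(i + secondOrderPageSize, firstOrderPage.size())));
            }
            return pages;
        }
    }

    public static void main(String[] args) {
        // Pretend these PKs came from a single "query matching PKs" call.
        List<Long> pks = new ArrayList<>();
        for (long id = 1; id <= 230; id++) pks.add(id);

        FirstOrderCache first = new FirstOrderCache(pks, 100);  // coarse pages of 100 PKs
        SecondOrderCache second = new SecondOrderCache();

        for (int p = 0; p < first.firstOrderPageCount(); p++) {
            for (List<Long> batch : second.secondOrderPages(first.firstOrderPage(p), 25)) {
                // Middleware unit: load rows for this PK batch, process, persist.
                System.out.println("process batch of " + batch.size() + " PKs, first=" + batch.get(0));
            }
        }
    }
}
```

On this reading of the abstract, the database is asked for the matching primary keys once, and row data is then fetched only in small second-order batches, which is what reduces database-side pressure and lets the middleware unit process batches concurrently.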

Description

Technical field

[0001] The present invention relates to the field of computer technology, and in particular to a large data volume batch processing system and a large data volume batch processing method.

Background technique

[0002] In current large-scale online transaction processing (OLTP) systems, the metric used to measure system performance is often the processing speed of certain key core algorithms in large-data-volume application scenarios, and this processing speed directly affects the performance of the entire system.

[0003] A large-scale information system often has its own relatively complex business processing logic and business processing algorithms. When these complex business processes are used in small-data-volume application scenarios, efficiency issues are often ignored, because the system response speed in such scenarios is relatively fast; but in the case of a large amount of data there may be bottlenecks in system processing performance, serious situatio...


Application Information

IPC(8): G06F17/30
Inventor: 张成
Owner: YONYOU NETWORK TECH