
Server multithread parallel data processing method and load balancing method

A server-side data processing technology in the field of computer networks, aimed at problems such as inconsistent data processing times, idle threads in the data processing thread pool, and degraded system performance, achieving the effects of low computer resource consumption, simple computation, and strong practical value.

Publication Date: 2014-12-24 (Inactive)

AI Technical Summary

Problems solved by technology

[0004] After the system receives data, if the received data is allocated randomly or in a simplistic way, the processing time of each data item will differ, because threads are granted different computer time slices and the data contents themselves differ. Random allocation therefore leaves some threads in the data processing thread pool idle while other threads accumulate large backlogs of unprocessed data in their data areas, which seriously degrades system performance and weakens the system's concurrent processing capability.




Detailed Description of the Embodiments

[0025] The technical solution of the present invention will be described in detail below in conjunction with the drawings:

[0026] The idea of the present invention is, when distributing data to be processed in parallel by multiple threads, to allocate a reasonable amount of data to each data processing thread according to that thread's relative idle rate, thereby improving the system's ability to process massive data in parallel and to perform intensive computation.
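This page does not reproduce the exact formula for the relative idle rate, so the sketch below is only an assumed reading: each data processing thread's idle rate is derived from its pending-data counter, with a smaller backlog relative to the total meaning a more idle thread. The names pending_counts, relative_idle_rates and pick_thread are illustrative, not from the patent.

```python
def relative_idle_rates(pending_counts):
    """Return an assumed idle rate in [0, 1] for each processing thread."""
    total = sum(pending_counts)
    if total == 0:
        return [1.0] * len(pending_counts)   # nothing pending: all threads fully idle
    return [1.0 - count / total for count in pending_counts]

def pick_thread(pending_counts):
    """Index of the processing thread that should receive the next data item."""
    rates = relative_idle_rates(pending_counts)
    return max(range(len(rates)), key=rates.__getitem__)

# Example: thread 1 has the smallest backlog (1 pending item), so it is chosen.
print(pick_thread([5, 1, 3]))   # -> 1
```

Under this assumption, choosing the thread with the highest idle rate is the same as choosing the thread with the smallest backlog, which is what spreads work evenly across the pool.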

[0027] Figure 1 shows an example of server-side multi-threaded parallel data processing. In this example, when the process starts, the server starts a data receiving thread R, creates a thread pool containing multiple data processing threads D and a data distribution thread S, and at the same time opens up a dynamic storage area BM, as well as a dynamic storage area SM and a counter C with an initial value of 0 for each data processing thread D in the thread pool; the data receiving thread R is only responsible for receiving client data and placing it into the dynamic storage area BM.
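The following is a minimal, runnable sketch of that structure, not the patented implementation: the thread and area names R, S, D, BM, SM and C are taken from paragraph [0027], while the pool size, the sentinel-based shutdown, the lock around the counters and the demo input are assumptions added so the example is self-contained.

```python
import queue
import threading

POOL_SIZE = 4
BM = queue.Queue()                                  # shared dynamic storage area
SM = [queue.Queue() for _ in range(POOL_SIZE)]      # per-thread storage areas
C = [0] * POOL_SIZE                                 # per-thread counters, initial value 0
counter_lock = threading.Lock()                     # protects C in this sketch

def receiving_thread_r(packets):
    """R only receives data and appends it to BM (stands in for the socket loop)."""
    for packet in packets:
        BM.put(packet)
    BM.put(None)                                    # sentinel: demo input is exhausted

def distribution_thread_s():
    """S polls BM and hands each item to the least-loaded processing thread."""
    while True:
        item = BM.get()
        if item is None:
            for sm in SM:
                sm.put(None)                        # tell every D thread to stop
            return
        with counter_lock:
            idx = min(range(POOL_SIZE), key=C.__getitem__)   # smallest backlog
            C[idx] += 1
        SM[idx].put(item)

def processing_thread_d(idx):
    """D consumes its own queue SM[idx] and decrements its counter when done."""
    while True:
        item = SM[idx].get()
        if item is None:
            return
        # ... process the data item here ...
        with counter_lock:
            C[idx] -= 1

threads = [threading.Thread(target=receiving_thread_r, args=(range(20),)),
           threading.Thread(target=distribution_thread_s)]
threads += [threading.Thread(target=processing_thread_d, args=(i,)) for i in range(POOL_SIZE)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Here S always hands the next item to the thread whose counter is smallest, which is the simplest counter-based stand-in for the idle-rate selection described above.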


Abstract

The invention provides a server multithread parallel data processing method which comprises the following steps: step 1, when the server is started, starting one data receiving thread, one data distribution thread and one data processing thread pool containing multiple data processing threads, and at the same time opening up a dynamic storage region for storing the received client data, the storage region being managed dynamically in a FIFO (First In First Out) manner; step 2, the data receiving thread listens for and receives data packets transmitted by a client through a socket, puts each data packet directly into the data storage region, and then immediately returns to continue listening for and receiving client data; step 3, the data distribution thread polls the dynamic storage region.
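As a concrete reading of steps 1 and 2, the sketch below shows only the receiving side: the thread reads a packet from a socket, puts it straight into the FIFO storage region, and immediately returns to listening. The UDP socket, port 9000 and 4 KB read size are assumptions made for illustration; the distribution thread of step 3 would poll storage_region in the same way as in the embodiment sketch above.

```python
import queue
import socket
import threading

storage_region = queue.Queue()          # the FIFO dynamic storage region of step 1

def data_receiving_thread(port=9000):
    """Step 2: receive a packet, enqueue it untouched, and immediately listen again."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _addr = sock.recvfrom(4096)   # one client data packet
        storage_region.put(packet)            # hand off without any processing
        # loop back at once so the thread keeps receiving client data

threading.Thread(target=data_receiving_thread, daemon=True).start()
```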

Description

[0001] This application is a divisional application. The application number of the original application is CN201210315602.7, the filing date is 2012.08.31, and the title of the invention is: a load balancing method in multithreaded parallel processing of massive data.

Technical field

[0002] The invention relates to multithreaded parallel processing of massive data, in particular to a server-side multithreaded parallel data processing method and a load balancing method for multithreaded parallel processing of massive data, and belongs to the technical field of computer networks.

Background technology

[0003] In a system with large-scale terminal device access and concurrent processing of massive data, the amount of data received and processed per unit of time is very large. If the system immediately creates a new thread to process each piece of received data as it arrives, the system's concurrent performance is significantly reduced. This is because ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50; H04L29/08
Inventors: 周惠, 彭建华, 高红民
Owner: 南京汇承科技有限公司