
High performance server architecture system and data processing method thereof

A data processing and server technology in the field of server architecture. It addresses the problems of low data-processing efficiency and poor processing capacity, and achieves the effects of improved IO throughput, higher processing capacity, and reduced processing delay.

Active Publication Date: 2013-11-27
SUZHOU KEDA TECH

Problems solved by technology

[0004] The technical problem to be solved by the present invention is that a prior-art server receives front-end data in chronological order and processes it with a single thread, resulting in low data-processing efficiency and poor processing capacity. The invention therefore proposes a server architecture system that processes business data efficiently.



Examples


Embodiment 1

[0030] This embodiment provides a high-performance server architecture system comprising a cache unit, a multi-task concurrent processing unit, a thread pool unit, and a data batch sending unit; its internal implementation flow is shown in Figure 1. The cache unit allocates a cache queue in memory and pre-allocates the required memory space for the cache nodes in that queue; node1, node2 ... nodeX in Figure 1 are the cache nodes of the cache queue. When business data arrives, it is stored directly into these cache nodes. There are no links between the cache nodes, so each cache node can be operated on independently by a business process. The multi-task concurrent processing unit divides the data processing of each cache node into multiple business states according to the business type of the business information. As shown in Figure 2, the life cycle of a business data node starts from the initial state, ...
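The cache unit described above can be sketched in a few lines: a pool of pre-allocated, unlinked nodes, each processed independently by threads from a pool. This is a minimal illustrative sketch, not the patent's implementation; the names `CacheNode`, `NODE_COUNT`, `BUFFER_SIZE`, and `process_node` are all assumptions.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

NODE_COUNT = 8      # number of pre-allocated cache nodes (assumed)
BUFFER_SIZE = 1024  # per-node buffer size in bytes (assumed)

class CacheNode:
    """One cache node: memory is reserved up front, so arriving
    business data never triggers an allocation on the hot path."""
    def __init__(self):
        self.buffer = bytearray(BUFFER_SIZE)
        self.length = 0
        self.lock = threading.Lock()

    def store(self, data: bytes):
        with self.lock:
            self.length = len(data)
            self.buffer[:self.length] = data

# The cache queue: nodes hold no links to one another, so each can be
# handed to a different worker thread without coordination.
cache_queue = [CacheNode() for _ in range(NODE_COUNT)]

def process_node(node: CacheNode) -> int:
    # Placeholder for one business state's work on the node's data.
    with node.lock:
        return node.length

# Simulate arrival of business data, then process all nodes concurrently.
for i, node in enumerate(cache_queue):
    node.store(b"x" * (i + 1))

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_node, cache_queue))
```

Because the nodes are independent, no ordering constraint exists between workers; the thread pool is free to schedule them in any interleaving.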

Embodiment 2

[0037] This embodiment provides a high-performance server architecture system comprising a cache unit, a multi-task concurrent processing unit, a thread pool unit, and a data batch sending unit; its internal implementation flow is shown in Figure 1. The cache unit allocates a cache queue in memory and pre-allocates the required memory space for the cache nodes in that queue; node1, node2 ... nodeX in Figure 1 are the cache nodes of the cache queue. When business data arrives, it is stored directly into these cache nodes, and the cache nodes are chained together by a doubly linked circular list. The multi-task concurrent processing unit divides the data processing of each cache node into multiple business states according to the business type of the business information. As shown in Figure 2, the life cycle of a business data node starts from the initial state, and passes through four business states: business 1 ...
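Embodiment 2 differs from Embodiment 1 only in how the cache nodes are linked: a doubly linked circular list lets the server walk the ring of nodes in either direction. A minimal sketch, with illustrative names (`RingNode`, `build_ring`) that are not from the patent text:

```python
class RingNode:
    """A cache node in a doubly linked circular list."""
    def __init__(self, node_id: int):
        self.node_id = node_id
        self.data = None
        self.prev = self  # a single node forms a ring with itself
        self.next = self

def build_ring(count: int) -> RingNode:
    """Pre-allocate `count` nodes and splice them into one ring."""
    head = RingNode(0)
    for i in range(1, count):
        node = RingNode(i)
        tail = head.prev       # current last node in the ring
        tail.next = node       # splice the new node between tail and head
        node.prev = tail
        node.next = head
        head.prev = node
    return head

ring = build_ring(4)

# Walking forward from the head visits 0, 1, 2, 3 and wraps back to 0.
forward = [ring.node_id]
cur = ring.next
while cur is not ring:
    forward.append(cur.node_id)
    cur = cur.next
```

The circular chaining trades the full independence of Embodiment 1's nodes for cheap sequential traversal: a scanner thread can start anywhere in the ring and visit every node without a separate index.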



Abstract

Disclosed are a high-performance server architecture system and a data processing method thereof. The system comprises a cache unit, a multi-task concurrent processing unit, a thread pool unit, and a data batch transmission unit. The method comprises: pre-allocating the memory space required by businesses to cache nodes in a high-speed cache queue; when business data arrives, calling multiple threads in the thread pool unit to receive it and store it in the allocated cache nodes; dividing the data processing of each cache node into multiple business states according to business type, and, when the business states are mutually parallel, calling multiple threads in the thread pool unit to process those states concurrently; and, after processing finishes, packaging the data of multiple cache nodes together and transmitting it in batch. The system and method solve the prior-art problems that a server receives front-end data chronologically and processes it with a single thread, yielding low data-processing efficiency and poor processing capacity. They are particularly applicable to servers in monitoring systems.
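The batch-transmission step in the abstract can be sketched as follows: finished payloads accumulate and are sent in one IO call once a threshold is reached, reducing the number of IO operations at the cost of a small delay. This is an assumed illustration; `BatchSender`, `BATCH_SIZE`, the length-prefix framing, and the injected `send` callable are not from the patent.

```python
import struct

BATCH_SIZE = 4  # flush after this many finished payloads (assumed)

class BatchSender:
    """Accumulates processed cache-node payloads and sends them in batches."""
    def __init__(self, send):
        self.send = send     # e.g. socket.sendall in a real server
        self.pending = []

    def add(self, payload: bytes):
        self.pending.append(payload)
        if len(self.pending) >= BATCH_SIZE:
            self.flush()

    def flush(self):
        if not self.pending:
            return
        # Length-prefix each payload so the receiver can split the batch.
        packet = b"".join(struct.pack("!I", len(p)) + p
                          for p in self.pending)
        self.send(packet)
        self.pending.clear()

sent = []                        # stand-in for a network socket
sender = BatchSender(sent.append)
for payload in (b"a", b"bb", b"ccc", b"dddd", b"e"):
    sender.add(payload)
sender.flush()
# Five payloads leave in two IO calls: one batch of four, one of one.
```

This illustrates the throughput/latency trade the abstract claims: fewer, larger sends improve IO throughput, while the final `flush` bounds the extra delay for a partially filled batch.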

Description

Technical Field

[0001] The invention relates to server architecture, and in particular to a high-performance server architecture system and a data processing method thereof.

Background Technique

[0002] In video surveillance systems it often happens that a data message from a certain front end is not processed in time, causing abnormal business behavior: for example, each video frame is delayed and the browsing interface freezes or blurs, or a user sends a restart command and the front end responds only after several minutes. These abnormal situations are caused by the server's low data-processing efficiency and the resulting delay in handling real-time data. At present, monitoring servers in various application fields have very high hardware configurations, yet in actual use the server's processing capacity can still prove insufficient; professional monitoring tools show that hardware resources remain idle ...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Application (China)
IPC(8): G06F9/46
Inventor: 陈黎 (Chen Li), 周圣强 (Zhou Shengqiang), 陈卫东 (Chen Weidong)
Owner SUZHOU KEDA TECH