Front-end system for performing mass data interaction based on distributed message queue

A message queue and front-end system technology, applied in the field of front-end systems for massive data interaction based on distributed message queues. It can solve problems such as processing-capacity bottlenecks, the inability to expand processing capacity, and the lack of a unified cache for received and sent data, and it achieves easy maintenance, reduced daily operating pressure, and manpower savings.

Active Publication Date: 2018-02-16
STATE GRID ZHEJIANG ELECTRIC POWER (+2 others)

AI Technical Summary

Problems solved by technology

The front-end processor's original system lacks a unified data caching mechanism for received and sent data, so its processing capacity becomes a bottleneck when handling massive data, and that capacity cannot simply be expanded by adding processing nodes.


Embodiment Construction

[0016] The technical solution of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0017] As shown in Figure 1, the technical solution includes a communication gateway cluster for maintaining the various communication channel links of field terminals, a front-end processor connected to the communication gateway cluster for scheduling message sending and receiving, and a message queue cluster connected to the front-end processor for performing queue processing on data. The message queue cluster is connected to the application cluster and the data layer and acts as a data bus inside the front-end system: uplink and downlink data are first inserted into the distributed message queue, and each processing node then obtains an amount of data from the message queue that matches its own processing capability.
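The data-bus arrangement in paragraph [0017] can be illustrated with a minimal sketch. Apache Kafka is assumed here as the distributed message queue (the patent does not name a specific product); the topic name "uplink-data", the broker addresses, the consumer group id and the handleFrame helper are illustrative, not taken from the patent.

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class FrontEndDataBusSketch {

    // Front-end processor side: an uplink frame received from the communication
    // gateway cluster is first inserted into the distributed message queue
    // instead of being held in a node-local in-memory queue.
    static void publishUplink(KafkaProducer<String, byte[]> producer,
                              String terminalId, byte[] frame) {
        producer.send(new ProducerRecord<>("uplink-data", terminalId, frame));
    }

    // Processing-node side: each node pulls only as many records per poll as
    // its own processing capability allows, so a slower node simply fetches less.
    static void consumeLoop(int recordsPerPoll) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "mq-node1:9092,mq-node2:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "frontend-processing-nodes");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, String.valueOf(recordsPerPoll));

        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("uplink-data"));
            while (true) {
                ConsumerRecords<String, byte[]> records = consumer.poll(Duration.ofMillis(200));
                for (ConsumerRecord<String, byte[]> record : records) {
                    handleFrame(record.key(), record.value());
                }
            }
        }
    }

    // Hypothetical per-frame handler: parsing and persistence would go here.
    static void handleFrame(String terminalId, byte[] frame) {
    }
}

Every additional processing node started with the same group id automatically receives a share of the queue's partitions, which is how processing capacity is expanded by simply adding nodes.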

[0018] Wherein, the communication ga...


Abstract

The invention provides a front-end system for mass data interaction based on a distributed message queue, and relates to front-end systems. The original data receiving and sending of a front-end processor lacks a unified data cache mechanism, so processing capacity becomes a bottleneck when handling mass data. The front-end system comprises a communication gateway cluster for maintaining the various communication channel links of on-site terminals, a front-end processor connected with the communication gateway cluster for scheduling message receiving and sending, and a message queue cluster connected with the front-end processor for performing queue processing on data. The message queue cluster is connected with an application cluster and a data layer and functions as a data bus in the front-end system: uplink and downlink data are first inserted into the distributed message queue, and each processing node obtains a corresponding amount of data from the message queue according to its own processing capability. The technical scheme solves the caching difficulty of mass data and the problem of linearly extending the processing capability of the front-end processor system, avoids data loss, and improves the load-balancing capability of the front-end processor system.
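As a minimal sketch of how the "avoids data loss" and linear-extension properties are typically obtained when Kafka is assumed as the message queue cluster (the abstract does not name a specific product): the queue's partitions are replicated across the cluster for durability, and the partition count bounds how many processing nodes can consume in parallel. The topic name, partition count and replication factor below are illustrative.

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Properties;

public class UplinkTopicSetupSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "mq-node1:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 12 partitions allow up to 12 processing nodes to consume in parallel;
            // adding a node to the same consumer group extends capacity without
            // touching the producers. Replication factor 3 keeps buffered data
            // available if one queue node fails, so data is not lost.
            NewTopic uplink = new NewTopic("uplink-data", 12, (short) 3);
            admin.createTopics(Collections.singletonList(uplink)).all().get();
        }
    }
}

Starting another processing node in the same consumer group (as in the consumer sketch under paragraph [0017]) then redistributes partitions across the nodes, which is the load-balancing behaviour described above.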

Description

Technical field

[0001] The invention relates to a front-end system, in particular to a front-end system for mass data interaction based on a distributed message queue.

Background technique

[0002] The number of collection terminals of the electricity collection system in Zhejiang Province has exceeded 3 million, and the number of collection users has exceeded 25 million. Every day, the downlink request data generated by the master station and background applications, together with the terminal uplink data, peaks at more than hundreds of millions. The front-end processor's original system passively receives this huge volume of downlink request data, so the maximum size of the queue inside each node program must be capped, and to avoid memory overflow when the program's processing capacity is exceeded, data can only be selectively discarded. When the system receives upstream data in a volume that exceeds the program's processing capacity, it will adop...


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L12/66; H04L29/06; H04L29/08; G06F9/54
CPC: H04L12/66; H04L63/10; H04L67/1004; H04L67/1097; G06F9/544; G06F9/546; H04L67/568
Inventor: 裴旭斌; 蒋鸿城; 裘炜浩; 蒋锦霞; 方舟; 杨杰; 叶方彬; 王明
Owner: STATE GRID ZHEJIANG ELECTRIC POWER