
A Message Passing Method Based on Shared Memory

A technology of message passing and shared memory, applied in the field of shared-memory-based message passing, which can solve problems such as increased message delay and achieve effects such as reduced message delay, fast message transmission, and improved communication efficiency.

Active Publication Date: 2021-07-09
江苏未来智慧信息科技有限公司

AI Technical Summary

Problems solved by technology

In some scenarios that require rapid response, the message delay increases.

Method used


Image


Examples


Embodiment 1

[0030] As shown in Figure 1, a message passing method based on shared memory includes using shared memory as the data storage carrier to provide a fast message delivery service to both parties of the message exchange;
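As a minimal illustration of using shared memory as the data storage carrier, the sketch below (in C, assuming a POSIX environment) creates and maps a shared-memory region that both message-passing parties can attach to. The name "/msg_shm" and the size are illustrative choices, not taken from the patent.

```c
/* Minimal sketch (assumed details): obtain a shared-memory region that both
 * message-passing parties map into their address space. */
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>

#define SHM_NAME "/msg_shm"          /* illustrative name */
#define SHM_SIZE (1 << 20)           /* 1 MiB carrier, arbitrary */

static void *attach_carrier(void)
{
    int fd = shm_open(SHM_NAME, O_CREAT | O_RDWR, 0600);
    if (fd < 0) { perror("shm_open"); exit(1); }
    if (ftruncate(fd, SHM_SIZE) < 0) { perror("ftruncate"); exit(1); }

    void *base = mmap(NULL, SHM_SIZE, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
    if (base == MAP_FAILED) { perror("mmap"); exit(1); }
    close(fd);                       /* the mapping stays valid after close */
    return base;
}
```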

[0031] The messaging service process is completed by calling the API from the server's business process;

[0032] Encapsulate the shared memory into a message channel and mark the message channel;
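One way to picture the channel encapsulation and the channel mark is the layout below, continuing the C sketch above. The field names, capacities, and ring-buffer organization are assumptions for illustration, not the patent's actual structure.

```c
/* Minimal sketch (assumed layout): the shared region is carved into fixed
 * message channels; each channel carries a "mark" word telling readers
 * whether the channel currently holds data. */
#include <stdint.h>

#define CHANNEL_CAPACITY 256         /* messages per channel, arbitrary */
#define MSG_MAX          512         /* bytes per message, arbitrary    */

enum channel_mark { CH_EMPTY = 0, CH_HAS_DATA = 1 };

struct message {
    uint32_t len;
    char     payload[MSG_MAX];
};

struct msg_channel {
    volatile uint32_t mark;          /* CH_EMPTY / CH_HAS_DATA            */
    uint32_t owner_id;               /* client id the channel belongs to  */
    uint32_t head, tail;             /* ring-buffer cursors               */
    struct message slots[CHANNEL_CAPACITY];
};

/* Encapsulate the raw shared memory as an array of marked channels. */
static inline struct msg_channel *channel_at(void *shm_base, unsigned idx)
{
    return (struct msg_channel *)shm_base + idx;
}
```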

[0033] When performing message read and write operations, use the batch read and write function to read and write multiple pieces of data at one time;
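A hedged sketch of such a batch write, using the assumed channel layout above, copies several messages into the ring buffer and updates the mark once for the whole batch; a real implementation would additionally guard the operation with the semaphores and memory locks mentioned in the abstract.

```c
/* Minimal sketch (assumed semantics): write up to `count` messages in one
 * call, then set the channel mark once rather than per message. */
static size_t channel_write_batch(struct msg_channel *ch,
                                  const struct message *msgs, size_t count)
{
    size_t written = 0;
    while (written < count) {
        uint32_t next = (ch->tail + 1) % CHANNEL_CAPACITY;
        if (next == ch->head)        /* ring full: stop the batch here */
            break;
        ch->slots[ch->tail] = msgs[written];
        ch->tail = next;
        written++;
    }
    if (written > 0)
        ch->mark = CH_HAS_DATA;      /* one mark update for the whole batch */
    return written;
}
```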

[0034] When executing a message read operation, check the channel's mark to determine whether the current channel holds data: if there is data, read it; if there is no data, skip the current channel and check the mark of the next channel;
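The corresponding read side, again under the same assumptions, could look like the sketch below: the reader checks each channel's mark, drains channels that hold data in a batch, and skips empty channels.

```c
/* Minimal sketch (assumed semantics): batch-read a channel, clearing the
 * mark once it is drained. */
static size_t channel_read_batch(struct msg_channel *ch,
                                 struct message *out, size_t max)
{
    size_t n = 0;
    while (n < max && ch->head != ch->tail) {
        out[n++] = ch->slots[ch->head];
        ch->head = (ch->head + 1) % CHANNEL_CAPACITY;
    }
    if (ch->head == ch->tail)
        ch->mark = CH_EMPTY;         /* channel drained: clear the mark */
    return n;
}

/* Poll all channels, skipping those whose mark says they are empty. */
static void poll_channels(void *shm_base, unsigned nchannels)
{
    struct message batch[32];
    for (unsigned i = 0; i < nchannels; i++) {
        struct msg_channel *ch = channel_at(shm_base, i);
        if (ch->mark != CH_HAS_DATA)
            continue;                /* no data: skip to the next channel */
        size_t n = channel_read_batch(ch, batch, 32);
        /* ... dispatch the n messages to the business process ... */
        (void)n;
    }
}
```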

[0035] For scenarios where messages cannot be lost, messages that cannot be processed in time and cannot be discarded are sent to the distributed message system Kafka for storage.
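For the loss-intolerant case, a possible policy (an assumption, not the patent's exact mechanism) is sketched below: when the shared-memory channel is full, the message is handed to a spill path instead of being dropped. forward_to_kafka() is a hypothetical stub standing in for a real Kafka producer (for example one built on librdkafka); it is not defined in the patent.

```c
/* Minimal sketch (assumed policy): never drop a reliable message; spill it
 * to Kafka when the shared-memory channel has no room. */
static int forward_to_kafka(const struct message *m);   /* hypothetical stub */

static void send_reliable(struct msg_channel *ch, const struct message *m)
{
    if (channel_write_batch(ch, m, 1) == 1)
        return;                      /* delivered through shared memory */
    forward_to_kafka(m);             /* channel full: spill to Kafka    */
}
```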

Embodiment 2

[0061] The difference between Embodiment 2 and Embodiment 1 is that, as shown in Figure 2, Embodiment 2 applies the present invention to service communication for a NoSQL database; the specific workflow is as follows:

[0062] The client may be a listener for cross-network services, a console accessed from the same host, or a program API accessed from the same host. When each client initializes, it initializes its own read channel according to its own client id.

[0063] The server receives a data request from a channel, and writes the message back to the channel of the corresponding client after processing.

[0064] In Figure 2, Dblistener is the database listener, DbConsole is the database console, Db API is the database application program interface, and DbServer is the database server.
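Under the same assumed channel layout, the Embodiment 2 workflow could be sketched as follows: each client (Dblistener, DbConsole, or Db API) initializes the read channel indexed by its client id, and DbServer writes each reply back into the requesting client's channel. The helper names are illustrative.

```c
/* Minimal sketch (assumed mapping): a client owns the read channel whose
 * index equals its client id. */
static struct msg_channel *client_init_read_channel(void *shm, uint32_t client_id)
{
    struct msg_channel *ch = channel_at(shm, client_id);
    ch->owner_id = client_id;
    ch->head = ch->tail = 0;
    ch->mark = CH_EMPTY;
    return ch;
}

/* Server side: after processing a request taken from some channel, write the
 * reply back into the requesting client's own channel. */
static void server_reply(void *shm, uint32_t client_id, const struct message *reply)
{
    channel_write_batch(channel_at(shm, client_id), reply, 1);
}
```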

[0065] The message transmission method based on shared memory of the present invention solves the technical problem of high-speed data transmission between processes.



Abstract

The invention discloses a message passing method based on shared memory, which belongs to the technical field of message middleware. The method uses shared memory as the data storage carrier, and the message delivery service process is completed by the server's business process calling the API. The shared memory is encapsulated into message channels, and each message channel is marked. When performing message read and write operations, the batch read/write function is used, and whether the current channel holds data is determined by checking the channel mark. For scenarios where messages cannot be lost, messages that cannot be processed in time and cannot be discarded are sent to the distributed message system Kafka for storage. This solves the technical problem of high-speed data transfer between processes. The present invention uses shared memory as the data storage carrier, encapsulates the shared memory into message channels, and cooperates with system semaphores, memory locks, memory marks, and timing control so that data can be transferred between processes at high speed; the speed of message passing between processes is the speed of memory copying.
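The abstract's mention of cooperating with system semaphores can be illustrated (as an assumption, since the text shown here does not give the exact mechanism) with one named POSIX semaphore per channel: the writer posts after a batch write and the reader waits before checking the channel mark, so the reader does not have to busy-poll. The name format is illustrative.

```c
/* Minimal sketch (assumed synchronization): one named semaphore per channel. */
#include <semaphore.h>
#include <fcntl.h>
#include <stdio.h>

static sem_t *open_channel_sem(unsigned channel_idx)
{
    char name[32];
    snprintf(name, sizeof name, "/msg_ch_%u", channel_idx);   /* illustrative */
    return sem_open(name, O_CREAT, 0600, 0);
}

/* writer: after channel_write_batch() succeeds  -> sem_post(ch_sem); */
/* reader: before checking the channel mark      -> sem_wait(ch_sem); */
```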

Description

Technical field

[0001] The invention belongs to the technical field of message middleware and relates to a message delivery method based on shared memory.

Background technique

[0002] Kafka was originally developed by LinkedIn. It is a distributed, partition-supporting, replica-based messaging system that relies on ZooKeeper for coordination. Its biggest feature is that it can process large amounts of data in real time to meet various demand scenarios, such as Hadoop-based batch processing systems, low-latency real-time systems, Storm/Spark streaming engines, web/nginx logs, access logs, and message services. It is written in the Scala language; LinkedIn contributed it to the Apache Foundation in 2010, and it became a top-level open source project.

[0003] In terms of message middleware, there are many open source products on the market, such as Kafka, which is distributed and disk-based, rabbitMQ, which is memory-based, rocketMQ, redis, and so on. In most of the backend solutions a...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/54
CPC: G06F9/542; G06F9/544; G06F9/546; G06F2209/548
Inventor: 石永恒, 王凤雷, 王锋平, 林世颖, 时春
Owner 江苏未来智慧信息科技有限公司