
Data packet processing oriented buffer and application method

A data packet and packet processing technology, applied in the computer field, which solves problems such as the complexity of the data packet processing flow, and achieves the effects of reducing the number of memory copies, reducing overhead, and improving system performance.

Inactive Publication Date: 2015-11-11
NAT COMP NETWORK & INFORMATION SECURITY MANAGEMENT CENT

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is to provide a buffer for data packet processing and a method for using it, so as to solve the problem of a complicated data packet processing flow in the prior art.



Examples


Embodiment 1

[0031] As shown in Figure 1, an embodiment of the present invention provides a buffer usage method oriented to data packet processing, including:

[0032] S101. Divide the hardware buffer into a header area, a receive buffer, and a send buffer; send the start position of the header area, the start position of the receive buffer, and the start position of the send buffer to the direct memory access (DMA) controller. The header area is a reserved area; specifically, it is a part set aside when the buffer is allocated, and this part is not used as a receive or send buffer but serves only to record parameters or to identify the characteristics of the buffer. The receive buffer is used to store received data packets. The send buffer is used to store the processed data packets.

[0033] Sending the start position of the header area, the start position of the receive buffer, and the start position of the send buffer to the network coprocessor of t...
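As a rough illustration of step S101, the following C sketch partitions one contiguous, one-time allocation into the three areas and hands their start positions to the DMA controller. The region sizes and the dma_set_regions() hook are hypothetical placeholders, not part of the patent; a real driver would use whatever registration interface its DMA controller exposes.

    #include <stdint.h>

    /* Hypothetical sizes; the patent does not fix them. */
    #define HDR_AREA_SIZE  64u    /* reserved: parameter records / buffer flags only */
    #define RX_BUF_SIZE    2048u  /* stores received data packets                    */
    #define TX_BUF_SIZE    2048u  /* stores processed (to-be-sent) data packets      */
    #define BUF_TOTAL_SIZE (HDR_AREA_SIZE + RX_BUF_SIZE + TX_BUF_SIZE)

    struct pkt_buffer {
        uint8_t *hdr_area;  /* start of the reserved header area */
        uint8_t *rx_buf;    /* start of the receive buffer       */
        uint8_t *tx_buf;    /* start of the send buffer          */
    };

    /* Hypothetical hook standing in for programming the DMA controller
     * with the three start positions (S101). */
    extern void dma_set_regions(void *hdr, void *rx, void *tx);

    /* Partition one contiguous allocation (BUF_TOTAL_SIZE bytes) into the
     * three areas; nothing is allocated or copied per packet afterwards. */
    static void pkt_buffer_init(struct pkt_buffer *b, uint8_t *mem)
    {
        b->hdr_area = mem;                               /* reserved, never holds packet data     */
        b->rx_buf   = mem + HDR_AREA_SIZE;               /* hardware writes received packets here */
        b->tx_buf   = mem + HDR_AREA_SIZE + RX_BUF_SIZE; /* processed packets are staged here     */

        dma_set_regions(b->hdr_area, b->rx_buf, b->tx_buf);
    }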

Embodiment 2

[0041] As shown in Figure 2, an embodiment of the present invention provides a buffer for data packet processing, including a header area, a receive buffer, and a send buffer.

[0042] The header area, send buffer, and receive buffer are placed together and allocated at one time during initialization. The arrow in the figure points to the head address of the receive buffer that is visible to the hardware, that is, its start position; the receive address for incoming data packets and the recovery address of the buffer are both this address. The header area is a reserved area; it is not used as a receive or send buffer, and serves only to record parameters or to identify the characteristics of the buffer.

[0043] The receive buffer is used to store received data packets at contiguous physical addresses, and its start position is sent to the network coprocessor of the central processor; specifically, the network coprocessor of the central processor ...
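A minimal sketch of the layout in paragraph [0042], under the same assumptions as the sketch above: the header area carries only bookkeeping, and the hardware-visible head address of the receive buffer serves both as the receive address and as the recovery address when the buffer is recycled. The field names and the dma_recycle_buffer() hook are assumptions for illustration only.

    #include <stdint.h>

    /* Contents of the reserved header area: parameter records and
     * characteristic flags only, never packet data. */
    struct hdr_area {
        uint32_t buf_flags; /* identifies the characteristics of this buffer        */
        uint32_t rx_len;    /* length of the packet currently in the receive buffer */
        uint32_t tx_len;    /* length of the packet staged in the send buffer       */
    };

    /* Hypothetical hook: hand a buffer back to the hardware for reuse. */
    extern void dma_recycle_buffer(void *rx_head);

    /* Because the receive address and the recovery address are the same
     * head address of the receive buffer, recycling needs no extra state. */
    static void pkt_buffer_recycle(uint8_t *rx_buf_head)
    {
        dma_recycle_buffer(rx_buf_head);
    }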



Abstract

The present invention discloses a data packet processing oriented buffer and an application method. The method comprises: a central processing unit (CPU) receiving a data packet, and storing the data packet in a receive buffer by using a direct memory access (DMA) controller; processing the data packet in the receive buffer and obtaining the processed data packet; and storing the processed data packet in a transmit buffer, and sending the processed data packet by using the DMA controller. In the present invention, a hardware buffer, the transmit buffer and the receive buffer are put together, and one-time allocation is performed during the initialization so as to reduce the number of times of memory copying and memory management overheads, thereby improving system performance. In addition, during data processing, the CPU does not substantially participate in buffer management work, all of which is implemented by hardware, thereby greatly simplifying the data processing flow.
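To make the flow in the abstract concrete, the sketch below strings one receive-process-send round trip together over the pre-allocated buffer. The dma_wait_rx(), process_packet(), and dma_start_tx() names are hypothetical stand-ins; the point is only that the CPU neither allocates, copies, nor frees buffers along the way.

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical hardware hooks; a real driver exposes its own interface. */
    extern size_t dma_wait_rx(uint8_t *rx_buf);                    /* DMA stores a packet in rx_buf, returns its length */
    extern void   dma_start_tx(const uint8_t *tx_buf, size_t len); /* DMA sends the packet staged in tx_buf             */

    /* Hypothetical application hook: reads from rx, writes the result into tx. */
    extern size_t process_packet(const uint8_t *rx, size_t rx_len, uint8_t *tx);

    /* One receive-process-send round trip over the pre-allocated buffer:
     * no per-packet allocation, extra memory copy, or free by the CPU. */
    static void handle_one_packet(uint8_t *rx_buf, uint8_t *tx_buf)
    {
        size_t rx_len = dma_wait_rx(rx_buf);                    /* received packet lands in the receive buffer   */
        size_t tx_len = process_packet(rx_buf, rx_len, tx_buf); /* processed packet is placed in the send buffer */
        dma_start_tx(tx_buf, tx_len);                           /* DMA controller sends it out                   */
    }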

Description

Technical Field

[0001] The invention relates to the field of computers, and in particular to a buffer for data packet processing and a method for using it.

Background Technique

[0002] At present, after a data packet is received, it is first copied from the hardware buffer into the receive buffer, and the hardware buffer is then released. The CPU then reads the contents of the data packet from the receive buffer for processing, and forms an outgoing data packet from the processed result together with part of the contents of the receive buffer; this outgoing data packet has to be stored in a new buffer (the send buffer). After the outgoing data packet is formed, the CPU copies it into the buffer, starts DMA (Direct Memory Access) to send it out, and reclaims the send buffer after sending. [0003] In the prior art, data packets need to be copied multiple times during processing, from the hardware buffer to the receive buffer, from the send buffer to the ph...

Claims


Application Information

IPC(8): G06F13/28, G06F12/08
CPC: G06F13/282, G06F12/0848
Inventors: 周立, 邹昕, 张家琦, 刘谦, 王维晟, 朱小波
Owner: NAT COMP NETWORK & INFORMATION SECURITY MANAGEMENT CENT