
Optimization method, system and terminal for descriptor pre-reading in offload engine network card

An offload-engine optimization technology, applied to transmission systems, multi-program devices, inter-program communication, etc., that addresses the low processing efficiency of offload engine network cards, insufficient consideration of queue activity levels, and wasted storage resources.

Active Publication Date: 2022-06-28
XIDIAN UNIV


Problems solved by technology

[0005] (1) In the prior art, descriptors are stored in fixed-size FIFOs. Since different SQ queues have different degrees of activity, each queue must be allocated a large descriptor-storage capacity to cover the worst case, which wastes a great deal of storage resources and reduces storage efficiency.
[0006] (2) The existing technology does not fully consider the activity of each session and stores all descriptors uniformly, which lacks flexibility.
[0007] (3) Occupying too many storage resources lowers the processing efficiency of the entire offload engine network card.
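The contrast between the fixed-FIFO scheme criticized above and an activity-aware allocation can be sketched numerically. This is an illustrative model only; the function names, queue counts, and threshold values are assumptions, not taken from the patent.

```python
def fixed_fifo_usage(active_descriptors, num_queues, fifo_capacity):
    """Fixed scheme: every SQ gets a full-size FIFO regardless of activity."""
    allocated = num_queues * fifo_capacity   # worst-case reservation per queue
    used = sum(active_descriptors)           # descriptors actually held
    return allocated, used

def dynamic_usage(active_descriptors, min_threshold, max_threshold):
    """Dynamic scheme: each SQ holds between min and max descriptors,
    so its allocation tracks its actual activity."""
    allocated = sum(min(max(n, min_threshold), max_threshold)
                    for n in active_descriptors)
    used = sum(active_descriptors)
    return allocated, used

# Four SQ queues with very different activity levels.
activity = [2, 64, 1, 30]
print(fixed_fifo_usage(activity, num_queues=4, fifo_capacity=64))   # (256, 97)
print(dynamic_usage(activity, min_threshold=4, max_threshold=64))   # (102, 97)
```

With fixed FIFOs, 256 slots are reserved to hold 97 descriptors; with per-queue windows, the reservation shrinks to 102 slots, which is the waste the invention targets.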




Embodiment Construction

[0060] In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention, not to limit it.

[0061] In view of the problems existing in the prior art, the present invention provides an optimization method for pre-reading descriptors in a TCP/IP offload engine network card. The present invention is described in detail below with reference to the accompanying drawings.

[0062] As shown in Figure 1, the optimization method for descriptor pre-reading in the TCP/IP offload engine network card provided by an embodiment of the present invention includes:

[0063] S101: in the DMA receiving module of the TCP/IP offload engine, pre-read the descriptors;

[0064] S102: after a successful pre-read, update the informa...
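Steps S101 and S102, combined with the threshold check described in the abstract, can be sketched as a small control loop. The queue structure, field names, and the fetch callback are assumptions made for illustration; the patent itself describes hardware behavior, not this API.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class SqQueue:
    min_threshold: int                  # software-managed lower bound (refill trigger)
    max_threshold: int                  # software-managed upper bound
    descriptors: deque = field(default_factory=deque)

def preread_step(queue, fetch_descriptor):
    """S101: pre-read one descriptor via the DMA receive path.
    S102: on success, record it in the queue and report whether another
    SQ pre-read request should be issued (i.e. the stored count is still
    below the queue's maximum storage threshold)."""
    desc = fetch_descriptor()           # DMA read of the next descriptor
    if desc is None:
        return False                    # pre-read failed: stop requesting
    queue.descriptors.append(desc)      # update descriptor info in the queue
    return len(queue.descriptors) < queue.max_threshold

# Usage sketch: five descriptors available, but the queue's window caps at 3.
fetched = iter([{"id": i} for i in range(5)])
q = SqQueue(min_threshold=1, max_threshold=3)
while preread_step(q, lambda: next(fetched, None)):
    pass
print(len(q.descriptors))               # prints 3
```

The loop stops issuing SQ pre-read requests as soon as the per-queue maximum is reached, so an idle session cannot monopolize the shared buffer.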


Abstract

The invention belongs to the technical field of data transmission networks and discloses an optimization method, system and terminal for pre-reading descriptors in an offload engine network card. Descriptors are pre-read in the DMA receiving module of a TCP/IP offload engine; after a successful pre-read, the descriptor information in the queue is updated and the number of stored descriptors is checked against the maximum and minimum storage thresholds; if the thresholds are satisfied, SQ pre-read requests continue to be sent to the DMA receiving module. The invention solves the problem of excessive cache-resource consumption when storing descriptors in existing TCP/IP offload engine network cards. It enables dynamic storage of descriptors in the buffer area, reduces wasted storage resources, and allows software to dynamically manage the maximum and minimum thresholds of each queue's storage space according to the activity of different sessions, flexibly adjusting the number of descriptors pre-read per queue so that SQ descriptor prefetching is more efficient and balanced.
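The abstract's per-session threshold management can be sketched as a simple scaling policy: busier sessions get wider [min, max] windows. The scaling rule, parameter names, and the quarter-of-max refill point below are assumptions for illustration; the patent does not specify a particular policy.

```python
def adjust_thresholds(activity_ratio, total_slots, floor=2):
    """Scale one queue's [min, max] storage window by its share of
    observed session activity.
    activity_ratio: fraction of recent traffic attributable to this session."""
    max_t = max(floor, int(total_slots * activity_ratio))  # upper bound scales with activity
    min_t = max(1, max_t // 4)          # refill trigger at a quarter of the maximum
    return min_t, max_t

print(adjust_thresholds(0.50, 128))     # busy session -> (16, 64)
print(adjust_thresholds(0.01, 128))     # idle session -> (1, 2)
```

Software re-running such a policy periodically would keep prefetch depth proportional to session activity, which is the balancing effect the abstract claims.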

Description

Technical Field

[0001] The invention belongs to the technical field of data transmission networks, and in particular relates to an optimization method, system and terminal for pre-reading descriptors in an offload engine network card.

Background Technique

[0002] Today, with Ethernet in widespread use, processing TCP/IP packets with an ordinary network card increases the processing burden on the server. To limit the CPU's data-processing load on the network link, TCP/IP offload engine technology offloads the work of the TCP/IP stack to the adapter or system hardware, implementing TCP/IP acceleration through network processors, firmware and application-specific integrated circuits. TCP/IP offload engine technology lets the application system make full use of network capacity, improving network performance and reducing network cost, and it has gradually shown great potential in today's Ethern...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04L69/12; H04L69/16; H04L67/14; H04L47/62; G06F13/28; G06F9/54; G06F9/50
CPC: H04L69/12; H04L69/161; H04L67/14; H04L47/6245; G06F13/28; G06F9/5022; G06F9/546; G06F9/5016; G06F2209/548
Inventors: 潘伟涛, 祝靖源, 邱智亮, 殷建飞, 郑圆圆
Owner: XIDIAN UNIV