Optimization method and system for pre-reading descriptor in unloading engine network card and terminal

A technology relating to unloading engines and optimization methods, applied in transmission systems, digital transmission systems, data exchange networks, etc., which can solve problems such as wasted storage resources, reduced storage efficiency, and excessive occupation of storage resources.

Active Publication Date: 2021-08-06
XIDIAN UNIV


Problems solved by technology

[0005] (1) In the prior art, descriptors are stored in fixed-size FIFOs. Since different SQ queues have different degrees of activity, a large capacity must be allocated to each descriptor storage resource to meet the design's storage requirements, which wastes storage resources and reduces storage efficiency.
[0006] (2) The existing technology does not fully consider the activity of each session and stores all descriptors uniformly, which lacks flexibility.
[0007] (3) Occupying too many storage resources lowers the processing efficiency of the entire unloading engine network card.




Embodiment Construction

[0060] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the embodiments. It should be understood that the specific embodiments described herein are intended only to illustrate the invention, not to limit it.

[0061] To address the problems in the prior art, the present invention provides an optimization method for descriptor pre-reading in the TCP/IP unloading engine network card, which is described in detail below with reference to the accompanying drawings.

[0062] As shown in Figure 1, the optimization method for descriptor pre-reading in the TCP/IP unloading engine network card provided by the present invention includes:

[0063] S101, in the DMA receiving module of the TCP/IP unloading engine, performing pre-read processing on the descriptors;

[0064] S102, after the pre-reading succeeds, updating the information of the descriptors in the ...



Abstract

The invention belongs to the technical field of data transmission networks, and discloses an optimization method, system and terminal for pre-reading descriptors in an unloading engine network card, and the optimization method comprises the following steps of pre-reading the descriptors in a DMA (Direct Memory Access) receiving module of a TCP/IP (Transmission Control Protocol/Internet Protocol) unloading engine; after pre-reading succeeds, updating information of descriptors in the queue, and judging whether the number of the descriptors meets the maximum storage threshold and the minimum storage threshold or not; and if yes, continuing to send the SQ pre-reading request to the DMA receiving module. The problem that too many cache resources are occupied when descriptors are stored in an existing TCP/IP unloading engine network card is solved. According to the method, dynamic storage of the descriptors in the cache region can be achieved, waste of storage resources is reduced, software can dynamically manage the maximum threshold and the minimum threshold of the storage space of each queue according to the activity degrees of different sessions, then the number of the descriptors pre-read by each queue is flexibly adjusted, and pre-reading of the SQ descriptors is more efficient and balanced.
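The threshold-driven pre-read policy described in the abstract can be sketched as follows. This is a minimal illustrative model, not the patented hardware design; the class and method names (`SQQueue`, `should_send_preread`, etc.) are assumptions introduced for illustration. Each queue caches descriptors between a software-tunable minimum and maximum threshold: when the cached count falls below the minimum, another SQ pre-read request is issued to refill toward the maximum.

```python
class SQQueue:
    """Illustrative per-SQ-queue descriptor cache with tunable thresholds."""

    def __init__(self, max_threshold: int, min_threshold: int):
        self.max_threshold = max_threshold  # upper bound on cached descriptors
        self.min_threshold = min_threshold  # level that triggers a new pre-read
        self.stored = 0                     # descriptors currently cached

    def set_thresholds(self, max_threshold: int, min_threshold: int) -> None:
        # Software retunes thresholds per session activity, so active
        # queues get more pre-read capacity than idle ones.
        self.max_threshold = max_threshold
        self.min_threshold = min_threshold

    def on_preread_success(self, n: int) -> None:
        # S102: after a successful pre-read, update the descriptor count.
        self.stored += n

    def on_consume(self, n: int) -> None:
        # The DMA receiving module consumes cached descriptors.
        self.stored -= n

    def should_send_preread(self) -> bool:
        # Continue issuing SQ pre-read requests only while the cached
        # count has dropped below the minimum threshold.
        return self.stored < self.min_threshold

    def preread_batch_size(self) -> int:
        # Refill up to, but never beyond, the maximum threshold.
        return self.max_threshold - self.stored


q = SQQueue(max_threshold=16, min_threshold=4)
q.on_preread_success(q.preread_batch_size())  # fill cache to 16
q.on_consume(13)                              # 3 descriptors remain
assert q.should_send_preread()                # 3 < 4, request more
```

Because the thresholds are per-queue state rather than a fixed FIFO depth, idle queues can be shrunk and busy queues grown, which is the dynamic-storage behavior the abstract attributes to the method.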

Description

Technical field

[0001] The present invention belongs to the technical field of data transmission networks, and more particularly relates to an optimization method, system, and terminal for descriptor pre-reading in an unloading engine network card.

Background technique

[0002] Currently, using a regular network card to handle TCP/IP packets increases the processing burden of the server. Therefore, in order to limit the amount of data the CPU must process on the network link, TCP/IP unloading engine technology offloads the work of the TCP/IP stack to the adapter or system hardware, implementing TCP/IP acceleration through network processors, firmware, and application-specific integrated circuits. TCP/IP unloading engine technology allows the application system to fully utilize the network's capacity, reducing networking cost while improving network performance, and it is gradually showing huge potential in today's Ethernet environment.

[0003] In the architect...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L29/06; H04L29/08; H04L12/863; G06F9/50; G06F9/54; G06F13/28
CPC: H04L69/12; H04L69/161; H04L67/14; H04L47/6245; G06F13/28; G06F9/5022; G06F9/546; G06F9/5016; G06F2209/548
Inventors: 潘伟涛, 祝靖源, 邱智亮, 殷建飞, 郑圆圆
Owner XIDIAN UNIV