
Dynamic load allocating method for network processor based on cache and apparatus thereof

A network processor load distribution technology, applied in data exchange networks, digital transmission systems, electrical components, etc. It solves problems such as out-of-order packets and unbalanced load, and has the effects of avoiding packet-processing bottlenecks, avoiding severe oscillation, and reducing the probability of out-of-order packets.

Publication Date: 2006-01-11 (Inactive)
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is: aiming at the load imbalance and packet out-of-order problems that tend to occur among the PEs under the parallel PE structure of a network processor in the prior art, the present invention provides a cache-based dynamic load distribution method and device for a network processor that can efficiently maintain load balance and has a strong capability to preserve packet-flow order.

Embodiment Construction

[0026] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0027] The technical solution of the present invention is to dynamically monitor the load status of each processing unit in the network processor and, according to that load status, adaptively adjust how the wire-speed packet flow is distributed to the processing units. At the same time, a cache structure is used to maintain the allocation state of packets to each processing unit, and the allocation state is updated by adjusting the invalidation and replacement timing of cache entries. The distribution method of the present invention is applied in the load distributor inside the network processor chip. As shown in Figure 1, IP packets arriving at the input interface of the network processor undergo load adjustment in the load distribution unit and are then sent to the processing units for processing, and the processed packets are sent to the o...
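
As a rough illustration of how the allocation state kept in the cache might be updated, the sketch below invalidates an entry once its flow has no packets left at the assigned processing unit, so that a later packet of the same flow can be redirected to a lighter PE without reordering packets already in flight. The pending-counter bookkeeping, field names, and table size are assumptions made for illustration, not details taken from the patent text.

```c
#include <stdint.h>
#include <stdbool.h>

#define CACHE_ENTRIES 65536            /* one entry per 16-bit flow identification value */

struct alloc_entry {
    bool     valid;                    /* flow currently bound to a processing unit */
    uint8_t  pe;                       /* processing unit the flow is bound to */
    uint32_t pending;                  /* packets of this flow still being processed */
};

static struct alloc_entry alloc_cache[CACHE_ENTRIES];

/* Called by the load distributor when a packet of flow `fid` is sent to its PE. */
void on_packet_dispatched(uint16_t fid)
{
    alloc_cache[fid].valid = true;
    alloc_cache[fid].pending++;
}

/* Called when a PE reports completion of a packet of flow `fid`.  Once the last
   pending packet finishes, the entry is invalidated, so a later packet of the
   same flow is reassigned according to the PE loads observed at that time. */
void on_packet_completed(uint16_t fid)
{
    struct alloc_entry *e = &alloc_cache[fid];
    if (e->pending > 0 && --e->pending == 0)
        e->valid = false;              /* invalidation/replacement point */
}
```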

Abstract

This invention discloses a cache-based dynamic load distribution method for a network processor. After the packet buffer extracts the source IP address and destination IP address, a flow identification hash computation unit computes a 16-bit flow identification value, which is used to look up the load distribution cache and determine which processing unit handles the packet. If the cache lookup misses, a load computation unit that monitors the processing load state in real time supplies the number of the currently lightest-loaded processing unit as the target processing unit. A dynamic load distribution device obtained by this method includes a flow identification hash computation unit, a load distribution cache unit, a load computation unit, a packet buffer unit and an issue unit.
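
To make the dispatch decision described in the abstract concrete, the sketch below extracts the source and destination IP addresses, folds them into a 16-bit flow identification value, looks that value up in the load distribution cache, and on a miss assigns the currently lightest-loaded processing unit. The hash scheme, table size, number of PEs, and the way load is counted are illustrative assumptions; the excerpt shown here does not fix those details.

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_PE        8        /* number of parallel processing units (assumed) */
#define CACHE_ENTRIES 65536    /* one slot per 16-bit flow identification value */

struct lb_cache_entry {
    bool    valid;             /* entry currently maps a flow to a PE */
    uint8_t pe;                /* target processing unit for this flow */
};

static struct lb_cache_entry lb_cache[CACHE_ENTRIES];
static uint32_t pe_load[NUM_PE];   /* per-PE load, kept by the load computation logic */

/* 16-bit flow identification hash over source and destination IP addresses
   (assumed scheme: XOR-fold the two 32-bit addresses into 16 bits). */
static uint16_t flow_id_hash(uint32_t src_ip, uint32_t dst_ip)
{
    uint32_t x = src_ip ^ dst_ip;
    return (uint16_t)((x >> 16) ^ (x & 0xFFFFu));
}

/* Index of the currently lightest-loaded processing unit. */
static uint8_t lightest_pe(void)
{
    uint8_t best = 0;
    for (uint8_t i = 1; i < NUM_PE; i++)
        if (pe_load[i] < pe_load[best])
            best = i;
    return best;
}

/* Decide which processing unit should handle a packet with the given addresses. */
uint8_t dispatch(uint32_t src_ip, uint32_t dst_ip)
{
    uint16_t fid = flow_id_hash(src_ip, dst_ip);
    struct lb_cache_entry *e = &lb_cache[fid];

    if (!e->valid) {                   /* cache miss: bind the flow to the lightest PE */
        e->pe = lightest_pe();
        e->valid = true;
    }
    pe_load[e->pe]++;                  /* simplistic accounting for the new packet */
    return e->pe;                      /* a hit keeps the whole flow on one PE */
}
```

Because a cache hit always returns the PE already bound to the flow, packets of the same flow are not spread across processing units, which is what keeps them in order.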

Description

Technical field

[0001] The invention mainly relates to the field of network processors, and in particular to a cache-based dynamic load distribution method and device for network processors.

Background technique

[0002] A network processor is a high-performance application-specific instruction processor (ASIP) for network protocol processing. It is widely used in various network devices, such as router line cards, firewall devices, and network interface cards. A network processor contains multiple processing units (PE, Processor Element). There are usually two ways to organize the processing units: parallel and pipeline structures. The present invention is aimed at the parallel PE structure, as shown in Figure 1. In the parallel PE structure, each processing unit can independently complete packet processing operations without the intervention of other units, but they share the receiving and sending structures of the packet flow. The network processor needs to distribute the...

Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L12/56, H04L12/803
Inventors: 唐玉华, 张晓明, 孙志刚, 胡晓峰, 管剑波
Owner: NAT UNIV OF DEFENSE TECH