
In-network pooling resource allocation optimization method based on contribution perception in computing power network

A resource allocation and optimization technology, applied in the field of communication networks, that addresses the problems of difficult and inefficient resource allocation between resource pools, achieving high resource utilization and improved system performance.

Pending Publication Date: 2022-04-26
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problems in the prior art that it is difficult to allocate resources between resource pools and that resource allocation is inefficient, the present invention proposes a contribution-aware in-network pooled resource allocation optimization method for a computing power network. A dynamic resource pool is designed for the CPN, and, based on an attention mechanism, the limited resources are placed on the resource pools that contribute more to the system, so as to allocate resources and maximize the long-term utility of the system.
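For illustration only, the Python sketch below shows one way such contribution-aware allocation could work: per-pool state features are scored against a learned attention query, and a fixed resource budget is split in proportion to the resulting softmax weights. The function and variable names (attention_allocate, pool_features, query, budget) are assumptions for this sketch, not taken from the patent text.

```python
# Minimal sketch (not the patented implementation): attention-weighted split of a
# fixed resource budget across resource pools, where pools whose state features
# indicate a larger contribution to system utility receive a larger share.
import numpy as np

def attention_allocate(pool_features: np.ndarray, query: np.ndarray, budget: float) -> np.ndarray:
    """pool_features: (num_pools, d) per-pool state, e.g. queue length, cache hit rate.
    query: (d,) query vector standing in for the scheduler's learned context.
    Returns per-pool resource shares summing to `budget`."""
    scores = pool_features @ query / np.sqrt(pool_features.shape[1])  # scaled dot-product scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                          # softmax over pools
    return budget * weights                                           # contribution-weighted split

# Toy example: three pools, the second one "contributes" most under this query.
features = np.array([[0.2, 0.1], [0.9, 0.8], [0.4, 0.3]])
print(attention_allocate(features, query=np.array([1.0, 1.0]), budget=100.0))
```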

Detailed Description of the Embodiments

[0095] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0096] A contribution-aware in-network pooled resource allocation optimization method in a computing power network, comprising the following steps:

[0097] S1. Construct a computing power network resource allocation system including an infrastructure layer, a resource pool layer, and a CPN scheduler layer. The resource pool layer includes several resource pools, and each resource pool is composed of resources provided by the infrastructure layer;
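As a hedged illustration of the three-layer structure in S1, the sketch below models infrastructure nodes that contribute computing power and cache capacity, resource pools that aggregate them, and a scheduler that observes per-pool totals. All class and field names are hypothetical and chosen only to mirror the layers named above.

```python
# Illustrative sketch of the three-layer system in S1; names are assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class InfrastructureNode:          # infrastructure layer: a device contributing resources
    node_id: str
    cpu_cycles: float              # available computing power
    cache_capacity: float          # available cache/storage

@dataclass
class ResourcePool:                # resource pool layer: aggregates infrastructure resources
    pool_id: str
    nodes: List[InfrastructureNode] = field(default_factory=list)

    @property
    def total_cpu(self) -> float:
        return sum(n.cpu_cycles for n in self.nodes)

    @property
    def total_cache(self) -> float:
        return sum(n.cache_capacity for n in self.nodes)

@dataclass
class CPNScheduler:                # CPN scheduler layer: observes pools, allocates resources
    pools: List[ResourcePool]

    def pool_states(self) -> List[Tuple[str, float, float]]:
        return [(p.pool_id, p.total_cpu, p.total_cache) for p in self.pools]
```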

[0098] Such as ...

Abstract

The invention discloses a contribution-aware in-network pooled resource allocation optimization method in a computing power network. The method comprises the following steps: constructing a computing power network resource allocation system comprising an infrastructure layer, a resource pool layer and a CPN scheduler layer; constructing the computing power queue length of each resource pool in the resource pool layer based on the dynamics of tasks and mobile devices; establishing a computing power model and a cache model in the CPN scheduler layer using a deep reinforcement learning algorithm, and formulating each model as a Markov process; constructing a utility function of the computing power network resource allocation system based on the computing power model and the cache model; combining the proximal policy optimization (PPO) algorithm with the attention mechanism to construct an attention-based proximal policy optimization algorithm; and, taking maximization of the long-term utility function as the objective, solving the long-term utility function with the attention-based proximal policy optimization algorithm. The method achieves higher system utility and can remarkably improve the performance of integrated computing tasks and caching.
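To make the abstract's pipeline more concrete, the following hedged sketch shows (i) a generic queue-length update of the kind a resource pool's computing power queue might follow and (ii) the clipped surrogate objective of proximal policy optimization, with pool states scored by a simple dot-product attention module. The network architecture, shapes, and hyperparameters are illustrative assumptions, not the patented design.

```python
# Hedged sketch of the abstract's building blocks: queue dynamics, an
# attention-scored policy over pools, and the PPO clipped surrogate loss.
import torch
import torch.nn as nn

class AttentionPolicy(nn.Module):
    """Scores resource pools with dot-product attention, outputs allocation logits."""
    def __init__(self, state_dim: int, hidden: int = 32):
        super().__init__()
        self.embed = nn.Linear(state_dim, hidden)
        self.query = nn.Parameter(torch.randn(hidden))

    def forward(self, pool_states: torch.Tensor) -> torch.Tensor:
        # pool_states: (num_pools, state_dim) -> one logit per pool
        keys = self.embed(pool_states)                        # (num_pools, hidden)
        return keys @ self.query / keys.shape[-1] ** 0.5      # attention scores as logits

def queue_update(q: torch.Tensor, arrivals: torch.Tensor, served: torch.Tensor) -> torch.Tensor:
    """Standard queue dynamics: next length = max(current - served, 0) + arrivals."""
    return torch.clamp(q - served, min=0.0) + arrivals

def ppo_clipped_loss(new_logp, old_logp, advantage, eps: float = 0.2) -> torch.Tensor:
    """PPO clipped surrogate: maximize min(r*A, clip(r)*A); returned as a loss to minimize."""
    ratio = torch.exp(new_logp - old_logp)
    unclipped = ratio * advantage
    clipped = torch.clamp(ratio, 1 - eps, 1 + eps) * advantage
    return -torch.min(unclipped, clipped).mean()

# Toy usage: one decision step over 3 pools with 2-dimensional states.
policy = AttentionPolicy(state_dim=2)
states = torch.tensor([[0.2, 0.1], [0.9, 0.8], [0.4, 0.3]])
dist = torch.distributions.Categorical(logits=policy(states))
action = dist.sample()
loss = ppo_clipped_loss(dist.log_prob(action), dist.log_prob(action).detach(),
                        advantage=torch.tensor(1.0))
```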

Description

Technical field

[0001] The invention belongs to the technical field of communication networks, and in particular relates to a contribution-aware in-network pooled resource allocation optimization method in a computing power network.

Background technique

[0002] The development of B5G/6G technology will support various large-scale, data-intensive applications over the next decade, such as virtual and augmented reality, brain-computer interfaces, and remote surgery. At the same time, these applications will generate zettabytes (ZB) of digital information and require more computing power to support them. The development of 6G benefits not only from the technologies brought by 5G, but also from the need for more new technologies. The Computing Power Network (CPN) provides an effective solution for the deep integration of computing power and the network. CPN accelerates the spread of computing power from a set of data centers to multiple network edges and end users, ...

Application Information

IPC(8): H04W28/08, H04L67/10
CPC: H04W28/0925, H04L67/10
Inventors: 仇超, 王晓飞, 狄筝, 罗韬, 刘铸滔, 任晓旭
Owner: TIANJIN UNIV