
A Decentralized Device Collaborative Deep Learning Inference Method

A deep learning and device technology applied in the field of artificial intelligence and edge computing. It addresses the problems that the input data of edge devices is often similar or even repeated, and that edge devices cannot be fully utilized, with the effect of reducing model complexity and the size of intermediate results.

Active Publication Date: 2022-07-22
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0007] These methods can accelerate deep learning models to some degree, but they neither exploit the interconnection between edge devices nor account for the fact that the input data of edge devices performing intelligent tasks within a given period is often similar or even repeated.




Embodiment Construction

[0016] The present invention mainly comprises two steps: selecting decentralized devices, and computing the distributed neural network based on a cache.

[0017] Decentralized device selection

[0018] Because devices in edge scenarios are generally heterogeneous, that is, they differ in computing performance, network performance, and storage performance, cooperative devices must be selected reasonably before a collaborative intelligent computing task is initiated. Each device can act as either a task-initiating device or a task-cooperating device. As shown in Figure 1, in order to let other devices know of its existence, each device registers its IP address, port number, and other device performance information with the registration center, and establishes a stable connection under conditions that ensure network reliability. The task-initiating device obtains the information of other devices through the registration center, first screening out the devi...
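The registration and screening flow above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the names (`DeviceInfo`, `Registry`, `select_peer`), the memory threshold, and the scoring rule that combines compute and bandwidth are all assumptions introduced for clarity.

```python
from dataclasses import dataclass

@dataclass
class DeviceInfo:
    """Information a device registers with the registration center."""
    ip: str
    port: int
    compute_gflops: float   # computing performance
    bandwidth_mbps: float   # network performance
    free_mem_mb: float      # storage performance

class Registry:
    """In-memory stand-in for the registration center."""
    def __init__(self):
        self.devices = {}

    def register(self, name, info):
        self.devices[name] = info

    def peers(self, exclude):
        # Every device except the one asking can act as a cooperating device.
        return {n: d for n, d in self.devices.items() if n != exclude}

def select_peer(registry, me, min_mem_mb=100.0):
    """First screen out devices that lack storage, then pick the candidate
    with the best combined compute/bandwidth score (an assumed rule)."""
    candidates = {n: d for n, d in registry.peers(me).items()
                  if d.free_mem_mb >= min_mem_mb}
    if not candidates:
        return None
    return max(candidates,
               key=lambda n: candidates[n].compute_gflops
                             + 0.1 * candidates[n].bandwidth_mbps)

# Example: a fast device with enough memory beats a constrained one.
reg = Registry()
reg.register("a", DeviceInfo("10.0.0.1", 5000, 8.0, 100.0, 512.0))
reg.register("b", DeviceInfo("10.0.0.2", 5000, 2.0, 300.0, 64.0))
reg.register("me", DeviceInfo("10.0.0.3", 5000, 1.0, 50.0, 256.0))
print(select_peer(reg, "me"))  # device "b" is screened out (64 MB < 100 MB)
```

A real system would replace the in-memory dictionary with a networked registry service, but the screen-then-rank shape stays the same.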



Abstract

The invention discloses a method for the distributed deployment of a cache-based deep neural network on decentralized edge devices. The method first partitions the neural network and prunes the layer immediately before the partition point. It then computes one part of the deep neural network on the task-initiating device, transmits a small amount of intermediate results to other edge devices, and computes the remaining part there. In addition, the intermediate results of the neural network are cached and reused on the edge devices, and the cache can be shared between different devices. This reduces the latency of edge intelligent applications and lowers the performance the neural network demands of edge devices. In particular, when intelligent tasks with similar data are requested on the edge side, the amount of repeated computation is reduced, the performance requirements of deep learning on devices are lowered, and computing resources in edge scenarios are fully utilized.
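The split-and-cache pipeline in the abstract can be sketched as below. This is an illustrative toy, not the patented design: the `head`/`tail` layer functions, the hash-based cache key, and the class name `SplitInference` are all assumptions, and a real deployment would run `tail` on a remote cooperating device rather than in-process.

```python
import hashlib

def head(x):
    """Stand-in for the network portion run on the task-initiating device."""
    return [v * 2 for v in x]

def tail(mid):
    """Stand-in for the remaining portion run on a cooperating edge device."""
    return sum(mid)

class SplitInference:
    def __init__(self):
        # Cache of intermediate results, keyed by a digest of the input.
        # In the method described above this cache can be shared across devices.
        self.cache = {}

    def _key(self, x):
        return hashlib.sha256(repr(x).encode()).hexdigest()

    def infer(self, x):
        k = self._key(x)
        if k in self.cache:
            mid = self.cache[k]      # repeated/similar input: reuse, skip head
        else:
            mid = head(x)            # compute the head locally
            self.cache[k] = mid      # cache the small intermediate result
        return tail(mid)             # transmit mid and finish on the peer

engine = SplitInference()
print(engine.infer([1, 2, 3]))  # first call computes head([1,2,3]) = [2,4,6]
print(engine.infer([1, 2, 3]))  # second call reuses the cached intermediate
```

Keying the cache on an exact input digest only captures repeated inputs; handling merely *similar* inputs, as the abstract describes, would require an approximate matching scheme (e.g. locality-sensitive hashing) in place of SHA-256.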

Description

Technical Field

[0001] The invention relates to the fields of artificial intelligence and edge computing in computer science, and in particular to a deep learning inference method that combines decentralized device collaborative computing with caching.

Background Technique

[0002] As an emerging computing paradigm, edge computing aims to use the computing and communication resources of edge devices to meet users' needs for real-time service response, privacy and security, and computing autonomy. Driven by the rapid development of algorithms, computing power, and big data, deep learning, the most active field in artificial intelligence, has made significant progress in many areas. With the development of the Internet of Things and cyber-physical systems (CPS), new applications such as autonomous driving, intelligent drone formation, and intelligent robot clusters have driven the integration of edge computing and artificial intelligence, and promoted the emergence an...

Claims


Application Information

Patent Timeline
No application data
Patent Type & Authority: Patents (China)
IPC (8): G06F9/50, G06N3/04, G06N3/063, G06N3/08
CPC: G06F9/5072, G06N3/063, G06N3/08, G06N3/045
Inventor: 白跃彬, 胡传文, 王锐, 刘畅, 汪啸林, 江文灏, 程琨
Owner BEIHANG UNIV