
Method, data processing method, device and storage medium for constructing model inference network

A model-construction and data-processing technology in the computer field, intended to solve problems such as accelerated models that cannot be used, large calculation errors, and the inability to optimize a model as a whole

Active Publication Date: 2022-07-12
BEIJING QIYI CENTURY SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0002] Deep learning network structures are becoming increasingly complex, and a network often contains thousands of computing layers. Because front-end AI development frameworks are fragmented and computing layers are diverse, mainstream inference acceleration frameworks have limitations on both the network-optimization-compiler side and the model-accelerator side. For example, the network optimization compiler often encounters an unsupported AI computing layer that causes compilation to fail, which in turn makes it impossible to optimize the model as a whole. In addition, even when compilation succeeds, the model accelerator sometimes cannot use the accelerated model because optimization of certain computing layers introduces excessive calculation error.
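As a rough illustration of this all-or-nothing failure mode, the sketch below shows why a whole-model compiler gives up on the first unsupported layer. It is a minimal sketch assuming hypothetical names (`compile_layer`, `CompileError`); neither comes from the patent or from any specific framework.

```python
# Hypothetical sketch of the limitation described above: whole-model
# compilation is all-or-nothing, so a single unsupported layer
# prevents every layer from being optimized.

class CompileError(Exception):
    """Raised when the optimizing compiler cannot handle a layer."""

def compile_whole_model(layers, compile_layer):
    """Compile the entire model in one pass; any unsupported layer
    raises CompileError and nothing gets accelerated."""
    compiled = []
    for layer in layers:
        compiled.append(compile_layer(layer))  # raises on unsupported ops
    return compiled  # never reached if even one layer is unsupported
```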




Embodiment Construction

[0046] To make the purposes, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the drawings. The described embodiments are obviously only some, not all, of the embodiments of this application. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the protection scope of the present application.

[0047] Figure 1 is an application environment diagram of a method for constructing a model inference network, or of a data processing method, in one embodiment. Referring to Figure 1, the method for constructing a model inference network, or the data processing method, is applied to a data processing system. The data processing system inclu...



Abstract

The present application relates to a method for constructing a model inference network, a data processing method, an apparatus, computer equipment, and a storage medium. The method includes: acquiring a deep learning network that includes a plurality of network layers; acquiring test data; compiling and testing each network layer against the test data to obtain compilation results and detection results; determining a resource allocation strategy for each network layer according to its compilation result and/or detection result; building a model inference network according to the resource allocation strategies of the network layers; and acquiring data to be processed. The data to be processed is input into the model inference network, where network layers whose resource allocation strategy is the optimized configuration strategy process the data with the optimized implementation, and network layers whose resource allocation strategy is the original configuration strategy process the data with the original implementation, yielding the processing result. This improves the data processing efficiency of the entire network.
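The following is a minimal Python sketch of the layer-by-layer strategy selection the abstract describes. It assumes hypothetical names throughout: the `Layer`/`LayerPlan` structures, `compile_layer`, `max_abs_error`, and the `ERROR_THRESHOLD` value are illustrative stand-ins, not APIs or parameters given by the patent.

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence

ERROR_THRESHOLD = 1e-3  # assumed tolerance; the patent gives no concrete value

class CompileError(Exception):
    """Raised when the optimizing compiler cannot handle a layer."""

@dataclass
class Layer:
    name: str
    run: Callable[[Sequence[float]], Sequence[float]]  # original implementation

@dataclass
class LayerPlan:
    name: str
    strategy: str  # "optimized" (accelerated) or "original" (fallback)
    run: Callable[[Sequence[float]], Sequence[float]]

def max_abs_error(a: Sequence[float], b: Sequence[float]) -> float:
    """Largest element-wise deviation between two outputs."""
    return max(abs(x - y) for x, y in zip(a, b))

def build_inference_network(layers: List[Layer], compile_layer,
                            test_data: Sequence[float]) -> List[LayerPlan]:
    """Compile and test each network layer independently. A layer keeps
    its optimized implementation only if it compiles AND stays within
    the error tolerance on the test data; otherwise it falls back to
    the original implementation, so one bad layer never blocks the model."""
    plans: List[LayerPlan] = []
    data = test_data
    for layer in layers:
        reference = layer.run(data)  # original implementation's output
        try:
            optimized = compile_layer(layer)  # may raise CompileError
            if max_abs_error(optimized(data), reference) <= ERROR_THRESHOLD:
                plans.append(LayerPlan(layer.name, "optimized", optimized))
            else:
                plans.append(LayerPlan(layer.name, "original", layer.run))
        except CompileError:
            plans.append(LayerPlan(layer.name, "original", layer.run))
        data = reference  # feed each layer the previous layer's output
    return plans

def process(plans: List[LayerPlan], data: Sequence[float]) -> Sequence[float]:
    """Route the data to be processed through the mixed network:
    optimized layers where allowed, original layers everywhere else."""
    for plan in plans:
        data = plan.run(data)
    return data
```

In this sketch each layer is validated against the previous layer's reference output, so the accuracy check reflects the data the layer would actually see in the network; whether the patented method sequences its detection this way is an assumption of the sketch, not something the abstract states.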

Description

Technical field

[0001] The present application relates to the field of computer technology, and in particular to a method for constructing a model inference network, a data processing method, an apparatus, a device, and a storage medium.

Background technique

[0002] At present, deep learning network structures are becoming increasingly complex, and a network often contains thousands of computing layers. However, because front-end AI development frameworks are fragmented and computing layers are diverse, mainstream inference acceleration frameworks have limitations on both the network-optimization-compiler side and the model-accelerator side. For example, network optimization compilers often encounter unsupported AI computing layers that cause compilation failures, so the overall model cannot be optimized. In addition, even if compilation succeeds, sometimes the model accelerator will not be able to use ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L41/0803; H04L41/14; H04L47/76
CPC: H04L41/0803; H04L41/145; H04L47/76
Inventors: 陈可, 董峰
Owner: BEIJING QIYI CENTURY SCI & TECH CO LTD