
Distributed system for large-scale deep learning inference

A distributed system and deep learning technology, applied in the field of inference methods, addressing the problems of poor system fault tolerance, a high degree of intrusion into the host computer's business logic, and long processing times.

Pending Publication Date: 2022-03-08
长园视觉科技(珠海)有限公司

AI Technical Summary

Problems solved by technology

[0005] 1. A monolithic application architecture is generally adopted, so the system has poor fault tolerance and cannot dynamically and quickly adjust its computing power.
[0006] 2. Deep learning inference is a time-consuming process, so the system is under high concurrency most of the time, which can easily degrade system stability or even make the system unavailable.
[0007] 3. Deep learning has a wide range of application scenarios: it must connect to a variety of different devices and also serve a variety of different steps in the overall process, such as positioning, image optimization and various detections. Most existing technologies integrate through software toolkits (SDK packages), which results in tight coupling and a high degree of intrusion into the host computer's business, so that any error enlarges the blast radius of the failure.




Detailed Description of the Embodiments

[0052] As shown in Figure 1 and Figure 2, in this embodiment the present invention includes a service layer, a workflow layer and an infrastructure layer;

[0053] Service layer: includes an inference logic business block, a model training business block and a public business block. The inference logic business block undertakes the business logic of system inference and judgment; the model training business block undertakes the business logic of system training; the public business block undertakes the logic common to the inference logic business block and the model training business block;
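As a rough illustration of this service-layer split, the sketch below separates the three blocks into Python classes, with the public block holding the logic shared by inference and training. All class and method names (PublicBusinessBlock, InferenceLogicBlock, ModelTrainingBlock, load_model, etc.) are illustrative assumptions and do not come from the patent.

# Minimal sketch of the service-layer split in [0053]; names are assumptions.

class PublicBusinessBlock:
    """Logic common to the inference and training blocks (e.g. model loading)."""
    def load_model(self, model_path: str) -> dict:
        # Stand-in for shared model loading / configuration logic.
        return {"path": model_path}

class InferenceLogicBlock:
    """Carries the business logic of system inference and judgment."""
    def __init__(self, common: PublicBusinessBlock):
        self.common = common
    def infer(self, model_path: str, sample) -> dict:
        model = self.common.load_model(model_path)
        # Real inference would run the model on the sample; stubbed here.
        return {"model": model["path"], "sample": sample, "judgment": "pass"}

class ModelTrainingBlock:
    """Carries the business logic of system training."""
    def __init__(self, common: PublicBusinessBlock):
        self.common = common
    def train(self, model_path: str, dataset) -> dict:
        model = self.common.load_model(model_path)
        # Real training would update model weights from the dataset; stubbed here.
        return {"model": model["path"], "trained_on": len(dataset)}

if __name__ == "__main__":
    common = PublicBusinessBlock()
    print(InferenceLogicBlock(common).infer("detector.onnx", sample="img_001"))
    print(ModelTrainingBlock(common).train("detector.onnx", dataset=[1, 2, 3]))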

[0054] Workflow layer: includes several access points and several detection pipelines. The access points provide the access ports for the protocols required by the equipment, and the access ports are network ports. The detection pipelines include several back-end inference servers; each back-end inference server is a computing device and is assigned one or more detection threads, and the back-end inference server executes the incoming dat...
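A hedged sketch of this dispatch path follows, under the assumption that an access point and the detection threads communicate through a shared task queue; the names AccessPoint and run_detection, and the sentinel-based shutdown, are illustrative rather than taken from the patent.

# Sketch of the workflow layer in [0054]: an access point pushes detection
# tasks onto a queue, and each back-end inference server runs one or more
# detection threads that pull from it.
import queue
import threading

task_queue: "queue.Queue[dict]" = queue.Queue()

def run_detection(task: dict) -> dict:
    # Stand-in for the actual deep learning inference on a computing device.
    return {"task_id": task["task_id"], "result": "pass"}

def detection_thread(server_name: str) -> None:
    # Each back-end inference server is assigned one or more of these threads.
    while True:
        task = task_queue.get()
        if task is None:          # Sentinel used here to stop the thread.
            task_queue.task_done()
            break
        result = run_detection(task)
        print(f"{server_name}: {result}")
        task_queue.task_done()

class AccessPoint:
    """Network port that receives detection requests from a device."""
    def submit(self, task_id: int, payload: bytes) -> None:
        task_queue.put({"task_id": task_id, "payload": payload})

if __name__ == "__main__":
    # Two back-end inference servers, each with two detection threads.
    threads = [
        threading.Thread(target=detection_thread, args=(f"server-{s}",))
        for s in range(2) for _ in range(2)
    ]
    for t in threads:
        t.start()

    access_point = AccessPoint()
    for i in range(8):
        access_point.submit(i, payload=b"frame")

    task_queue.join()
    for _ in threads:
        task_queue.put(None)      # Stop all detection threads.
    for t in threads:
        t.join()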



Abstract

In the distributed system for large-scale deep learning inference, tasks are decoupled step by step through queues, which improves the stability of the back-end detection threads and addresses the system's high-concurrency problem. The system comprises a service layer, a workflow layer and an infrastructure layer. The service layer comprises an inference logic business block, a model training business block and a public business block: the inference logic business block carries the business logic for system inference and judgment, the model training business block carries the business logic for system training, and the public business block carries the logic common to the other two blocks. The workflow layer comprises a plurality of access points and a plurality of detection pipelines; the access points provide the access ports for the protocols required by the equipment, and the detection pipelines execute detection and return results. The infrastructure layer provides the basic services required by the service layer. The system is applied to the technical field of computing platform systems.
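The abstract's step-by-step decoupling via queues can be pictured as a chain of stages, each reading from its own queue, so that a slow stage only backs up its own queue rather than the callers upstream. The sketch below is a minimal illustration under that reading; the stage names (positioning, image optimization, detection) come from paragraph [0007], while the stage functions themselves are placeholders.

# Minimal sketch of queue-based, step-by-step decoupling of pipeline stages.
import queue
import threading

def stage(name, in_q, out_q, work):
    # Each stage pulls from its own input queue and pushes to the next one.
    while True:
        item = in_q.get()
        if item is None:
            in_q.task_done()
            if out_q is not None:
                out_q.put(None)   # Propagate shutdown downstream.
            break
        result = work(item)
        if out_q is not None:
            out_q.put(result)
        in_q.task_done()

q_position = queue.Queue()
q_optimize = queue.Queue()
q_detect = queue.Queue()

stages = [
    threading.Thread(target=stage, args=("positioning", q_position, q_optimize,
                                         lambda x: {**x, "pos": (0, 0)})),
    threading.Thread(target=stage, args=("optimization", q_optimize, q_detect,
                                         lambda x: {**x, "optimized": True})),
    threading.Thread(target=stage, args=("detection", q_detect, None,
                                         lambda x: print({**x, "detected": "ok"}))),
]
for t in stages:
    t.start()

for i in range(3):
    q_position.put({"frame": i})
q_position.put(None)              # Shut the pipeline down after the work.
for t in stages:
    t.join()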

Description

Technical Field

[0001] The invention is applied to the technical field of computing platform systems, and particularly relates to a distributed system for large-scale deep learning inference.

Background

[0002] In industrial production and inspection, it is necessary to determine the position of the product. Commonly used detection methods include in-place detection with photoelectric sensors and detection with a machine vision system. A machine vision system uses machine vision products to capture the target and converts information such as pixel distribution, brightness and color into a digital signal that is sent to a dedicated image processing system. The image processing system performs various operations on these signals to extract the characteristics of the target and then controls the on-site equipment according to the results of the discrimination.

[0003] However, the calculation model and processing accuracy of the in-place dete...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N5/04
CPCG06N5/041G06N5/043
Inventor 王孝栋曾成张晶
Owner 长园视觉科技(珠海)有限公司