
Scheduling method and device, electronic equipment and readable storage medium

A scheduling method and related technology for electronic equipment, applicable to multi-program devices, neural learning methods, program control design, and similar areas. It addresses the problems that the service quality of cloud-deployed models is strongly affected by the network environment and that the privacy of data uploaded by users cannot be guaranteed.

Active Publication Date: 2020-02-04
BEIJING DIDI INFINITY TECH & DEV

AI Technical Summary

Problems solved by technology

However, the service quality of models deployed in the cloud is greatly affected by the network environment, and the privacy of data uploaded by users cannot be guaranteed.



Examples


Example 1

[0093] Figure 1 is a schematic diagram of exemplary hardware and software components of an electronic device 100 provided in an alternative embodiment of the present application. The electronic device 100 may be a general-purpose computer, a special-purpose computer, or a mobile terminal. Although, for convenience, only one electronic device 100 is shown in the present application, the functions described herein may also be implemented in a distributed manner on multiple similar platforms to execute the tasks to be processed.

[0094] For example, the electronic device 100 may include a network port 110 connected to a network, one or more processors 120 for executing program instructions, a communication bus 130, and various forms of storage media 140, such as RAM, disk, or ROM, or any combination thereof. By way of example, the computer platform may also include program instructions stored in ROM, RAM, or other types of non-transitory ...
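As a rough illustration of the composition described above, the hypothetical Python sketch below models the recited components (network port 110, processors 120, communication bus 130, storage media 140); all field names and default values are illustrative and not taken from the patent.

```python
# Hypothetical sketch only: field names and defaults are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ElectronicDevice:
    network_port: str = "eth0"        # network port 110 connected to a network
    processor_count: int = 1          # processors 120 executing program instructions
    communication_bus: str = "pcie"   # communication bus 130
    storage_media: List[str] = field(
        default_factory=lambda: ["RAM", "ROM", "disk"])  # storage media 140

if __name__ == "__main__":
    print(ElectronicDevice(processor_count=4))
```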

Example 2

[0100] Figure 2 shows a schematic flowchart of a scheduling method according to some embodiments of the present application; the scheduling method provided in the present application can be applied to the above-mentioned electronic device 100. It should be understood that, in other embodiments, the order of some steps in the scheduling method described in this embodiment may be exchanged according to actual needs, or some steps may be omitted or deleted. The detailed steps of the scheduling method are introduced as follows.

[0101] Step S210: obtain the task to be processed, and obtain the to-be-processed data collected by the data collection device.

[0102] Step S220: according to the association between different pre-stored deep learning models and different processing tasks, obtain the target learning model required by the task to be processed from the plurality of pre-stored deep learning models.

[0103] Step S230: according to the resource utilization rate of e...
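To make the S210-S230 flow concrete, here is a minimal, hypothetical Python sketch. The function and registry names (run_scheduling, MODEL_REGISTRY) and the lowest-utilization dispatch rule are assumptions for illustration; the patent itself dispatches via a pre-trained scheduling model, which this sketch does not implement.

```python
# Hypothetical sketch of steps S210-S230; names and the dispatch rule are illustrative.
from typing import Any, Callable, Dict

# Assumed association between processing tasks and pre-stored deep learning models (Step S220).
MODEL_REGISTRY: Dict[str, Callable[[Any], Any]] = {
    "face_detection": lambda data: {"faces": []},        # placeholder model
    "behavior_recognition": lambda data: {"label": ""},  # placeholder model
}

def run_scheduling(task_type: str, data: Any,
                   unit_utilization: Dict[str, float]) -> Any:
    # Step S210: the task to be processed and the collected data are passed in as arguments here.
    # Step S220: look up the target learning model by its association with the task.
    target_model = MODEL_REGISTRY[task_type]
    # Step S230 (simplified): choose the least-utilized computing unit; the patent
    # instead uses a pre-trained scheduling model to make this decision.
    target_unit = min(unit_utilization, key=unit_utilization.get)
    print(f"dispatching {task_type} to computing unit {target_unit}")
    # Process the to-be-processed data with the target model to obtain the output result.
    return target_model(data)

if __name__ == "__main__":
    print(run_scheduling("face_detection", data=b"raw-frame",
                         unit_utilization={"cpu": 0.7, "gpu": 0.2, "npu": 0.5}))
```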

Example 3

[0142] Figure 8 shows a functional module block diagram of the scheduling device 800 in some embodiments of the present application; the functions implemented by the scheduling device 800 correspond to the steps performed by the above method. The device can be understood as the above-mentioned electronic device 100, as the processor 120 of the electronic device 100, or as a component, independent of the electronic device 100 or the processor 120, that realizes the functions of the present application under the control of the electronic device 100. As shown in Figure 8, the scheduling device 800 may include an acquisition module 810, a target learning model acquisition module 820, and a scheduling module 830.

[0143] The acquisition module 810 is configured to obtain the task to be processed and to obtain the to-be-processed data collected by the data collection device. It can be understood that the acquisition module 810 can be used ...
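The class skeleton below is a hypothetical sketch of how the three modules of the scheduling device 800 might be decomposed in code; class and method names are illustrative, and the scheduling policy shown is a simplified stand-in for the pre-trained scheduling model.

```python
# Hypothetical decomposition of scheduling device 800 into its three modules.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, Tuple

@dataclass
class AcquisitionModule:
    """Corresponds to module 810: obtains the task and the collected data."""
    data_source: Callable[[], Any]

    def acquire(self, task_type: str) -> Tuple[str, Any]:
        return task_type, self.data_source()

@dataclass
class TargetModelAcquisitionModule:
    """Corresponds to module 820: maps a processing task to its pre-stored model."""
    registry: Dict[str, Callable[[Any], Any]] = field(default_factory=dict)

    def get_model(self, task_type: str) -> Callable[[Any], Any]:
        return self.registry[task_type]

@dataclass
class SchedulingModule:
    """Corresponds to module 830: assigns the target model to a computing unit."""

    def schedule(self, model: Callable[[Any], Any], data: Any,
                 unit_utilization: Dict[str, float]) -> Any:
        unit = min(unit_utilization, key=unit_utilization.get)  # simplified policy
        print(f"running on computing unit {unit}")
        return model(data)
```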



Abstract

The invention provides a scheduling method and device, electronic equipment, and a readable storage medium. The method comprises the steps of: obtaining a to-be-processed task and to-be-processed data; obtaining the target learning model needed by the to-be-processed task from a plurality of pre-stored deep learning models according to the correlation between different pre-stored deep learning models and different processing tasks; and scheduling the target learning model to the corresponding computing unit, based on a pre-trained scheduling model and according to the resource utilization rate of each computing unit, so as to process the to-be-processed data and obtain an output result. In this way, when the processing task is executed on the end-side device, the scheduling model is established in advance and an optimized scheduling strategy is obtained based on the real-time resource utilization rate, so that the overall execution efficiency of the device is improved.
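The abstract does not disclose the architecture of the pre-trained scheduling model. As one possible reading, the sketch below assumes a tiny linear scoring policy that maps per-unit resource utilization to a target computing unit; the class name, unit names, and weights are purely illustrative.

```python
# Hypothetical stand-in for a pre-trained scheduling model: a linear scoring policy.
from typing import Dict, List

class SchedulingPolicy:
    def __init__(self, units: List[str], weights: Dict[str, float]):
        self.units = units        # available computing units on the end-side device
        self.weights = weights    # assumed "pre-trained" per-unit preference weights

    def choose_unit(self, utilization: Dict[str, float]) -> str:
        # Higher learned preference and lower current utilization both raise the score.
        scores = {u: self.weights[u] * (1.0 - utilization[u]) for u in self.units}
        return max(scores, key=scores.get)

policy = SchedulingPolicy(units=["cpu", "gpu", "npu"],
                          weights={"cpu": 0.5, "gpu": 1.0, "npu": 0.8})
print(policy.choose_unit({"cpu": 0.3, "gpu": 0.9, "npu": 0.4}))  # prints "npu"
```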

Description

Technical Field

[0001] The present application relates to the technical field of data computing and processing, and in particular to a scheduling method, device, electronic equipment, and readable storage medium for data processing tasks based on a deep learning model.

Background Technique

[0002] With the gradual maturity of deep learning technology, deep learning models are used in more and more application scenarios, such as face detection, behavior recognition, video content analysis, etc. The traditional method is generally to deploy the deep learning model on a cloud server with powerful performance and abundant computing resources, so that it can provide users with various services through the network. However, the service quality of models deployed on the cloud is greatly affected by the network environment, and the privacy of data uploaded by users cannot be guaranteed. Therefore, in order to perform real-time calculations locally on the device and avoid uploadin...


Application Information

IPC(8): G06F9/48, G06N3/04, G06N3/08, G06N3/063
CPC: G06F9/4881, G06N3/08, G06N3/063, G06N3/045
Inventor: 唐剑, 徐志远, 刘宁, 林航东, 张法朝
Owner: BEIJING DIDI INFINITY TECH & DEV