
Shared computing resource preemption method and device, user equipment and storage medium

A technology relating to computing resources and user equipment, applied in the fields of computing, multi-program devices, and program control design. It addresses problems such as the exhaustion of computing resources, tasks that cannot be executed, and failure to obtain computing resources.

Pending Publication Date: 2021-02-09
HYGON INFORMATION TECH CO LTD
Cites: 4 | Cited by: 2

AI Technical Summary

Problems solved by technology

Each core / thread executes the highest-priority computing task in its own thread pool. The high-priority tasks in the current thread pool can exhaust the computing resources (SIMD / DIMD), so that tasks in other cores' / threads' thread pools whose priority is higher than the current one cannot be executed because no computing resources are available.


Examples


Embodiment Construction

[0040] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0041] As shown in Figure 1, the present invention provides a method for preempting shared computing resources of a GPU multi-core processor, including:

[0042] Step 11: the management scheduling thread receives, from at least one business core or thread, the priorities of the threads in the thread pool corresponding to that business core or thread...
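
As a rough illustration of step 11 only, the sketch below shows one way a management scheduling thread could collect the per-thread priorities reported by business cores. The queue-based reporting, the function and variable names, and the (core_id, thread_id, priority) report format are assumptions made for illustration, not details from the patent.

```python
import queue
from collections import defaultdict

# Hypothetical report format: (core_id, thread_id, priority) tuples that each
# business core / thread sends to the management scheduling thread.
priority_reports: "queue.Queue[tuple]" = queue.Queue()

def collect_priorities() -> dict:
    """Step 11 (illustrative): drain all pending reports and build a map of
    core_id -> {thread_id: priority} for the threads in each core's pool."""
    pools = defaultdict(dict)
    while True:
        try:
            core_id, thread_id, priority = priority_reports.get_nowait()
        except queue.Empty:
            break
        pools[core_id][thread_id] = priority
    return dict(pools)

# Example: two business cores report the priorities of their pool threads.
priority_reports.put((0, 101, 3))
priority_reports.put((0, 102, 1))
priority_reports.put((1, 201, 2))
print(collect_priorities())   # {0: {101: 3, 102: 1}, 1: {201: 2}}
```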


Abstract

The invention provides a GPU multi-core processor shared computing resource preemption method and device, user equipment and a storage medium. The method comprises the following steps: a management scheduling thread receives, from at least one service core or thread, the priorities of the threads in the thread pool corresponding to that service core or thread; the management scheduling thread judges whether the priorities of all the threads are the same; if so, the preemption processing flow exits, otherwise step 3 is executed; the management scheduling thread acquires the lowest priority among the priorities of all currently executing threads; the management scheduling thread judges whether any thread with a priority higher than the lowest priority has failed to acquire computing resources; if not, the preemption processing flow exits; if so, step 5 is entered; in order of priority from low to high, the task of at least one service core or thread corresponding to the lowest priority that is being executed is selected, and a suspend operation is initiated on the selected task.
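
The following is a minimal, self-contained sketch of the decision flow the abstract describes. It assumes that a larger number means a higher priority and that each thread records whether it obtained computing resources; the class, the suspend hook, and all names are illustrative assumptions rather than the patent's implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ExecThread:
    """A thread in some core's pool; its priority is that of its computing task."""
    thread_id: int
    priority: int          # assumed convention: larger value = higher priority
    has_resources: bool    # whether it obtained SIMD computing resources

def suspend(t: ExecThread) -> None:
    """Placeholder for initiating the suspend operation on a selected task."""
    print(f"suspending thread {t.thread_id} (priority {t.priority})")
    t.has_resources = False

def preemption_pass(threads: List[ExecThread]) -> None:
    """One pass of the management scheduling thread, following the abstract's steps."""
    if not threads:
        return
    # If the priorities of all threads are the same, exit the preemption flow.
    if len({t.priority for t in threads}) == 1:
        return
    # Lowest priority among the currently executing (resource-holding) threads.
    running = [t for t in threads if t.has_resources]
    if not running:
        return
    lowest = min(t.priority for t in running)
    # Is there a higher-priority thread that failed to obtain computing resources?
    starved = any(t.priority > lowest and not t.has_resources for t in threads)
    if not starved:
        return
    # From low to high priority, select the executing task(s) at the lowest
    # priority and initiate a suspend operation on each selected task.
    for victim in sorted(running, key=lambda t: t.priority):
        if victim.priority == lowest:
            suspend(victim)

# Example: a low-priority task holds resources while a higher-priority task waits.
preemption_pass([ExecThread(1, 1, True), ExecThread(2, 3, False)])
```

In this example the priority-1 task is suspended, freeing computing resources for the waiting priority-3 task.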

Description

Technical Field
[0001] The present invention relates to the technical field of graphics processors, and in particular to a method and device for preempting shared computing resources of GPU multi-core processors, user equipment and a storage medium.
Background
[0002] The GPU adopts a streaming parallel computing mode. To increase the parallel processing efficiency of data and make full use of the underlying computing resources (SIMD / DIMD), a multi-core / multi-thread processor is used to establish data-flow processing computing channels: the processor analyzes the commands of computing tasks, configures the relevant information, and lets the computing resources (SIMD / DIMD) compute and process the data.
[0003] Each core / thread corresponds to a thread pool (such as the dotted line in Figure 2). There are multiple threads in the thread pool (each thread corresponds one-to-one to the priority of the computing task being executed), and each core / thread will select ...
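
To make the thread-pool model in the background concrete, here is a small sketch under assumed names: one pool per core / thread, each pool thread carrying the priority of the task it executes, and each core picking the highest-priority task in its own pool. It only illustrates the model described above and is not code from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PoolThread:
    """One thread of a core's pool; it corresponds one-to-one to the priority
    of the computing task it is executing."""
    thread_id: int
    priority: int                 # assumed: larger value = higher priority

@dataclass
class CorePool:
    """The thread pool owned by one core / thread (the dotted box in Figure 2)."""
    core_id: int
    threads: List[PoolThread] = field(default_factory=list)

    def pick_next(self) -> Optional[PoolThread]:
        """Each core selects the highest-priority computing task in its own pool."""
        return max(self.threads, key=lambda t: t.priority, default=None)

# Example: core 0 runs its priority-5 task; if that exhausts the shared SIMD
# resources, core 1's priority-4 task starves, which is the problem the
# preemption method addresses.
core0 = CorePool(0, [PoolThread(1, 5), PoolThread(2, 2)])
core1 = CorePool(1, [PoolThread(3, 4)])
print(core0.pick_next(), core1.pick_next())
```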


Application Information

IPC(8): G06F9/48; G06T1/20
CPC: G06F9/4881; G06F9/485; G06T1/20
Inventor: 陈东海
Owner: HYGON INFORMATION TECH CO LTD