
Automatic starting-stopping and computation task dynamic allocation method for mass parallel coarse particle computation

A computation-task dynamic-allocation technology in the field of high-performance computing, addressing problems such as reduced program running speed, and achieving the effects of reducing inter-process communication, improving parallel computing efficiency, and handling computing instances of unequal complexity.

Active Publication Date: 2016-10-26
北京智芯仿真科技有限公司
Cites: 5 · Cited by: 10

AI Technical Summary

Problems solved by technology

This comparison shows that if many processes are opened for parallel computing and no precautions are taken, part of the hard-disk storage may be used as virtual memory (swap) during the computation, which can slow the program down by a factor of more than one hundred.
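The slowdown described above occurs when the combined peak memory of all processes exceeds available physical RAM and the operating system falls back on swap. A minimal sketch of the precaution this implies is capping the worker count by available physical memory; the function name and the per-process peak estimate are illustrative assumptions, not taken from the patent:

```python
import os

def max_safe_processes(per_process_peak_bytes, floor=1):
    # Cap the worker count by currently available physical memory so
    # the OS never has to back process memory with disk (swap).
    # SC_PAGE_SIZE / SC_AVPHYS_PAGES are POSIX/Linux sysconf names;
    # on platforms without them, fall back to `floor` processes.
    try:
        page = os.sysconf("SC_PAGE_SIZE")
        avail_pages = os.sysconf("SC_AVPHYS_PAGES")
    except (ValueError, OSError):
        return floor
    return max(floor, (page * avail_pages) // per_process_peak_bytes)
```

Dividing available memory by the estimated per-process peak gives an upper bound on how many workers can run without touching swap.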




Embodiment Construction

[0039] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention, without creative effort, fall within the protection scope of the present invention.

[0040] The present invention provides a method for automatic start-stop and dynamic allocation of computing tasks in massive coarse-grained parallel computation. The method includes the following steps: determining independent parallel computing areas, that is, the parallel coarse particles, according to the computing problem; implementing...
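One way to read "dynamic allocation of computing tasks" is a master/worker pool that hands out one coarse-grained instance at a time, so instances of unequal cost balance themselves. The following is a minimal standard-library sketch, not the patent's implementation; the task payloads and function names are illustrative assumptions:

```python
from multiprocessing import Pool

def run_instance(n):
    # Stand-in for one coarse-grained computing instance; a real
    # instance would be a full numerical calculation.
    return sum(i * i for i in range(n))

def dynamic_allocate(tasks, workers=4):
    # chunksize=1 hands out one instance at a time: a worker that
    # finishes a cheap instance immediately picks up the next one,
    # which balances instances of unequal complexity.
    with Pool(processes=workers) as pool:
        return pool.map(run_instance, tasks, chunksize=1)

if __name__ == "__main__":
    # Instances of very different sizes still keep all workers busy.
    results = dynamic_allocate([10, 100000, 50, 200000, 3], workers=2)
```

With `chunksize=1`, allocation is demand-driven rather than a fixed up-front split, which matches the dynamic-allocation idea at the cost of slightly more task-handout overhead.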



Abstract

The invention discloses an automatic start-stop and computation-task dynamic-allocation method for massive coarse-grained parallel computation. The method comprises the following steps: defining parallel coarse particles according to the problem's computation characteristics; having the master process dynamically allocate the computation tasks within the parallel coarse particles, together with their corresponding input parameters, to all processes (including the master process itself) according to a file-marking technique and a dynamic task-allocation strategy; dynamically allocating memory to the processes holding computation tasks, based on an automatic start-stop technique; and, after all parallel coarse particles complete their parallel computation, having the master process collect the output parameters of all processes and combine them into the final result of the complete run. The method minimizes inter-process communication; avoids the hard-disk read-write bottleneck that occurs when the peak memory of multi-process parallel computation exceeds the available physical memory; effectively handles computing instances of unequal complexity; and greatly increases parallel computing efficiency.

Description

technical field [0001] The invention relates to the technical field of high-performance computing, and in particular to a method for automatic start-stop and dynamic allocation of computing tasks in massive coarse-grained parallel computation. Background technique [0002] In fields such as the optimized design of electromagnetic functional materials, logging response and inversion, complex electromagnetic environments and multi-physics coupled calculation, marine-environment numerical simulation, and molecular dynamics with personalized drug design and screening, a large number of large-scale numerical calculations of the same type are required. Such large-scale numerical calculations have different structures for different computing instances, resulting in unequal computational complexity across instances. For such unequal massive calculations, it is necessary to design high-efficiency parallel computing methods, fully considering the co...
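The imbalance problem described above can be made concrete with a toy comparison of static allocation (a fixed up-front split) versus dynamic allocation (the next free worker takes the next instance) for instances of unequal cost. This is an illustrative sketch, not the patent's scheme:

```python
import heapq

def makespan_static(costs, workers):
    # Static: worker i is handed a contiguous block of instances up front,
    # so one expensive block determines the total running time.
    k = -(-len(costs) // workers)  # ceil(len(costs) / workers)
    return max(sum(costs[i:i + k]) for i in range(0, len(costs), k))

def makespan_dynamic(costs, workers):
    # Dynamic: whichever worker is free first takes the next instance
    # (greedy list scheduling), smoothing out unequal complexities.
    loads = [0] * workers
    heapq.heapify(loads)
    for c in costs:
        heapq.heappush(loads, heapq.heappop(loads) + c)
    return max(loads)
```

With costs `[9, 1, 1, 1, 1, 1, 1, 1]` and two workers, the static split finishes at time 12 while the dynamic schedule finishes at 9, which is why unequal instances call for dynamic allocation.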

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/48, G06F9/50
CPC: G06F9/4881, G06F9/5016
Inventor: 王芬
Owner: 北京智芯仿真科技有限公司