Mapping method for deep learning model configuration file to FPGA configuration file

A configuration-file and deep-learning technology, applied in software maintenance/management, version control, and source-code creation/generation, which addresses problems such as a limited scope of promotion and application, increased errors, and general inconvenience, with the effects of improving R&D efficiency, reducing the possibility of error, and shortening setup time.

Publication Date: 2018-11-30 (Inactive)
ZHENGZHOU YUNHAI INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

However, each deep learning framework has its own independent model file formats. For example, Caffe uses prototxt files to save network models and caffemodel files to save model parameters; TensorFlow uses ckpt files to save both network models and model parameters; MXNet uses json files to save network models and ndarray files to save model parameters; and PyTorch uses pkl files to save both network models and model parameters. This variety of model file formats causes considerable inconvenience during network migration.
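As an illustration of this fragmentation, a minimal Python sketch is given below; the framework names and file extensions come from the paragraph above, while the lookup function itself is a hypothetical placeholder rather than anything defined by the patent.

```python
# Hypothetical sketch: map each framework to the files it uses for the
# network definition and the trained parameters (per the paragraph above).
MODEL_FILE_FORMATS = {
    "caffe":      {"network": ".prototxt", "parameters": ".caffemodel"},
    "tensorflow": {"network": ".ckpt",     "parameters": ".ckpt"},
    "mxnet":      {"network": ".json",     "parameters": ".ndarray"},
    "pytorch":    {"network": ".pkl",      "parameters": ".pkl"},
}

def expected_files(framework: str) -> dict:
    """Return the file extensions a given framework uses for its model files."""
    try:
        return MODEL_FILE_FORMATS[framework.lower()]
    except KeyError:
        raise ValueError(f"Unsupported deep learning framework: {framework}")

# Example: expected_files("mxnet") -> {"network": ".json", "parameters": ".ndarray"}
```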
[0003] Compared with the CPU, the GPU can speed up the compute-intensive programs required for deep learning. However, characteristics of the GPU such as high energy consumption and small caches limit its energy-efficiency gains and its range of application. FPGA-based heterogeneous acceleration has therefore become one of the preferred solutions for Internet companies seeking to improve data-center server performance while reducing power consumption.
However, in the development of FPGA-based heterogeneous acceleration for deep learning, the network configuration files differ with the type of deep learning model, so developers have to manually fill the network configuration parameters of each model into program header files. For deeper models this takes even more time to complete and consumes a great deal of research and development effort. In addition, because this process relies on manual operation, the possibility of error is greatly increased, which further reduces R&D efficiency.
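For context, the manual step described above amounts to transcribing per-layer parameters into a program header file. The sketch below shows what an automated version of that transcription could look like; the layer fields and the macro layout are illustrative assumptions, not the patent's actual header format.

```python
# Hypothetical sketch: emit network configuration parameters as C macros in a
# header file, replacing the manual transcription step described above.
def write_fpga_header(layers, path="network_config.h"):
    """layers: list of dicts such as {"name": "conv1", "kernel": 3, "stride": 1, "channels": 64}."""
    lines = ["#ifndef NETWORK_CONFIG_H", "#define NETWORK_CONFIG_H", ""]
    for layer in layers:
        prefix = layer["name"].upper()
        for key, value in layer.items():
            if key == "name":
                continue
            lines.append(f"#define {prefix}_{key.upper()} {value}")
        lines.append("")
    lines.append("#endif  /* NETWORK_CONFIG_H */")
    with open(path, "w") as f:
        f.write("\n".join(lines))

# Example usage:
# write_fpga_header([{"name": "conv1", "kernel": 3, "stride": 1, "channels": 64}])
```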




Embodiment Construction

[0039] The core of this application is to provide a method for mapping a deep learning model configuration file to an FPGA configuration file. This mapping method can effectively reduce the time needed to set up a deep learning model configuration file on the FPGA and saves manpower, thereby reducing the possibility of human error and further improving research and development efficiency. Another core of the present application is to provide a device, equipment, and a computer-readable storage medium for mapping a deep learning model configuration file to an FPGA configuration file, all of which have the above-mentioned beneficial effects.

[0040] In order to make the purpose, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the descr...



Abstract

The invention discloses a method for mapping a deep learning model configuration file to an FPGA configuration file. The method comprises: calling, according to the type of the deep learning model, the corresponding Python software development kit; using the Python software development kit to read the network configuration file corresponding to the deep learning model and storing it in a middleware file; and calling the library file corresponding to the FPGA to convert the middleware file into the FPGA configuration file. The mapping method can effectively reduce the time required to set up the deep learning model configuration file on the FPGA and saves manpower, thereby reducing the possibility of errors caused by manual operation and further improving research and development efficiency. The invention further discloses a device, equipment, and a computer-readable storage medium for mapping a deep learning model configuration file to an FPGA configuration file, all of which achieve the above effects.
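To make the three steps of the abstract concrete, the following Python sketch outlines the flow under stated assumptions: the JSON middleware format, the function names, and the two stub helpers are illustrative placeholders, since the patent does not specify the actual SDK calls or FPGA library interface.

```python
import json

# Hypothetical sketch of the abstract's three-step flow. The middleware format
# (JSON) and the two helper functions below are illustrative assumptions only.

def read_network_config(model_type, network_config_path):
    """Stub standing in for 'use the framework's Python SDK to read the
    network configuration file corresponding to the model' (Step 2)."""
    raise NotImplementedError("Replace with framework-specific parsing logic.")

def convert_middleware_to_fpga(middleware_path, fpga_config_path):
    """Stub standing in for 'call the library file corresponding to the FPGA
    to convert the middleware file into the FPGA configuration file' (Step 3)."""
    raise NotImplementedError("Replace with the FPGA library call.")

def map_model_to_fpga_config(model_type, network_config_path,
                             middleware_path="middleware.json",
                             fpga_config_path="fpga_config.bin"):
    # Step 1: select handling according to the deep learning model type;
    # in practice this would call the corresponding Python SDK.
    network = read_network_config(model_type, network_config_path)

    # Step 2: store the parsed network description in a middleware file.
    with open(middleware_path, "w") as f:
        json.dump(network, f)

    # Step 3: convert the middleware file into the FPGA configuration file.
    convert_middleware_to_fpga(middleware_path, fpga_config_path)
    return fpga_config_path
```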

Description

technical field

[0001] This application relates to the field of heterogeneous acceleration for deep learning models, and in particular to a method for mapping a deep learning model configuration file to an FPGA configuration file; it also relates to a device, equipment, and a computer-readable storage medium for mapping a deep learning model configuration file to an FPGA configuration file.

Background technique

[0002] Deep learning is one of the fastest-growing fields in artificial intelligence, helping computers make sense of vast amounts of data in the form of images, sounds, and text. In recent years, as deep learning open-source tools such as Caffe, TensorFlow, MXNet, and PyTorch have matured, deep learning technology has developed rapidly. At present, deep learning is widely used in fields such as face recognition, speech recognition, precision medicine, and driverless vehicles. However, each deep learning framework has its own independent model files. For example...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F 8/71; G06F 8/30
CPC: G06F 8/30; G06F 8/71
Inventor: 董学辉
Owner: ZHENGZHOU YUNHAI INFORMATION TECH CO LTD