Compiling method for optimizing neural network model, execution method, and related product

A neural network model compiling technology, applied in the field of compiling methods, devices, and computer program products for optimizing neural network models. It can solve problems such as a large volume of data IO and insufficient memory-access bandwidth, with the effects of relieving memory-access pressure, improving processing efficiency, and simplifying the neural network model.

Pending Publication Date: 2021-10-01
SHANGHAI CAMBRICON INFORMATION TECH CO LTD
Cites: 5 · Cited by: 1

AI Technical Summary

Problems solved by technology

However, the increase in network depth has also brought problems such as a large volume of data IO and insufficient memory-access bandwidth.




Detailed Description of Embodiments

[0021] The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present disclosure. Based on the embodiments of the present disclosure, all other embodiments obtained by those skilled in the art without creative effort fall within the scope of protection of the present disclosure.

[0022] It should be understood that the terms "first", "second", "third", and "fourth" in the claims, specification, and drawings of the present disclosure are used to distinguish different objects rather than to describe a specific order. The terms "comprising" and "comprises" used in the specification and claims of this disclosure indicate the presence of the described features, integers, steps, operations, elements, and/or components, but do not exclude one or more other...



Abstract

The invention discloses a compiling method and device for optimizing a neural network model, together with a computer program product, and further discloses a method for executing the neural network model on a heterogeneous processing system as well as the heterogeneous processing system itself. The device may be implemented as a computing device included in a combined processing device, which may also include an interface device and other processing devices. The computing device interacts with the other processing devices to jointly complete a computing operation specified by a user. The combined processing device may further comprise a storage device, which is connected to the computing device and the other processing devices and is used to store their data. The disclosed scheme provides an optimization for the reshape ("remodeling") layer in a neural network model, which can effectively reduce off-chip memory-access bandwidth, relieve memory-access pressure, and improve the processing efficiency of the machine.
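The abstract describes optimizing reshape ("remodeling") layers so that intermediate tensors need not round-trip through off-chip memory, but this extract does not give the algorithm. The following is therefore only a hypothetical sketch of one such compiler pass on a toy linear operator chain: consecutive reshapes are merged (only the final shape matters) and identity reshapes are dropped. The names, the chain representation, and the example graph are illustrative assumptions, not taken from the patent.

```python
def simplify_chain(chain):
    """Fold reshape ops in a linear operator chain.

    chain: list of (op_name, output_shape) tuples in execution order.
    Two rewrites, each of which removes an intermediate tensor (and thus
    a potential off-chip round trip):
      1. reshape -> reshape collapses to a single reshape (final shape wins);
      2. a reshape whose output shape equals its input shape is dropped.
    """
    out = []
    for op, shape in chain:
        if op == "reshape" and out:
            if out[-1][0] == "reshape":
                out[-1] = ("reshape", shape)  # merge consecutive reshapes
                continue
            if out[-1][1] == shape:
                continue                      # drop an identity reshape
        out.append((op, shape))
    return out


# Hypothetical graph: conv -> reshape -> reshape -> fully-connected layer.
graph = [("conv", (1, 64, 28, 28)),
         ("reshape", (1, 64, 784)),
         ("reshape", (1, 50176)),
         ("fc", (1, 10))]
print(simplify_chain(graph))
# -> [('conv', (1, 64, 28, 28)), ('reshape', (1, 50176)), ('fc', (1, 10))]
```

In a real compiler such a pass would run on a graph IR with explicit producer/consumer edges; the linear chain above is simply the smallest structure on which the rewrite is visible.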

Description

Technical Field

[0001] The present disclosure relates generally to the field of data processing. More specifically, the present disclosure relates to a compiling method, device, and computer program product for optimizing a neural network model. The present disclosure also relates to a method for executing a neural network model on a heterogeneous processing system, and to the heterogeneous processing system itself.

Background

[0002] At present, deep learning has become an important branch of machine learning and is vigorously promoting the development of artificial intelligence (AI). Its core technology, the deep neural network (DNN), has been widely applied across many industries.

[0003] To improve the expressive power of neural network models, DNNs keep developing toward deeper or wider network scales. However, the increase in network depth has also brought problems such as a large volume of data IO and insufficient memory-access bandwidth.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04, G06N3/063, G06F15/78
CPC: G06N3/063, G06F15/7817, G06N3/045
Inventor: Not disclosed (不公告发明人)
Owner: SHANGHAI CAMBRICON INFORMATION TECH CO LTD