
Forward reasoning method and device for neural network, equipment and storage medium

A neural-network forward-inference technology in the field of parallel computing, addressing problems of existing schemes such as long inference time, low efficiency, and low utilization of hardware resources.

Active Publication Date: 2019-06-21
IFLYTEK CO LTD

AI Technical Summary

Problems solved by technology

[0005] In view of this, the present application provides a neural network forward inference method, device, equipment, and readable storage medium to solve the problems of long runtime, low efficiency, and low utilization of hardware resources in existing inference schemes. The technical scheme is as follows:




Embodiment Construction

[0084] The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative effort fall within the scope of protection of this application.

[0085] In the course of making the invention, the inventors found that the existing inference scheme creates a single inference instance and a single inference engine for the entire neural network; that is, it performs inference for the whole network through one instance and one engine. Specifically:

[0086] First, a computation graph is constructed according to the parameters of the neural network to be inf...
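A minimal sketch of the conventional scheme described above may help: one engine runs every layer of the network in strict sequence, so a new input cannot start until the previous input has passed through all layers. The layer functions and names below are illustrative placeholders, not the patent's implementation.

```python
def serial_inference(layers, inputs):
    """Single-instance, single-engine inference: each input is pushed
    through all layers before the next input begins (strictly serial)."""
    outputs = []
    for x in inputs:
        y = x
        for layer in layers:   # every layer runs on this one engine
            y = layer(y)
        outputs.append(y)
    return outputs

# Toy three-layer "network" standing in for real layer operations.
layers = [lambda v: v + 1, lambda v: v * 2, lambda v: v - 3]
print(serial_inference(layers, [1, 2, 3]))  # -> [1, 3, 5]
```

Because the loop over inputs is outer and the loop over layers is inner, hardware assigned to early layers sits idle while late layers run, which is the underutilization the application identifies.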



Abstract

The invention provides a forward inference method and device for a neural network, equipment, and a storage medium. The method comprises: dividing the target neural network into a plurality of sub-networks, each sub-network comprising at least one hidden layer of the target neural network; creating, on the hardware of the inference platform, inference instances and inference engines corresponding respectively to the plurality of sub-networks; and performing forward inference on the target neural network based on those inference instances and inference engines. Because each inference engine is responsible for only a part of the hidden layers, multiple data inputs can execute in different inference engines in parallel at the same time. The forward inference method provided by the invention therefore achieves relatively high inference efficiency and data throughput, and makes full use of the hardware resources of the inference platform.
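The pipelined scheme in the abstract can be illustrated with a short sketch: the network is split into sub-networks, each with its own "engine" running in its own thread, so different inputs occupy different stages at the same time. This is an assumption-laden toy (thread-per-stage, queue handoff, toy layer functions), not the patent's actual implementation.

```python
import queue
import threading

def make_stage(sublayers, q_in, q_out):
    """One 'inference engine' owning one sub-network's hidden layers."""
    def run():
        while True:
            item = q_in.get()
            if item is None:            # shutdown sentinel: pass it along
                q_out.put(None)
                break
            idx, y = item
            for layer in sublayers:     # run only this sub-network's layers
                y = layer(y)
            q_out.put((idx, y))
    return threading.Thread(target=run)

def pipelined_inference(subnetworks, inputs):
    """Each sub-network gets its own engine; inputs flow through the
    pipeline so several inputs are in flight simultaneously."""
    queues = [queue.Queue() for _ in range(len(subnetworks) + 1)]
    threads = [make_stage(sub, queues[i], queues[i + 1])
               for i, sub in enumerate(subnetworks)]
    for t in threads:
        t.start()
    for i, x in enumerate(inputs):      # feed all inputs without waiting
        queues[0].put((i, x))
    queues[0].put(None)
    results = {}
    while True:
        item = queues[-1].get()
        if item is None:
            break
        idx, y = item
        results[idx] = y
    for t in threads:
        t.join()
    return [results[i] for i in range(len(inputs))]

# Toy split: three sub-networks of one layer each.
subnets = [[lambda v: v + 1], [lambda v: v * 2], [lambda v: v - 3]]
print(pipelined_inference(subnets, [1, 2, 3]))  # -> [1, 3, 5]
```

The output matches the serial scheme, but while input 2 is in the first stage, input 1 can already be in the second, which is the source of the throughput gain the abstract claims.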

Description

Technical Field

[0001] The present application relates to the technical field of parallel computing, and more specifically to a neural network forward inference method, device, equipment, and storage medium.

Background

[0002] Forward inference of a neural network refers to creating an inference instance and an inference engine on an inference platform for the neural network to be inferred; the inference engine operates on each layer of the network based on the input data of the network's input layer and the inference instance.

[0003] The current inference scheme is: create an inference instance for the neural network to be inferred, and create an inference engine within that instance; the engine receives input data and operates on each layer of the entire network in sequence based on the instance. That is, the operations of one input across different layers are strictly serialized, and differe...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N5/04
Inventors: 刘凯, 吕亚飞, 张致江, 李必然, 刘远东
Owner: IFLYTEK CO LTD