
Neural network model training apparatus and method

A neural network model training apparatus and method, applied in the field of neural networks. The technique addresses the problem of unsatisfactory convergence speed and achieves a low per-iteration cost.

Inactive Publication Date: 2016-12-07
FUJITSU LTD
Cites: 0 | Cited by: 6

AI Technical Summary

Problems solved by technology

However, the convergence speed of these methods is still unsatisfactory when applied to training neural networks, especially when training large-scale neural networks.




Embodiment Construction

[0019] Examples of the present disclosure will now be described more fully with reference to the accompanying drawings. The following description is merely exemplary in nature and is not intended to limit the disclosure, application or uses.

[0020] Example embodiments are provided so that this disclosure will be thorough, and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms, and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known structures, and well-known technologies are not described in detail.

[0021] Figure 1 is an example of a neural network ...



Abstract

The invention relates to a neural network model training apparatus and method. The apparatus comprises an iteration calculation unit for iteratively calculating the weights of paths in a neural network model, and a determining and output unit for stopping the iteration when a stopping condition is satisfied and outputting the path weights of the current iteration as the final weights. The iteration calculation unit comprises: a weight calculation unit for calculating the path weights of the current iteration; a correlation function calculation unit for randomly selecting one sample from the sample set used to train the neural network model, calculating the correlation function of the selected sample according to the path weights of the current iteration, and keeping the correlation functions of the other samples in the sample set unchanged, where the correlation function of a sample is a function associated with that sample's loss function; and a total correlation function calculation unit for calculating the total correlation function of the current iteration from the correlation functions of the selected sample and the other samples.
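The abstract leaves the "correlation function" abstract. A minimal sketch of the described loop, under the assumption (not stated in the patent) that the correlation function is instantiated as the per-sample gradient of a squared loss: each iteration refreshes the stored quantity for one randomly chosen sample, leaves the others unchanged, forms the total incrementally, and stops when a convergence condition holds. All names and the learning rate here are illustrative, not from the patent.

```python
import numpy as np

def train_incremental(X, y, lr=0.1, tol=1e-6, max_iter=50000, seed=0):
    """Sketch of the described apparatus: per-sample 'correlation functions'
    (here: per-sample gradients of a squared loss) are stored; each iteration
    updates only the randomly selected sample's entry, and the weight update
    uses the total over all samples."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    per_sample = np.zeros((n, d))    # stored correlation function of each sample
    total = per_sample.sum(axis=0)   # total correlation function
    for _ in range(max_iter):
        i = rng.integers(n)
        # recompute only the selected sample's correlation function
        g_new = (X[i] @ w - y[i]) * X[i]
        total += g_new - per_sample[i]   # other samples stay unchanged
        per_sample[i] = g_new
        w_next = w - lr * total / n
        if np.linalg.norm(w_next - w) < tol:   # stop-iteration condition
            return w_next                      # final weights
        w = w_next
    return w
```

Because only one sample's correlation function is recomputed per iteration, the per-iteration cost is independent of the sample-set size, which matches the "low iteration cost" effect claimed above.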

Description

Technical Field

[0001] The present disclosure relates to the technical field of neural networks, and in particular to a neural network model training apparatus and method.

Background

[0002] This section provides background information related to the present disclosure which is not necessarily prior art.

[0003] Among the technical solutions for training neural networks, researchers have recently proposed several near-incremental gradient methods, including MISO (Minimization by Incremental Surrogate Optimization), Prox-SDCA (Proximal Stochastic Dual Coordinate Ascent), Prox-SVRG (Proximal Stochastic Variance Reduced Gradient), and SAG (Stochastic Average Gradient), all of which achieve a linear convergence rate. However, the convergence speed of these methods is still unsatisfactory when applied to training neural networks, especially large-scale neural networks.
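To make the background concrete, here is a minimal sketch of one of the cited methods, SVRG (here without the proximal step of Prox-SVRG), for a squared loss. The outer loop takes a full-gradient snapshot; the inner loop takes stochastic steps whose variance is reduced by that snapshot. The loss, step size, and epoch counts are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

def svrg(X, y, lr=0.05, n_epochs=200, inner=None, seed=0):
    """Minimal SVRG sketch for squared loss (illustrative only)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    inner = inner or n
    w = np.zeros(d)
    for _ in range(n_epochs):
        w_snap = w.copy()
        full_grad = X.T @ (X @ w_snap - y) / n   # full-gradient snapshot
        for _ in range(inner):
            i = rng.integers(n)
            g_i = (X[i] @ w - y[i]) * X[i]           # gradient at current w
            g_snap_i = (X[i] @ w_snap - y[i]) * X[i] # gradient at snapshot
            # variance-reduced step: unbiased, with variance that shrinks
            # as w approaches the snapshot
            w = w - lr * (g_i - g_snap_i + full_grad)
        # (common variants instead reset w to an average of inner iterates)
    return w
```

The variance reduction is what gives this family its linear convergence rate on smooth strongly convex problems; the patent's stated motivation is that even this rate is insufficient for large-scale neural network training.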

Claims


Application Information

IPC(8): G06N3/02
Inventors: 石自强, 刘汝杰
Owner FUJITSU LTD