
Learning system and learning method

A learning system and learning method, applied in the field of learning systems, which can solve the problems of over-specialization to the features of the learning data and over-learning (over-fitting)

Inactive Publication Date: 2018-07-03
DENSO IT LAB +1
Cites: 0 · Cited by: 1

AI Technical Summary

Problems solved by technology

CNN models have high expressive ability, but on the other hand it has been pointed out that CNN suffers from a problem called "over-learning" (over-fitting), namely excessive specialization to the features of the learning data.

Method used


Image

  • Learning system and learning method (patent drawings)

Examples


First Embodiment

[0067] Distributed methods can be classified according to: (1) "what" is communicated, (2) "with whom" communication occurs, and (3) "when" communication occurs.

[0068] First, from the perspective of "what" is communicated, there are the "model-parallel" and "data-parallel" methods. In the model-parallel method, the model itself is distributed across computers, and the intermediate variables of the neural network are communicated. In the data-parallel method, the computation of the model is contained within a single computer, and the differential (gradient) values calculated by each computer are communicated.

[0069] In the data-parallel method, each computer processes different data, so a large amount of data can be processed at once. Assuming the mini-batch stochastic gradient method, it is natural to split the data of a mini-batch across the computers, so data parallelism is the main assumption in this specification.
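The data-parallel scheme in [0068]-[0069] can be sketched as follows: a mini-batch is split across workers, each worker computes a gradient on its shard, and the averaged gradient drives one parameter update. This is a minimal single-process simulation, not the patent's implementation; the linear model, the loss, and all function names are assumptions for illustration only.

```python
import numpy as np

def worker_gradient(w, X_shard, y_shard):
    # Gradient of mean squared error for a hypothetical linear model
    # y = X @ w, computed on one worker's shard of the mini-batch.
    err = X_shard @ w - y_shard
    return 2.0 * X_shard.T @ err / len(y_shard)

def data_parallel_sgd_step(w, X, y, n_workers, lr):
    # Split one mini-batch across the workers; each computes a local gradient.
    shards = zip(np.array_split(X, n_workers), np.array_split(y, n_workers))
    grads = [worker_gradient(w, Xs, ys) for Xs, ys in shards]
    # The averaged (communicated) gradients drive a single parameter update.
    return w - lr * np.mean(grads, axis=0)
```

With equal-sized shards, the averaged shard gradients equal the gradient on the whole mini-batch, so the distributed step matches the single-machine step exactly.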

[0070] Model parallelism is very useful when dealing with huge neural ...

Second Embodiment

[0114] Improvements to the first embodiment will be described below. The absolute value of the differential value is sometimes large in the early stage of learning, so parameter updates tend to become unstable there. Specifically, if the change caused by each update is too large, the decrease of the objective function may slow down, or the value of the objective function may even diverge.

[0115] Empirically, this instability is specific to the early stage of learning and is not a problem in the middle or latter stages. To reduce the instability in the early stage of learning, it is preferable to adopt the following methods.

[0116] In a first example, when the parameter update unit 3 updates the parameters using the value obtained by multiplying the differential value by the learning coefficient η, the learning coefficient η can be set to a smaller value in the early stage of learning, and it can be incre...
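The first example in [0116] describes what is commonly called learning-rate warmup: start η small and increase it over the first updates. A minimal sketch is below; the linear ramp shape, the parameter names, and the default values are assumptions, not taken from the patent.

```python
def warmup_lr(step, base_lr=0.1, warmup_steps=1000):
    """Learning coefficient eta as a function of the update step.

    Linearly ramps eta from a small value up to base_lr over the first
    warmup_steps updates, then holds it constant. This stabilizes the
    early stage of learning, when gradients tend to be large.
    """
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    return base_lr
```

The parameter update then uses `w -= warmup_lr(step) * grad`, so early updates are damped while later updates proceed at the full rate.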


PUM

No PUM

Abstract

The present invention provides a learning system and a learning method that update the parameters of a neural network at high speed. The learning system includes a plurality of differential value calculators and a parameter update module that updates the parameters of the neural network. The differential value calculators, asynchronously with respect to one another, each perform the following operations: receiving the parameters at a certain point in time from the parameter update module; calculating a differential value for updating the parameters based on the received parameters; and sending the differential value to the parameter update module. The parameter update module performs the following operations: receiving the differential values from the plurality of differential value calculators asynchronously; updating the parameters according to the received differential values; and sending the updated parameters to the plurality of differential value calculators. When calculating a differential value, each differential value calculator takes into account an elapsed time expressed as an update count, that is, the number of times the parameters are updated by the parameter update module in the period from when the calculator receives the parameters until its calculated differential value is used to update them.
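The asynchronous scheme in the abstract can be illustrated with a single-process simulation: a version counter on the parameter update module measures how many updates elapsed between a calculator reading the parameters and its gradient being applied (the "staleness"). The 1/(1 + staleness) damping rule below is one common way to account for this elapsed update count; it is an illustrative assumption, not the patent's claimed formula, and the class and method names are hypothetical.

```python
class ParameterUpdateModule:
    """Minimal simulation of an asynchronous parameter update module.

    Real deployments run the differential value calculators on separate
    machines; here the staleness bookkeeping is shown in one process.
    """

    def __init__(self, w, lr):
        self.w = w
        self.lr = lr
        self.version = 0  # how many times w has been updated so far

    def get_params(self):
        # A calculator reads the current parameters and their version.
        return self.w, self.version

    def apply(self, grad, version_at_read):
        # Staleness = updates that happened since the calculator's read.
        staleness = self.version - version_at_read
        eta = self.lr / (1 + staleness)  # damp stale gradients (assumed rule)
        self.w = self.w - eta * grad
        self.version += 1
```

A fresh gradient (staleness 0) is applied at the full rate, while a gradient computed against an older parameter version is scaled down before being applied.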

Description

technical field

[0001] The present invention relates to a learning system and a learning method for updating parameters used in a neural network.

Background technique

[0002] In the field of image recognition, there is a problem known as general object recognition: estimating the type (class) of objects, such as birds and cars, present in an image. In recent years, the recognition performance on the general object recognition problem has improved significantly, largely owing to the convolutional neural network (Convolutional Neural Network, hereinafter referred to as CNN; see, for example, Non-Patent Document 1).

[0003] Various recognition algorithms have been proposed in the field of image recognition, but as the learning data (combinations of input data and correct answers) grows larger, CNN tends to exceed the recognition performance of the other algorithms. CNN models have high expressive ability, but on the o...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Application (China)
IPC(8): G06N3/04
CPC: G06N3/045; G06N3/084; G06N3/08; G06F17/10
Inventor: 佐藤育郎, 藤崎亮, 野村哲弘, 大山洋介, 松冈聪
Owner DENSO IT LAB