Learning system and learning method
A learning system and a differentiation technique, applied in the field of learning systems, intended to solve the problem of over-learning caused by excessive specialization to the features of the learning data
Inactive Publication Date: 2018-07-03
DENSO IT LAB +1
Problems solved by technology
The CNN model has high expressive ability, but on the other hand it has been pointed out that CNN suffers from a problem called "over-learning", in which the model specializes excessively to the characteristics of the learning data.
Method used
First Embodiment
[0067] Distributed methods can be classified according to: (1) "what" to communicate, (2) "with whom" to communicate, and (3) "when" to communicate.
[0068] First, from the perspective of "what" is communicated, there are the model parallel and data parallel methods. In the model parallel method, the model itself is divided among the computers, and the intermediate variables of the neural network are communicated. In the data parallel method, the computation of the model is contained within a single computer, and the differential values calculated by each computer are communicated.
[0069] In the data parallel method, each computer processes different data, so a large amount of data can be processed at once. Since mini-batch stochastic gradient descent is assumed here, it is natural to parallelize over the data within a mini-batch, so data parallelism is the main assumption in this specification.
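The data-parallel scheme described above can be sketched as follows. This is a minimal single-process simulation, not the patent's implementation: the model, loss, and shard layout are illustrative assumptions. Each simulated "computer" holds the same parameters, processes its own shard of the mini-batch, and only the differential (gradient) values are communicated and averaged centrally.

```python
import numpy as np

def grad_linear_mse(w, X, y):
    """Gradient of mean squared error for a linear model y ~ X @ w."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
w = np.zeros(3)                       # shared parameters
X = rng.normal(size=(8, 3))           # one mini-batch of 8 examples
y = X @ np.array([1.0, -2.0, 0.5])    # synthetic targets

# Split the mini-batch across two simulated workers (equal shards).
shards = np.array_split(np.arange(8), 2)
local_grads = [grad_linear_mse(w, X[idx], y[idx]) for idx in shards]

# Communicate only the differential values; average them centrally.
g = np.mean(local_grads, axis=0)
w_new = w - 0.1 * g                   # one SGD step, learning rate 0.1

# With equal-size shards, the averaged shard gradients equal the
# full-batch gradient, so the update matches single-machine SGD.
assert np.allclose(g, grad_linear_mse(w, X, y))
```

The design point this illustrates is that only gradient vectors cross the (simulated) communication boundary, which is what lets each computer keep the whole model locally.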
[0070] Model parallelism is very useful when dealing with huge neural ...
Second Embodiment
[0114] Improvements to the first embodiment will be described below. The absolute value of the differential value is sometimes large in the early stage of learning. As a result, parameter updates tend to become unstable early in learning. Specifically, if the parameters change too much per update, the objective function may decrease only slowly, or its value may diverge.
[0115] Empirically, this instability is specific to the early stage of learning and is not a problem in the middle or latter half of learning. To mitigate the instability in the early stage, it is preferable to adopt one of the following methods.
[0116] In a first example, when the parameter update unit 3 updates the parameters using the product of the differential value and the learning coefficient η, the learning coefficient η is set to a smaller value in the early stage of learning and is incre...
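The first example above amounts to a learning-rate warm-up. The sketch below shows one common schedule shape, a linear ramp; the constants and the linear form are our assumptions for illustration, not taken from the patent, which only specifies that η starts small early in learning and then increases.

```python
def eta(step, eta_max=0.1, warmup_steps=100):
    """Learning coefficient: ramp linearly toward eta_max over the
    first warmup_steps updates, then hold it constant."""
    if step < warmup_steps:
        frac = (step + 1) / warmup_steps   # in (0, 1]
        return eta_max * frac
    return eta_max

# Early steps use a small coefficient, damping the large differential
# values seen at the start; later steps use the full coefficient.
early, late = eta(0), eta(500)
```

Because the parameter update is the product of η and the differential value, shrinking η early bounds the update size even when the differential value itself is large.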
Abstract
The present invention provides a learning system and a learning method that update the parameters of a neural network at high speed. The learning system includes a plurality of differential value calculators and a parameter update module that updates the parameters of the neural network. The differential value calculators, asynchronously with respect to one another, each perform the following operations: receiving the parameters at a certain point in time from the parameter update module; calculating a differential value for updating the parameters based on the received parameters; and sending the differential value to the parameter update module. The parameter update module performs the following operations: receiving the differential values from the plurality of differential value calculators; updating the parameters based on the received differential values, asynchronously with respect to the calculators; and sending the updated parameters to the plurality of differential value calculators. When calculating a differential value, each differential value calculator takes into account the elapsed time expressed as an update count: the number of times the parameters are updated by the parameter update module between the point at which the calculator receives the parameters and the point at which its calculated differential value is used to update the parameters.
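The asynchronous scheme in the abstract can be simulated in a single process as follows. Workers fetch parameters, compute a differential value on a possibly stale copy, and the update module applies updates as they arrive. The staleness handling shown, scaling the step by 1/(1+s) where s counts updates applied since the worker read the parameters, is one common staleness-aware variant assumed for illustration; the patent's exact correction rule is not reproduced here.

```python
def grad(w):
    """Gradient of the objective (w - 3)^2."""
    return 2.0 * (w - 3.0)

w = 0.0          # parameter held by the update module
version = 0      # how many updates have been applied so far
eta = 0.1

# Two workers read the same parameters at the same moment; each pending
# job records the parameter value and version it saw at read time.
pending = [(w, version), (w, version)]

for w_read, v_read in pending:
    g = grad(w_read)                  # differential value on a stale copy
    staleness = version - v_read      # updates applied since the read
    w -= eta * g / (1 + staleness)    # damp updates from stale reads
    version += 1
```

The second job arrives one update late (staleness 1), so its step is halved, which is the kind of elapsed-time-aware adjustment the abstract attributes to the differential value calculators.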
Description
Technical Field
[0001] The invention relates to a learning system and a learning method for updating the parameters used in a neural network.
Background Technology
[0002] In the field of image recognition there is a problem known as general object recognition: estimating the type (class) of objects, such as birds and cars, present in an image. In recent years, recognition performance on the general object recognition problem has improved significantly, largely due to the convolutional neural network (Convolutional Neural Network, hereinafter CNN; see, for example, Non-Patent Document 1).
[0003] Various recognition algorithms have been proposed in the field of image recognition, but as the learning data (combinations of input data and correct answers) grows larger, CNN tends to exceed the recognition performance of the other algorithms. The CNN model has high expressive ability, but on the o...
Claims
Application Information
Patent Timeline
Application Date: The date an application was filed.
Publication Date: The date a patent or application was officially published.
First Publication Date: The earliest publication date of a patent with the same application number.
Issue Date: The publication date of the patent grant document.
PCT Entry Date: The date of entry into the PCT national phase.
Estimated Expiry Date: The statutory expiry date of the patent right under the applicable patent law; this is the longest term of protection the patent can achieve, assuming the right is not terminated early for other reasons (term extensions are taken into account).
Invalid Date: The actual expiry date, based on the effective date or publication date of the legal-transaction data for an invalidated patent.