
Neural network model training method and device

A neural network model training method and device, applied to continual learning scenarios, which addresses problems such as the inability to obtain the input data of previous tasks and the inability to comprehensively extract features, achieving the effect of improved classification accuracy.

Publication Date: 2021-08-06 (Pending)
Applicant: BEIJING SAMSUNG TELECOM R&D CENT +1

AI Technical Summary

Problems solved by technology

However, the weight regularization method cannot access the input data of previous tasks at all, while the representation (expression) regularization method can retain only a small number of input data samples from previous tasks, and sufficient features cannot be extracted from such a small number of samples to support learning new tasks.
The inability to extract features comprehensively enough to meet the needs of learning new tasks is therefore one of the limitations of current class-incremental learning.



Embodiment Construction

[0040] To make the objects, technical solutions, and advantages of the present disclosure clearer, exemplary embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present disclosure, and it should be understood that the present disclosure is not limited by the exemplary embodiments described here.

[0041] The solutions provided in the embodiments of the present application relate to neural-network-based classification techniques in the field of artificial intelligence and are described in detail through the following embodiments. It should be noted that although the following embodiments are described in the context of a classification task in an image recognition scenario, the application scenario of the present disclosure is not limited thereto and may als...



Abstract

The invention relates to a method of training a neural network model for image recognition. The method comprises: in a first training stage, calculating a first loss function for a first set of input image samples and training the neural network model with the first loss function, where calculating the first loss function comprises: (1) for each input image sample in the first set, calculating a classification term corresponding to the difference between the predicted classification label and the true classification label of the input image sample; and (2) for each input image sample in the first set, calculating a reconstruction error term corresponding to the difference between the input image sample and its reconstructed image sample.
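
As a concrete illustration of the first-stage loss described above, here is a minimal sketch in PyTorch, assuming a model with an encoder, a decoder, and a classifier head. The AutoencoderClassifier class, the network sizes, and the recon_weight balancing factor are illustrative assumptions and are not taken from the patent; the sketch only shows the two terms named in the abstract: a classification term (cross-entropy between predicted and true labels) and a reconstruction error term (mean squared error between each input image and its reconstruction).

```python
# Minimal sketch of the two-term first-stage loss: classification term +
# reconstruction error term. The model structure and "recon_weight" are
# illustrative assumptions, not details taken from the patent.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AutoencoderClassifier(nn.Module):
    """Hypothetical model with an encoder, a decoder, and a classifier head."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)
        )

    def forward(self, x):
        z = self.encoder(x)                         # shared features
        return self.classifier(z), self.decoder(z)  # logits, reconstruction

def first_stage_loss(model, images, labels, recon_weight=1.0):
    """Classification term + weighted reconstruction error term for one batch."""
    logits, reconstructed = model(images)
    classification_term = F.cross_entropy(logits, labels)    # predicted vs. true labels
    reconstruction_term = F.mse_loss(reconstructed, images)  # input vs. its reconstruction
    return classification_term + recon_weight * reconstruction_term

# Illustrative usage: one training step of the first stage.
model = AutoencoderClassifier(num_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
images = torch.randn(8, 3, 32, 32)           # a first set of input image samples
labels = torch.randint(0, 10, (8,))          # their true classification labels
optimizer.zero_grad()
loss = first_stage_loss(model, images, labels)
loss.backward()
optimizer.step()
```

In this sketch the reconstruction term simply pushes the shared encoder to retain information about the input beyond what the current classification task needs, and recon_weight is a hypothetical hyperparameter that balances the two terms.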

Description

Technical Field

[0001] The present disclosure relates to continual learning scenarios in the field of artificial intelligence. More specifically, the present disclosure relates to a training method and device for a neural network model for image recognition.

Background

[0002] Traditional machine learning is performed for a fixed task; that is, the dataset used to train the learning model contains training data with a fixed distribution. When a new dataset (i.e., a dataset containing training data whose distribution differs from that fixed distribution) is input, the learning model generally needs to be retrained. After retraining, the learning model can only handle the new dataset and can no longer handle the original dataset (i.e., the dataset containing data of the original, fixed categories). This problem is called "catastrophic forgetting" in machine learning. In fact, catastrophic forgetting is the result of the "Stabilit...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/04G06N3/08G06K9/62
CPCG06N3/08G06N3/045G06F18/241
Inventor 戴彬林宙辰
Owner BEIJING SAMSUNG TELECOM R&D CENT