Neural network training method and device, and computer readable storage medium

A neural network training technology, applied in the field of neural network training methods and devices and computer-readable storage media, capable of solving problems such as the mutual incompatibility of existing few-shot learning approaches.

Pending Publication Date: 2021-12-24
SHANGHAI SENSETIME INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] However, although the two existing types of few-shot learning methods are very similar in their optimization processes, each relies on its own implementation; they are not compatible with each other and each has limitations.

Detailed Description of the Embodiments

[0078] In order to make the purpose, technical solutions and advantages of the present disclosure clearer, the present disclosure is further described in detail below with reference to the accompanying drawings. All other embodiments obtained by persons of ordinary skill in the art based on the described embodiments without creative effort fall within the protection scope of the present disclosure.

[0079] In the following description, references to "some embodiments" describe a subset of all possible embodiments. It should be understood that "some embodiments" may refer to the same subset or to different subsets of all possible embodiments, and that these subsets may be combined with one another where no conflict arises.

[0080] In the following description, the terms "first", "second" and "third" are used only to distinguish similar objects and do not denote a particular ordering of the objects. It should be understood that, where permitted, "first", "second" and "third" may be interchanged in a specific order or sequence, so that the embodiments of the disclosure described herein can be practiced in an order other than that illustrated or described herein.

Abstract

Embodiments of the invention provide a neural network training method and device and a computer-readable storage medium. The method comprises: based on initial meta-knowledge, the initial task knowledge of each task and a task loss function, iteratively training an initial network with samples of each task in a first data set to obtain the task knowledge of each task, wherein the first data set is a source-domain data set containing initial task categories; based on the task knowledge of each task, the initial meta-knowledge and a meta-loss function, iteratively training the initial network with samples corresponding to all tasks in the first data set to obtain optimal meta-knowledge; and based on the optimal meta-knowledge, the task knowledge of each task and the task loss function, iteratively training the network corresponding to the optimal meta-knowledge with samples of each task in a second data set to obtain the optimal task knowledge of each task and a target network corresponding to the optimal task knowledge. The embodiments improve the generalization of training with a small number of samples.
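
To make the three stages of the abstract concrete, below is a minimal PyTorch-style sketch. It is an illustration under assumptions, not the patented implementation: a shared backbone stands in for the meta-knowledge, a per-task head stands in for the task knowledge, and cross-entropy is used as a placeholder for both the task loss and the meta-loss; all names (Backbone, TaskHead, train_task_knowledge, train_meta_knowledge) are hypothetical.

```python
# Hypothetical sketch of the three training stages summarized in the abstract.
# Assumptions: meta-knowledge = shared backbone weights, task knowledge =
# per-task head weights, task loss = meta-loss = cross-entropy.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Backbone(nn.Module):
    """Shared parameters standing in for the (initial/optimal) meta-knowledge."""
    def __init__(self, in_dim, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)


class TaskHead(nn.Module):
    """Per-task parameters standing in for that task's task knowledge."""
    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_classes)

    def forward(self, feats):
        return self.fc(feats)


def train_task_knowledge(backbone, head, batches, lr=1e-2):
    """Stages 1 and 3: iterate over one task's samples and update only the head
    (task knowledge) with the task loss, keeping the meta-knowledge fixed."""
    opt = torch.optim.SGD(head.parameters(), lr=lr)
    for x, y in batches:
        loss = F.cross_entropy(head(backbone(x)), y)  # task loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    return head


def train_meta_knowledge(backbone, heads, task_batches, lr=1e-3):
    """Stage 2: iterate over samples of all tasks and update only the backbone
    (meta-knowledge) with a meta-loss aggregated across tasks."""
    opt = torch.optim.SGD(backbone.parameters(), lr=lr)
    for batches in zip(*task_batches):  # one batch per task at each step
        meta_loss = sum(
            F.cross_entropy(heads[i](backbone(x)), y)
            for i, (x, y) in enumerate(batches)
        )
        opt.zero_grad()
        meta_loss.backward()
        opt.step()
    return backbone
```

In a typical use of this sketch, stages 1 and 2 would run over the source-domain tasks of the first data set; stage 3 would then call train_task_knowledge again with a fresh head for each few-shot task in the second data set, the resulting backbone plus tuned head playing the role of the target network.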

Description

Technical Field

[0001] The present disclosure relates to computer vision technology, and in particular to a neural network training method and device and a computer-readable storage medium.

Background

[0002] Few-shot learning is one of the important emerging research areas in computer vision. With the rapid development of Convolutional Neural Networks (CNNs), many CNN-based object detectors have achieved great success. Few-shot learning aims to develop the capability of deep learning models in scenarios where only a few samples are available. The main few-shot learning methods are meta-learning models based on episode training (see the sampling sketch below) and transfer learning methods based on pre-training and fine-tuning.

[0003] However, although the two existing types of few-shot learning methods are very similar in their optimization processes, each relies on its own implementation; they are not compatible with each other and each has limitations.

Contents of the Invention

[000...
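
For context on the episode-based training mentioned in the background above, the sketch below shows one typical way an N-way K-shot episode is sampled from source-domain data. The function name and default parameters are assumptions for illustration, not details taken from the patent.

```python
# Minimal sketch of N-way K-shot episode sampling, as commonly used by
# episode-based meta-learning methods; illustrative only, not from the patent.
import random
from collections import defaultdict


def sample_episode(dataset, n_way=5, k_shot=1, q_query=15):
    """dataset: iterable of (sample, label) pairs.
    Returns a support set (n_way * k_shot items) and a query set
    (n_way * q_query items), each a list of (sample, label) pairs."""
    by_class = defaultdict(list)
    for x, y in dataset:
        by_class[y].append(x)

    classes = random.sample(list(by_class), n_way)  # pick N classes
    support, query = [], []
    for c in classes:
        picks = random.sample(by_class[c], k_shot + q_query)
        support += [(x, c) for x in picks[:k_shot]]  # K "shots" per class
        query += [(x, c) for x in picks[k_shot:]]    # held-out query samples
    return support, query
```

In episode-based training, the model is adapted on each support set and evaluated on the corresponding query set, and this adapt-then-evaluate loop is repeated over many episodes; the pre-train/fine-tune alternative instead trains once on all source data and then fine-tunes on the few available target samples.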


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08, G06N3/04
CPC: G06N3/08, G06N3/045
Inventors: 林少波, 曾星宇, 陈大鹏, 赵瑞
Owner: SHANGHAI SENSETIME INTELLIGENT TECH CO LTD