
Deep neural network training method and device, electronic equipment and storage medium

A deep neural network training method, device, electronic equipment and storage medium. The method addresses the limited improvement in neural network classification or recognition results that occurs when the training samples contain noisy or weakly discriminative samples; it adaptively raises the importance of hard-to-distinguish samples, improving accuracy and avoiding their misclassification.

Active Publication Date: 2019-05-28
BEIJING SANKUAI ONLINE TECH CO LTD
Cites: 3 · Cited by: 25

AI Technical Summary

Problems solved by technology

However, through research on prior-art neural networks that use Center loss as the loss function, the applicant found that when the training samples contain noisy or weakly discriminative samples, a model trained with the existing loss function yields only limited improvement in the trained network's classification or recognition results.
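For context, Center loss (introduced at ECCV 2016) augments the Softmax cross-entropy with a penalty on the distance between each sample's deep feature and its class centre. The sketch below is a minimal numpy illustration of that combined loss, not code from the patent; all names and the `lam` weight are illustrative assumptions.

```python
import numpy as np

def softmax_center_loss(features, logits, labels, centers, lam=0.5):
    """Softmax cross-entropy plus Center loss (illustrative sketch).

    features: (N, D) deep features from the last hidden layer
    logits:   (N, C) classifier outputs
    labels:   (N,)   integer class labels
    centers:  (C, D) per-class feature centres (learnable in practice)
    lam:      weight of the centre term
    """
    # Softmax cross-entropy, numerically stabilised.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(labels)), labels].mean()

    # Center loss: half the mean squared distance to the class centre.
    diffs = features - centers[labels]
    center = 0.5 * (diffs ** 2).sum(axis=1).mean()

    return ce + lam * center
```

When every feature already coincides with its class centre, the centre term vanishes and only the cross-entropy remains.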



Examples


Embodiment 1

[0024] This embodiment discloses a deep neural network training method. As shown in Figure 1, the method includes step 110 and step 120.

[0025] Step 110, acquiring several training samples with preset class labels.

[0026] Before training the neural network, it is first necessary to obtain several training samples with preset class labels.

[0027] The form of the training samples depends on the specific application scenario. For example, in a work-clothes recognition application, the training samples are images of work clothes; in a speech recognition scenario, each training sample is a segment of audio.

[0028] The category labels of the training samples depend on the output of the specific recognition task. Taking a neural network trained for work-clothes recognition as an example, according to the specific recognition task output, the categories of training samples can include category...
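The pairing of samples with preset category labels described above can be pictured as follows. This is a hypothetical illustration only; the file paths and label names are made up, and the real system would load actual image, text, or audio data.

```python
# Hypothetical label set for a work-clothes recognition task.
label_names = ["meituan_waimai", "eleme", "baidu_waimai", "other"]
label_to_index = {name: i for i, name in enumerate(label_names)}

# Each training sample is a (data, label_index) pair; here
# file paths stand in for the actual work-clothes images.
training_samples = [
    ("images/rider_001.jpg", label_to_index["meituan_waimai"]),
    ("images/rider_002.jpg", label_to_index["eleme"]),
]
```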

Embodiment 2

[0045] Based on the first embodiment, this embodiment discloses an optimization scheme of a deep neural network training method.

[0046] In specific implementation, after obtaining several training samples with preset category labels, a neural network is first constructed. In this embodiment, ResNet50 (a residual network) is still used as the base network, and the neural network includes multiple feature extraction layers. In the forward propagation stage, the forward function of each feature extraction layer (such as the fully connected layer) is called in turn to obtain the output layer by layer. The output of the last layer is compared with the target to obtain the loss function, and the error update values are calculated. Backpropagation then proceeds layer by layer back to the first layer, and all parameters are updated together at the end of backpropagation. The last feature extraction layer uses the extracted fea...
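The forward/backward/update cycle described above can be sketched end to end with a toy model. The sketch below uses a single fully connected layer with softmax cross-entropy in place of ResNet50 (which is far too large to reproduce here); all sizes and the learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8 samples, 4 features, 3 classes.
X = rng.normal(size=(8, 4))
y = rng.integers(0, 3, size=8)
W = rng.normal(scale=0.1, size=(4, 3))
b = np.zeros(3)
lr = 0.1

def forward(X, W, b):
    """Forward pass: fully connected layer followed by softmax."""
    logits = X @ W + b
    shifted = logits - logits.max(axis=1, keepdims=True)
    return np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

losses = []
for _ in range(200):
    probs = forward(X, W, b)
    # Compare the last layer's output with the target: cross-entropy loss.
    loss = -np.log(probs[np.arange(len(y)), y]).mean()
    losses.append(loss)
    # Backpropagation: d(loss)/d(logits) = (probs - one_hot) / N.
    grad_logits = probs.copy()
    grad_logits[np.arange(len(y)), y] -= 1.0
    grad_logits /= len(y)
    # Update all parameters together at the end of the backward pass.
    W -= lr * (X.T @ grad_logits)
    b -= lr * grad_logits.sum(axis=0)
```

A deep network repeats the same pattern per layer, chaining gradients backward from the loss to the first layer.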

Embodiment 3

[0062] The embodiment of the present application also discloses a deep neural network training method applied to classification. As shown in Figure 3, the method includes steps 310 to 370.

[0063] Step 310, acquiring several training samples with preset class labels.

[0064] In the specific implementation of the present application, the training samples include any one of the following: images, text, and speech. For different objects to be classified, it is necessary to obtain training samples of the corresponding objects for the neural network model. This embodiment takes training a neural network model for work-clothes recognition as an example. First, work-clothes images with different platform labels are obtained, such as work-clothes images with the Meituan Waimai platform label, work-clothes images with the Ele.me platform label, and work-clothes images with the Baidu Waimai platform...



Abstract

The invention discloses a deep neural network training method, belonging to the technical field of computers, for solving the prior-art problem of relatively low performance of trained neural networks in complex scenes. The method comprises the following steps: obtaining a plurality of training samples with preset category labels, and training a neural network model based on these samples, wherein the loss function of the neural network model performs a weighting operation using a first weight proportional to the distinguishing difficulty of each training sample to determine the model's loss value. The deep neural network training method disclosed by the embodiment of the invention adaptively raises the importance of the harder-to-distinguish training samples, so that such samples are less likely to be misclassified by the trained neural network, improving the network's performance.
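The abstract's core idea, weighting each sample's loss by a first weight proportional to its distinguishing difficulty, can be sketched as follows. The patent text here does not give the exact weight formula, so this sketch assumes one plausible instantiation: difficulty measured as one minus the predicted probability of the true class, so that confidently classified (easy) samples are down-weighted and hard samples dominate the loss.

```python
import numpy as np

def difficulty_weighted_loss(probs, labels):
    """Per-sample cross-entropy weighted in proportion to distinguishing difficulty.

    probs:  (N, C) predicted class probabilities
    labels: (N,)   integer class labels

    Difficulty is taken as (1 - p_true): a sample the model already
    classifies confidently is easy and receives a small weight, while a
    hard sample receives a large one. This is an illustrative assumption,
    not the patent's published formula.
    """
    p_true = probs[np.arange(len(labels)), labels]
    weights = 1.0 - p_true                        # first weight ∝ difficulty
    per_sample = -np.log(np.clip(p_true, 1e-12, None))
    return (weights * per_sample).mean()
```

Under this weighting, a batch dominated by hard samples yields a larger loss, and hence larger gradients, than one of easy samples, which is the adaptive-importance effect the abstract describes.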

Description

Technical field

[0001] The present application relates to the field of computer technology, and in particular to a deep neural network training method, device, electronic equipment and storage medium.

Background technique

[0002] In recent years, deep learning has made remarkable progress in the field of pattern recognition; its key factors include rich and flexible network models, strong computing power, and better adaptability to big-data processing. As neural networks are applied to different tasks, improving the neural network model is a key research issue for those skilled in the art. Prior-art improvements to neural network models mainly focus on two aspects: network structure and loss function. Among the loss functions commonly used in classification model training are Softmax loss and Center loss, an improvement on Softmax presented at the top international conference ECCV in 2016. However, the applicant found through research on...

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
Inventor: 柴振华 (Chai Zhenhua), 孟欢欢 (Meng Huanhuan)
Owner BEIJING SANKUAI ONLINE TECH CO LTD