Deep neural network training method and device, electronic equipment and storage medium

A deep neural network training technology, applied in the field of deep neural network training methods, devices, electronic equipment, and storage media. It can solve the problem of the limited improvement of neural network classification or recognition results, so as to improve the importance given to hard-to-distinguish samples and the accuracy of the network, and to avoid misclassification.

Active Publication Date: 2019-05-28
BEIJING SANKUAI ONLINE TECH CO LTD

AI Technical Summary

Problems solved by technology

However, the applicant found, through research on prior-art neural networks that use Center loss as the loss function, that if the training samples contain samples with larger noise or weaker discrimination…



Examples


Example Embodiment

[0023] Example one

[0024] This embodiment discloses a deep neural network training method. As shown in Figure 1, the method includes step 110 and step 120.

[0025] Step 110: Obtain a number of training samples labeled with preset category labels.

[0026] Before training the neural network, it is necessary to obtain several training samples with preset category labels.

[0027] The form of the training samples differs according to the specific application scenario. For example, in a work clothes recognition application, the training samples are work clothes images; in a live face detection application scenario, the training samples are live face images and non-live face images (such as face models and face photos) collected by an image collection device; in a voice recognition application scenario, the training sample is a piece of audio.

[0028] Depending on the output of the specific recognition task, the category labels of the training samples are different. Taking a neural n...

Example Embodiment

[0044] Example two

[0045] Based on the first embodiment, this embodiment discloses an optimized scheme for the deep neural network training method.

[0046] In specific implementation, after obtaining several training samples labeled with preset category labels, the neural network is first constructed. In this embodiment, ResNet50 (a residual network) is still used as the basic network to construct the neural network, and the neural network includes multiple feature extraction layers. In the forward propagation stage, the neural network calls the forward function of each feature extraction layer (such as the fully connected layer) in turn to obtain the layer-by-layer outputs. The output of the last layer is compared with the target to obtain the loss and compute the error update values. Backpropagation then proceeds layer by layer back to the first layer, and all weight values are updated together at the end of backpropagation. The last feature extraction layer uses the…
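The forward-then-backward procedure described above can be sketched in miniature. This is not the patent's ResNet50 network; it is a minimal NumPy stand-in with two hypothetical fully connected layers, showing the same control flow: each layer's forward function is called in turn, the loss is computed at the last layer, gradients propagate back to the first layer, and all weight values are updated together at the end.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class Dense:
    """One fully connected feature extraction layer with forward/backward."""
    def __init__(self, n_in, n_out):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                       # cache the input for backprop
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out    # gradient w.r.t. this layer's weights
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T       # gradient passed to the previous layer

def train_step(layers, x, y_onehot, lr=0.1):
    # Forward propagation: call each layer's forward function in turn.
    out = x
    for layer in layers:
        out = layer.forward(out)
    probs = softmax(out)
    loss = -np.mean(np.sum(y_onehot * np.log(probs + 1e-12), axis=1))
    # Backpropagation: from the last layer back to the first.
    grad = (probs - y_onehot) / len(x)
    for layer in reversed(layers):
        grad = layer.backward(grad)
    # All weight values are updated together at the end of backpropagation.
    for layer in layers:
        layer.W -= lr * layer.dW
        layer.b -= lr * layer.db
    return loss

# Tiny demo on random data: the loss should drop over repeated steps.
x = rng.normal(size=(8, 4))
y = np.eye(3)[rng.integers(0, 3, size=8)]
layers = [Dense(4, 16), Dense(16, 3)]
first = train_step(layers, x, y)
last = first
for _ in range(49):
    last = train_step(layers, x, y)
print(first, last)
```

In a real implementation the layer stack would be the ResNet50 feature extractors named in the text, and a framework's autograd would replace the hand-written backward passes.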

Example Embodiment

[0061] Example three

[0062] The embodiment of the application also discloses a deep neural network training method applied to classification applications. As shown in Figure 3, the method includes steps 310 to 370.

[0063] Step 310: Obtain a number of training samples labeled with preset category labels.

[0064] In specific implementation, the training samples include any one of the following: images, text, and voice. For different objects to be classified, training samples of those objects need to be obtained for the neural network model. In this embodiment, taking the training of a neural network model for work clothes recognition as an example, work clothes images labeled with different platform labels are first obtained, such as work clothes images labeled with the Meituan takeaway platform label, work clothes images labeled with the Ele.me platform label, and work clothes images labeled with the Baidu food delivery platform label.
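Concretely, the labeling step above amounts to mapping each platform to one class index. The sketch below is illustrative only: the platform key names and the `label_samples` helper are hypothetical, not taken from the patent.

```python
# Hypothetical label setup for the work-clothes example: each delivery
# platform name becomes one integer class index for the classifier.
PLATFORMS = ["meituan_waimai", "eleme", "baidu_waimai"]
LABEL_INDEX = {name: i for i, name in enumerate(PLATFORMS)}

def label_samples(paths_by_platform):
    """Flatten {platform: [image paths]} into (path, class_index) pairs."""
    samples = []
    for platform, paths in paths_by_platform.items():
        for path in paths:
            samples.append((path, LABEL_INDEX[platform]))
    return samples

print(label_samples({"meituan_waimai": ["a.jpg"], "eleme": ["b.jpg"]}))
```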

[00...



Abstract

The invention discloses a deep neural network training method, belongs to the technical field of computers, and is used to solve the prior-art problem of the relatively low performance of trained neural networks in complex scenes. The method comprises the following steps: obtaining a plurality of training samples labeled with preset category labels, and training a neural network model based on the plurality of training samples, wherein the loss function of the neural network model performs a weighting operation according to a first weight value in direct proportion to the distinguishing difficulty of each training sample, and thereby determines the loss value of the neural network model. The deep neural network training method disclosed by the embodiment of the invention adaptively increases the importance of the training samples that are more difficult to distinguish, so that such samples are prevented from being misclassified by the trained neural network, and the performance of the neural network is improved.
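The weighting idea in the abstract can be sketched as follows. The excerpt does not state the exact formula for the first weight value, so this sketch uses a hypothetical, focal-loss-style difficulty proxy, `1 - p_true` (the probability mass the model fails to assign to the true class), which is in direct proportion to how hard the sample currently is:

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def difficulty_weighted_loss(logits, labels):
    """Cross-entropy in which each sample's term is scaled by a weight
    proportional to its distinguishing difficulty.

    The (1 - p_true) difficulty proxy is an assumption of this sketch,
    not the patent's formula.
    """
    probs = softmax(logits)
    p_true = probs[np.arange(len(labels)), labels]
    weights = 1.0 - p_true                    # hard samples get larger weights
    per_sample = weights * -np.log(p_true + 1e-12)
    return per_sample.mean()

# An easy (confidently correct) sample contributes far less to the loss
# than a hard (barely separated) one, so training focuses on hard samples.
easy = difficulty_weighted_loss(np.array([[8.0, 0.0]]), np.array([0]))
hard = difficulty_weighted_loss(np.array([[0.2, 0.0]]), np.array([0]))
print(easy, hard)
```

Compared with plain cross-entropy, the extra factor suppresses the contribution of already-well-classified samples, which matches the abstract's claim that hard-to-distinguish samples are adaptively given more importance.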

Description

Technical field

[0001] The present application relates to the field of computer technology, and in particular to a deep neural network training method, device, electronic equipment, and storage medium.

Background technique

[0002] In recent years, deep learning has made remarkable progress in the field of pattern recognition; its key factors include rich and flexible network models, strong computing power, and better adaptability to big data processing. As neural networks are applied to different tasks, the improvement of neural network models is a key issue studied by those skilled in the art. Improvements to neural network models in the prior art mainly focus on two aspects: the network structure and the loss function. Among them, the loss functions commonly used in classification model training are mainly Softmax loss and Center loss, an improvement on Softmax presented at the top international conference ECCV in 2016. However, the applicant found through research on…

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
Inventor: 柴振华, 孟欢欢
Owner BEIJING SANKUAI ONLINE TECH CO LTD