Neural network model training method, electronic device and storage medium

A neural network model training technology, applied in the field of neural network model training methods, electronic devices and storage media. It addresses the problem that existing model compression affects calculation speed and calculation accuracy, and achieves the effect of ensuring both calculation speed and calculation accuracy.

Publication Date: 2018-05-08 (Inactive)
Owner: 深圳市深网视界科技有限公司

AI Technical Summary

Problems solved by technology

[0005] In order to overcome the deficiencies of the prior art, one object of the present invention is to provide a neural network model training method, so as to solve the problem that existing model compression technology affects calculation speed and calculation accuracy.
[0006] A second object of the present invention is to provide an electronic device, so as to solve the problem that existing model compression technology affects calculation speed and calculation accuracy.

Embodiment Construction

[0035] The present invention is further described below in conjunction with the accompanying drawings and specific embodiments. It should be noted that, provided there is no conflict, the embodiments described below and their technical features can be combined arbitrarily to form new embodiments.

[0036] As shown in Figure 1, the neural network model training method provided by Embodiment 1 of the present invention includes:

[0037] Step S101: Obtain a first image feature and a second image feature, where the first image feature is the image feature output when picture A is passed through the trained model, and the second image feature is the image feature output when picture A is passed through the model to be trained.
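A minimal sketch of this step, assuming a PyTorch-style implementation in which each model returns its image feature when called; the function and variable names below are illustrative, not taken from the patent.

```python
import torch

# Hypothetical sketch of Step S101: picture A is passed through both the
# trained model (kept fixed) and the model to be trained.
def get_image_features(trained_model, model_to_train, picture_a):
    trained_model.eval()
    with torch.no_grad():                               # the trained model is not updated
        first_image_feature = trained_model(picture_a)  # first image feature
    second_image_feature = model_to_train(picture_a)    # second image feature (gradients flow here)
    return first_image_feature, second_image_feature
```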

[0038] Specifically, as shown in Figure 2, the trained model is based on a convolutional neural network. The structure of the model includes a backbone network, a feature comparison layer, and a classification...
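The sketch below illustrates one possible arrangement of the structure just described (backbone network, feature layer, classification layer); the class name, dimensions and layer choices are assumptions for illustration only.

```python
import torch.nn as nn

# Illustrative structure: backbone network -> feature layer -> classification layer.
class FeatureClassifierModel(nn.Module):
    def __init__(self, backbone, backbone_out_dim, feature_dim, num_classes):
        super().__init__()
        self.backbone = backbone                                          # convolutional backbone network
        self.feature_layer = nn.Linear(backbone_out_dim, feature_dim)     # produces the image feature
        self.classification_layer = nn.Linear(feature_dim, num_classes)   # produces classification scores

    def forward(self, x):
        feature = self.feature_layer(self.backbone(x).flatten(1))
        logits = self.classification_layer(feature)
        return feature, logits
```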


Abstract

The invention discloses a neural network model training method. The method comprises: obtaining a first image feature and a second image feature, wherein the first image feature is the image feature output by a picture A through a trained model, and the second image feature is the image feature output by the picture A through a model to be trained; obtaining a classification probability, which is the classification probability output by the second image feature through a classification layer of the model to be trained; and updating the parameters of the model to be trained according to the first image feature, the second image feature and the classification probability, until the second image feature fits the first image feature. The invention also discloses an electronic device and a storage medium. With the neural network model training method, electronic device and storage medium provided by the invention, the parameters of the model to be trained are updated according to the first image feature, the second image feature and the classification probability, so that a small model with the same accuracy as the trained large model is obtained through training, and both calculation speed and calculation accuracy are ensured.
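A minimal sketch of one parameter-update step consistent with the abstract, assuming both models follow the two-output structure sketched earlier. The concrete loss forms (MSE for feature fitting, cross-entropy on the classification output, whose softmax gives the classification probability) and the weight `alpha` are assumptions, not specified by the patent text.

```python
import torch
import torch.nn.functional as F

# Illustrative update of the model to be trained: a feature-fitting term pulls the
# second image feature toward the first, and a classification term supervises the
# classification probability.
def training_step(trained_model, model_to_train, optimizer, picture_a, label, alpha=0.5):
    trained_model.eval()
    with torch.no_grad():
        first_feature, _ = trained_model(picture_a)         # first image feature

    second_feature, logits = model_to_train(picture_a)      # second image feature + class scores
    feature_loss = F.mse_loss(second_feature, first_feature)
    class_loss = F.cross_entropy(logits, label)             # softmax of logits = classification probability
    loss = alpha * feature_loss + (1.0 - alpha) * class_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Repeating this step until the feature loss stops decreasing corresponds to training "until the second image feature and the first image feature are fitted".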

Description

Technical Field

[0001] The invention relates to the field of model training, and in particular to a neural network model training method, an electronic device and a storage medium.

Background Technique

[0002] A convolutional neural network includes multiple convolutional layers, activation function layers and downsampling layers. Its local sampling and weight sharing characteristics make it possible to obtain translation-invariant features. The extracted features far exceed traditional hand-designed features, which greatly improves the accuracy of machine vision tasks such as recognition, detection and segmentation; compared with traditional neural networks, it also has fewer parameters and is easy to train.

[0003] With the development of technology, the trend of convolutional neural networks is that the deeper the model, the higher the accuracy. The improvement in accuracy makes many visual tasks practical, but it also brings a huge amount of calculation. The deeper the model, th...
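For reference, a minimal sketch of the kind of backbone composition described in [0002], stacking convolutional, activation-function and downsampling layers; the exact architecture is illustrative only.

```python
import torch.nn as nn

# Minimal convolutional backbone: convolution -> activation -> downsampling, repeated.
backbone = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),  # convolutional layer
    nn.ReLU(),                                   # activation function layer
    nn.MaxPool2d(2),                             # downsampling layer
    nn.Conv2d(32, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
)
```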


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08; G06K9/62
CPC: G06N3/084; G06F18/214; G06F18/241
Inventors: 徐鹏飞, 赵瑞
Owner: 深圳市深网视界科技有限公司