
Neural network training method, device and equipment and computer readable storage medium

A neural network training technology, applied to neural network training methods, devices, equipment, and computer-readable storage media. It addresses the huge computational cost of existing neural network models: it reduces the amount of network computation, avoids processing large numbers of redundant parameters, and strengthens model interpretability.

Pending Publication Date: 2021-02-19
WEBANK (CHINA)

AI Technical Summary

Problems solved by technology

[0004] The main purpose of the present invention is to provide a neural network training method, device, equipment, and computer-readable storage medium, aiming to solve the technical problem of the huge computational cost of existing neural network models.


Examples


Second embodiment

[0065] Based on the first embodiment, a second embodiment of the neural network training method of the present invention is proposed. In this embodiment, step S103 includes:

[0066] Step S201: determining a first loss value based on the target random parameters, and determining a second loss value based on the quantized parameters;

[0067] Step S202: determining the quantization loss value based on the first loss value and the second loss value.

[0068] In this embodiment, after the quantized parameters are obtained, the first loss value is determined from the target random parameters and the second loss value from the quantized parameters. Specifically, the first loss value is calculated from the input data and the target random parameters, and the second loss value is calculated from the input data and the quantized parameters.
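Steps S201 and S202 can be sketched with a toy single-layer network. Everything below is an illustrative assumption: the patent does not fix the network shape, the quantization scheme, or the particular loss functions, so uniform 8-bit quantization and mean-squared errors are used as placeholders, and the two loss values are combined by simple addition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer network; shapes and targets are illustrative assumptions.
x = rng.normal(size=(32, 16))     # input data
y = rng.normal(size=(32, 4))      # regression targets
w = rng.normal(size=(16, 4))      # target random parameters (from random init)

def quantize(p, num_bits=8):
    """Uniform quantization over the parameter range (one assumed scheme)."""
    lo, hi = p.min(), p.max()
    scale = (hi - lo) / (2 ** num_bits - 1)
    return lo + np.round((p - lo) / scale) * scale

w_q = quantize(w)                 # quantized parameters

# Step S201: first loss from the random parameters, second from the quantized ones.
first_loss = float(np.mean((x @ w - y) ** 2))
second_loss = float(np.mean((x @ w_q - y) ** 2))

# Step S202: combine the two into the quantization loss (a plain sum is assumed).
quant_loss = first_loss + second_loss
```

Because both component losses are computed on the same input data, the gap between them also indicates how much accuracy the quantization step costs.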

[0069] Specifically, in an embodiment, the step S201 includes:

[0070] Step b, determining a first los...



Abstract

The invention discloses a neural network training method comprising the following steps: randomly initializing a to-be-trained neural network, and obtaining target random parameters of the initialized to-be-trained neural network based on input data; quantizing the target random parameters to obtain quantized parameters; determining a quantization loss value based on the target random parameters and the quantized parameters; and determining a target neural network based on the quantization loss value and the initialized to-be-trained neural network. The invention further discloses a neural network training device and equipment and a computer-readable storage medium. By training the neural network with the quantized parameters, the trained target neural network has high model interpretability; selecting the target random parameters avoids processing the large number of redundant parameters in the network; and the amount of network computation during model training is reduced, so that the neural network can be miniaturized and deployed on small edge devices.
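The overall flow in the abstract — random initialization, quantizing the parameters, and training against a loss computed with the quantized copy — can be sketched as the loop below. The linear model, the uniform quantizer, the mean-squared-error loss, and the straight-through-style update (gradients from the quantized forward pass applied to the full-precision copy) are all assumptions for illustration, not details taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(64, 8))        # input data
y = rng.normal(size=(64, 2))        # toy regression targets
w = 0.1 * rng.normal(size=(8, 2))   # randomly initialized to-be-trained weights

def quantize(p, num_bits=8):
    """Uniform quantization over the parameter range (an assumed scheme)."""
    lo, hi = p.min(), p.max()
    scale = (hi - lo) / (2 ** num_bits - 1)
    return lo + np.round((p - lo) / scale) * scale

def loss(weights):
    return float(np.mean((x @ weights - y) ** 2))

initial_loss = loss(quantize(w))
for _ in range(200):
    w_q = quantize(w)                     # quantized copy of the current parameters
    grad = x.T @ (x @ w_q - y) / len(x)   # gradient through the quantized forward pass
    w -= 0.05 * grad                      # update the full-precision parameters
final_loss = loss(quantize(w))
```

Keeping the forward pass quantized while updating a full-precision copy is the common quantization-aware-training pattern; the resulting network can be deployed with the low-bit weights directly, which matches the abstract's goal of running on small edge devices.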

Description

Technical field

[0001] The present invention relates to the field of neural networks, and in particular to a neural network training method, device, equipment, and computer-readable storage medium.

Background technique

[0002] With the development of artificial intelligence, deep learning has shown great advantages in fields such as image detection and speech recognition, and neural networks are an important class of deep learning algorithms. However, because neural networks contain a large number of redundant parameters, the computational cost of a neural network model is huge, so it cannot be used directly in application scenarios such as embedded devices and other small edge devices.

[0003] The above content is only intended to assist understanding of the technical solution of the present invention, and does not constitute an admission that it is prior art.

Contents of the invention

[0004] The main purpose of the present invention is to provide a neural network trai...

Claims


Application Information

IPC (8): G06N3/08, G06N3/04
CPC: G06N3/04, G06N3/082
Inventors: 张天豫, 范力欣, 吴锦和
Owner WEBANK (CHINA)