
CNN model, training method thereof, terminal and computer readable storage medium

A training method and model technology in the field of CNN model training. It addresses the problems of reduced model training performance and high economic cost, while ensuring the effect of batch normalization.

Pending Publication Date: 2019-12-20
INSPUR ARTIFICIAL INTELLIGENCE RES INST CO LTD SHANDONG CHINA

AI Technical Summary

Problems solved by technology

When storage resources cannot accommodate large-batch training, either the batch size must be reduced or more hardware must be purchased: the former leads to a decrease in model training performance, while the latter requires a higher economic cost.



Examples


Embodiment 1

[0051] A CNN model training method of the present invention is applied to a CNN model with a linear structure that contains batch normalization layers.

[0052] The training method performs model segmentation and data batching on the CNN model according to the size of the training batch and the computer's storage resources.

[0053] The model segmentation is as follows: a synchronization layer is inserted between each batch normalization layer and the network layer immediately preceding it. The synchronization layer temporarily stores the output of that preceding layer, and then triggers the adjacent batch normalization layer's batch operation on the stored data. All network layers between the input layer and its adjacent synchronization layer, and all network layers between any two adjacent synchronization layers, each form a single network layer unit, as sketched below.
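
A minimal PyTorch-style sketch of this segmentation, assuming the model is given as a plain list of layers; SyncLayer and segment_model are illustrative names, not from the patent:

```python
# Illustrative sketch of model segmentation (names are hypothetical).
# A SyncLayer is inserted immediately before every batch normalization
# layer; the layers between consecutive sync points form one unit.
import torch
import torch.nn as nn

class SyncLayer(nn.Module):
    """Temporarily stores the preceding layer's outputs for all
    sub-batches, then releases them as one full batch to the BN layer."""
    def __init__(self):
        super().__init__()
        self.buffer = []

    def store(self, x):
        # Buffer one sub-batch (forward pass only; detached for brevity).
        self.buffer.append(x.detach())

    def release(self):
        # Reassemble the full batch so the following BN layer computes
        # its statistics over all samples, then clear the buffer.
        full = torch.cat(self.buffer, dim=0)
        self.buffer = []
        return full

def segment_model(layers):
    """Split a linear layer stack into units of the form
    (bn_at_head_or_None, body, sync_at_tail_or_None)."""
    units, bn, body = [], None, []
    for layer in layers:
        if isinstance(layer, (nn.BatchNorm1d, nn.BatchNorm2d)):
            units.append((bn, nn.Sequential(*body), SyncLayer()))
            bn, body = layer, []        # the BN layer heads the next unit
        else:
            body.append(layer)
    units.append((bn, nn.Sequential(*body), None))  # trailing unit
    return units
```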

[0054] Data batching is as follows: for all the above network layer un...
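
Since paragraph [0054] is truncated in this record, the exact batching procedure is an assumption; the sketch below reconstructs it from the abstract's description of forward-propagating the network layer units section by section in a data-batch mode, reusing SyncLayer and segment_model from the sketch above:

```python
# Assumed data-batching scheme (forward pass only): the batch is split
# into sub-batches that fit in memory; each unit's body runs sub-batch
# by sub-batch, while each BN layer runs on the reassembled full batch
# so its statistics are computed over all samples.
def forward_by_segments(units, batch, sub_batch_size):
    x = batch
    for bn, body, sync in units:
        if bn is not None:
            x = bn(x)                    # full-batch BN statistics
        if sync is None:
            return body(x)               # final unit: no sync layer follows
        for sub in x.split(sub_batch_size, dim=0):
            sync.store(body(sub))        # section-by-section forward
        x = sync.release()               # full batch restored for the next BN
    return x
```

Note that this only sketches the forward pass; backpropagation through the buffered activations is outside the scope of this sketch.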

Embodiment 2

[0085] In the second aspect, a CNN model of the present invention has a linear structure and is a model obtained by training with the CNN model training method disclosed in Embodiment 1.

[0086] In this embodiment, the CNN model includes an input layer, convolutional layers, fully connected layers, activation layers, batch normalization layers, and synchronization layers.

[0087] The input layer is used to input training samples. The convolutional layers and fully connected layers perform convolution calculations to extract features, and together they number N. The activation layers are stacked in order with the convolutional and fully connected layers. There are N batch normalization layers in total, and each batch normalization layer immediately follows its corresponding convolutional or fully connected layer. There are N synchronization layers in one-to-one corresponden...
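
A concrete layer stack matching this description (illustrative sizes, assuming 32x32 RGB inputs), which segment_model and forward_by_segments from Embodiment 1 would handle as follows:

```python
# Illustrative Embodiment 2 stack: N = 3 conv/FC layers, each followed
# immediately by its own BN layer, with activation layers in between.
layers = [
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.BatchNorm2d(16),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 10),     # assumes 32x32 spatial inputs
    nn.BatchNorm1d(10),
]
units = segment_model(layers)        # 4 units, 3 synchronization layers

# e.g. a full batch of 256 samples forwarded in sub-batches of 32:
out = forward_by_segments(units, torch.randn(256, 3, 32, 32), 32)
```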

Embodiment 3

[0099] A terminal of the present invention includes a processor, an input device, an output device, and a memory, all connected to one another. The memory stores a computer program comprising program instructions, and the processor is configured to call these program instructions to execute the CNN model training method disclosed in Embodiment 1.



Abstract

The invention discloses a CNN model, a training method thereof, a terminal, and a computer-readable storage medium, belonging to the field of CNN model training, and aims to solve the problem of how to process large batches of samples in sub-batches while ensuring the training performance of the model. The method comprises the following steps: inserting a synchronization layer between each batch normalization layer and the network layer immediately preceding it, so that all network layers between the input layer and the adjacent synchronization layer, and all network layers between any two adjacent synchronization layers, each form an independent network layer unit; and performing forward propagation on all the network layer units section by section in a data-batch mode. The CNN model has a linear structure and is a model trained by this training method. The processor in the terminal is configured to call the program instructions to execute the training method, and when the program instructions in the computer-readable storage medium are executed by a processor, they cause the processor to execute the training method.

Description

Technical field

[0001] The invention relates to the field of CNN model training, and in particular to a CNN model and its training method, a terminal, and a computer-readable storage medium.

Background technique

[0002] Batch normalization (Batch Normalization, abbreviated BN) can speed up the convergence of deep neural network models, reduce the sensitivity to parameter initialization, and is scale-invariant, making the model's loss surface smoother. For these reasons this normalization method is widely used in deep models, especially with convolutional layers. The batch normalization operator is generally placed between the convolutional layer (or fully connected layer) and the activation layer, and it transforms the net activation value x output by the previous layer according to the following formula:

[0003] $\hat{x} = \frac{x - \mu}{\sqrt{\sigma}}$

[0004] where μ and σ are the mean and variance, respectively, of the set to which the net activation value x belongs...
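
A quick numeric check of this transform, as a minimal NumPy sketch (not code from the patent):

```python
# Numeric check of the batch normalization transform above:
# x_hat = (x - mu) / sqrt(sigma), with sigma the batch variance.
# (Real implementations add a small epsilon to sigma for stability.)
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # net activations of one feature across a batch
mu, sigma = x.mean(), x.var()        # batch mean and (population) variance
x_hat = (x - mu) / np.sqrt(sigma)
print(x_hat)  # [-1.342 -0.447  0.447  1.342] -> zero mean, unit variance
```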


Application Information

IPC(8): G06N3/08, G06N3/04
CPC: G06N3/08, G06N3/045
Inventors: 高岩 (Gao Yan), 姜凯 (Jiang Kai), 郝虹 (Hao Hong), 于治楼 (Yu Zhilou), 李朋 (Li Peng)
Owner: INSPUR ARTIFICIAL INTELLIGENCE RES INST CO LTD SHANDONG CHINA