Dynamic field self-adaption method and device and computer readable storage medium

An adaptive, domain-based technology applied in the field of convolutional neural networks. It addresses problems that conventional methods find difficult to satisfy, and achieves improved accuracy, improved transfer learning, and faster convergence.

Active Publication Date: 2019-08-16
UNIV OF ELECTRONICS SCI & TECH OF CHINA ZHONGSHAN INST

AI Technical Summary

Problems solved by technology

However, in practical applications, these t...



Examples


Embodiment 1

[0052] Referring to FIG. 1, an embodiment of the present invention provides a dynamic domain adaptation method that includes the following steps:

[0053] S101. Obtain a source domain dataset and a target domain dataset based on the original image data;

[0054] S102. Define the parameters of each level of the convolutional neural network for image recognition based on the source domain data set, and add an adaptive layer before the output layer of the fully connected layer in the convolutional neural network;

[0055] S103. Calculate the adaptive loss L_MMD of the convolutional neural network;

[0056] S104. Adjust the parameters of each level of the convolutional neural network based on the adaptive loss L_MMD to obtain the adjusted convolutional neural network;

[0057] S105. Substitute the target domain data set into the adjusted convolutional neural network to obtain an output result of image data recognition.
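The adaptive loss in S103 is named L_MMD, which suggests a maximum mean discrepancy (MMD) between source-domain and target-domain features at the adaptive layer. As a rough illustration only, and not the patent's exact formulation, the simplest linear-kernel MMD estimate (the squared distance between batch mean features) can be sketched in NumPy:

```python
import numpy as np

def mmd_loss(source_feats, target_feats):
    """Linear-kernel MMD estimate between two feature batches.

    Computes the squared Euclidean distance between the mean feature
    vectors of the source and target batches. Identical distributions
    give a (near-)zero loss; shifted distributions give a positive one.
    """
    mu_s = source_feats.mean(axis=0)
    mu_t = target_feats.mean(axis=0)
    diff = mu_s - mu_t
    return float(diff @ diff)

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(64, 128))   # hypothetical source features
tgt = rng.normal(0.5, 1.0, size=(64, 128))   # hypothetical shifted target features
print(mmd_loss(src, src))                    # 0.0
print(mmd_loss(src, tgt) > mmd_loss(src, src))  # True
```

In practice, multi-kernel (Gaussian) MMD variants are common in domain-adaptation work; the linear version above is only the cheapest instance of the same idea, added as a training penalty so the adapted layer's source and target feature distributions are pulled together.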

[0058] Understandably, for image recognition research, ...

Embodiment 2

[0095] Building on Embodiment 1, Embodiment 2 of the present invention provides a method flow for training the neural network with the backpropagation algorithm. Referring to FIG. 2, the method flow includes the following steps:

[0096] S201. Initialize the training times variable to 0;

[0097] S202. Select a part of training data in the source domain data set, that is, a batch (Batch);

[0098] S203. Obtain an output prediction value through forward propagation;

[0099] S204. Calculate the loss, and update various parameters of the neural network through the backpropagation algorithm;

[0100] S205. Judge whether the training expectation is met; if it is met, jump to S208; if not, jump to S206;

[0101] S206. Judge whether the set number of training times has been reached; if it has, jump to S208; if not, jump to S207;

[0102] S207. Add 1 to the number of training times and jump to S202;

[0103] ...
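The steps S201 to S207 above can be sketched as a generic training loop. The callables `forward`, `backward`, and `expectation_met` are placeholders for the network's forward pass, loss-plus-backprop update, and early-stopping test; they are illustrative names, not APIs from the patent:

```python
import random

def train(batches, forward, backward, expectation_met, max_count):
    """Backpropagation training loop following steps S201-S207.

    forward(batch)        -> prediction          (S203)
    backward(batch, pred) -> loss, after update  (S204)
    expectation_met(loss) -> True to stop early  (S205)
    """
    count = 0                               # S201: initialise training count
    while True:
        batch = random.choice(batches)      # S202: select a batch
        pred = forward(batch)               # S203: forward propagation
        loss = backward(batch, pred)        # S204: compute loss, update params
        if expectation_met(loss):           # S205: expectation met -> S208
            break
        if count >= max_count:              # S206: training budget spent -> S208
            break
        count += 1                          # S207: increment, back to S202
    return count                            # S208: training finished

# Toy run with a scripted loss schedule: stop once loss drops below 1.
losses = iter([5.0, 4.0, 3.0, 0.5])
steps = train([None], lambda b: None, lambda b, p: next(losses), lambda l: l < 1.0, 10)
print(steps)  # 3
```

Note the importance of the corrected S205 jump target: looping back to S201 would reset the counter and could never trigger the S206 budget check, so the early-exit test must fall through to S206 instead.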

Embodiment 3

[0106] Referring to FIG. 3, which shows the specific hardware structure of a dynamic domain adaptation device provided by Embodiment 3 of the present invention, the dynamic domain adaptation device 3 may include a memory 32 and a processor 33; the components are coupled together through a communication bus 31. Understandably, the communication bus 31 is used to realize connection and communication between these components. In addition to a data bus, the communication bus 31 also includes a power bus, a control bus, and a status signal bus. For clarity, however, the various buses are all denoted as the communication bus 31 in FIG. 3.

[0107] The memory 32 is used to store the dynamic domain adaptive method program capable of running on the processor 33;

[0108] The processor 33 is configured to perform the following steps when running the dynamic domain adaptive method program:

[0109] Obtain a source domain dataset and a target domain dataset based on the original image data;

...



Abstract

The invention discloses a dynamic field self-adaption method and device and a computer readable storage medium. The method comprises the following steps: acquiring a source domain data set and a target domain data set based on original image data; defining each level parameter of a convolutional neural network for image recognition based on the source domain data set, and adding an adaptive layer in front of the output layer of the fully connected layer in the convolutional neural network; calculating the adaptive loss L_MMD of the convolutional neural network; adjusting each level parameter of the convolutional neural network based on the adaptive loss L_MMD to obtain an adjusted convolutional neural network; and substituting the target domain data set into the adjusted convolutional neural network to obtain an output result of image data recognition. The invention also discloses a dynamic field self-adaption device and a computer readable storage medium.

Description

technical field

[0001] The present invention relates to the technical field of convolutional neural networks, and in particular to a dynamic domain adaptation method, device, and computer-readable storage medium.

Background technique

[0002] A Convolutional Neural Network (CNN) is a type of feed-forward neural network that includes convolution computations and has a deep structure; it is one of the core algorithms in the field of deep learning. A convolutional neural network mainly relies on the following ideas: (1) Local perception. Taking an image as an example, spatially close pixels are strongly correlated while distant pixels are weakly correlated; therefore, each neuron only needs to perceive a local region and does not need to perceive the entire image. (2) Weight sharing. The weights and biases of the same convolution kernel in a convolutional neural network are shared across all spatial positions....
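Weight sharing is why a convolutional layer's parameter count depends only on the kernel shape and channel counts, not on the input image size. A small illustrative calculation (the helper below is hypothetical, not part of the patent):

```python
def conv_param_count(in_channels, out_channels, kernel_h, kernel_w):
    """Parameters of a conv layer: one shared kernel per output channel,
    spanning all input channels, plus one bias per output channel."""
    weights = out_channels * in_channels * kernel_h * kernel_w
    biases = out_channels
    return weights + biases

# 16 filters of 3x3 kernels over RGB input: 16*3*3*3 + 16 = 448 parameters,
# whether the image is 32x32 or 1024x1024, because the same kernel weights
# slide over every spatial position (weight sharing).
print(conv_param_count(3, 16, 3, 3))  # 448
```

A fully connected layer over the same 1024x1024 RGB input would need billions of weights per neuron row, which is the contrast that motivates local perception and weight sharing.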


Application Information

IPC(8): G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/084, G06N3/044, G06N3/045, G06F18/217, Y02T10/40
Inventors: 刘贵松, 解修蕊, 杨泽衡, 张绍楷, 占求港
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA ZHONGSHAN INST