
Neural network weight matrix splitting and combining method

A technology relating to weight matrices and neural networks, applied in the field of deep learning; it addresses the time and resource cost of retraining a whole model and achieves the effects of saving training time and simplifying the training steps.

Active Publication Date: 2019-10-22
WUHAN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0014] The purpose of the present invention, namely the technical problem to be solved, is as follows: in existing target detection tasks, updating a trained model requires retraining the entire network, which consumes time and resources. The invention provides a neural network weight matrix splitting and combining method in which only a newly added category needs to be trained and added to the original model, or a previously unneeded category is deleted, to combine into a new model.



Examples


Embodiment 1

[0055] This embodiment takes YOLO v3 (the third version of the YOLO series) as an example to illustrate the neural network weight matrix splitting and combining method of the present invention.

[0056] We take the standard 416*416 YOLO v3 with 80 categories, trained on the COCO training set, as an example; layer numbers below refer to the network's total of 106 layers.

[0057] As shown in Figure 1, a brief description of the YOLO v3 network structure used in the present invention is as follows:

[0058] Input: three-channel color image (416*416*3);

[0059] Output: prediction results at three scales (13*13, 26*26, 52*52), for targets of different sizes respectively.

[0060] Darknet-53: a 53-layer feature extraction network that extracts abstract features from images.

[0061] Among them:

[0062] CBL: convolution layer (Conv) + batch normalization layer (BN) + activation layer (Leaky ReLU); this unit mainly performs the convolution operations.

[0063] Res_unit: Residual...
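For concreteness, the CBL unit described in [0062] can be sketched with the Keras functional API (the same framework used in Embodiment 2 below). This is an illustrative sketch only: the function name, the Leaky ReLU slope, and the padding choice are placeholder assumptions, not values taken from the patent.

    # Illustrative sketch of a CBL unit: Conv + BatchNorm + Leaky ReLU.
    # Slope 0.1 and "same" padding are assumptions, not patent values.
    from tensorflow.keras import layers

    def cbl_block(x, filters, kernel_size, strides=1):
        """Convolution, then batch normalization, then Leaky ReLU activation."""
        x = layers.Conv2D(filters, kernel_size, strides=strides,
                          padding="same", use_bias=False)(x)  # BN supplies the shift
        x = layers.BatchNormalization()(x)
        x = layers.LeakyReLU(alpha=0.1)(x)
        return x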

Embodiment 2

[0104] The Keras high-level deep learning framework is used to modify the weight file, that is, to modify the convolutional layers of the weight matrix model. We take modifying the YOLO v3 weight file as an example, as sketched below.
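As a minimal illustration of this step, the sketch below assumes the YOLO v3 weights have already been converted into a Keras model file (the file name yolov3.h5 is a placeholder; the patent does not name one). It loads the model and locates the three output convolutional layers by their filter count of 3 * (80 + 5) = 255.

    # Sketch: load a converted YOLO v3 Keras model and find the three
    # output convolution layers (255 = 3 anchors * (80 classes + 5)).
    # "yolov3.h5" is an assumed placeholder file name.
    from tensorflow.keras.models import load_model

    model = load_model("yolov3.h5", compile=False)
    output_convs = [l for l in model.layers
                    if l.__class__.__name__ == "Conv2D"
                    and l.filters == 3 * (80 + 5)]
    for l in output_convs:
        print(l.name, l.output_shape)  # expect three layers, one per scale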

[0105] First, determine the types of objects to be detected (the original model contains 80 categories), and train YOLO v3 on collected image data to obtain a well-performing convolutional neural network weight matrix, called the original weight matrix W1, which can be used for the target detection task.

[0106] 1. Existing category 1 is no longer needed, or needs to be updated and reprocessed, so this category must be extracted; that is, the 80 categories are reduced to 79 categories.

[0107] Correspondingly, in Figure 1 the y1, y2, y3 outputs of 13*13*255, 26*26*255, 52*52*255 should be modified to 13*13*252, 26*26*252, 52*52*252, since each scale outputs 3 anchors * (number of categories + 5) channels: 3*(80+5) = 255 becomes 3*(79+5) = 252.

[0108] 1.1 Analysis shows that the original weight matrix W1 has three output convolutional layers (the 3 C layers in Figure 1 ...
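A minimal sketch of this extraction follows, under a stated assumption about channel layout: each output convolutional layer is taken to arrange its channels as 3 anchor blocks of (4 box coordinates + 1 objectness + 80 class scores), the standard YOLO v3 layout. The helper name drop_class and the index drop_idx are illustrative, not from the patent.

    # Sketch: split one category out of an output conv layer's kernel and
    # bias, turning 3*(80+5) = 255 channels into 3*(79+5) = 252.
    # Assumed per-anchor layout: [x, y, w, h, obj, class_0, ..., class_79].
    import numpy as np

    def drop_class(kernel, bias, num_classes=80, num_anchors=3, drop_idx=0):
        """kernel: (kh, kw, c_in, num_anchors*(num_classes+5)); bias: matching 1-D."""
        per_anchor = num_classes + 5
        keep = np.array([a * per_anchor + j
                         for a in range(num_anchors)
                         for j in range(per_anchor)
                         if j != 5 + drop_idx])  # channel 5+drop_idx holds that class's score
        return kernel[..., keep], np.asarray(bias)[keep]

Applied to each of the three output layers, this yields kernels with 252 output channels, matching the modified y1, y2, y3 shapes above.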



Abstract

The invention discloses a neural network weight matrix splitting and combining method. The method is applied to target detection with a one-stage network structure. When performing target detection, first the types of objects to be detected are determined, training is carried out on collected image data, and an original weight matrix is obtained. When one or more of the original N categories are no longer needed, the neural network weight matrix is split: those categories are split off and extracted from the original weight matrix. When one or more categories need to be updated, or new categories need to be added, the relevant categories are extracted from the original weight matrix for independent training and are then combined, added, and merged back through weight matrix combination. The method enables free splitting and merging of target detection models, saves training time, simplifies training steps, offers a relatively high degree of freedom across different scales, and has a certain popularization value.

Description

Technical Field

[0001] The invention belongs to the technical field of deep learning and relates to a method for splitting and combining neural network weight matrices, in particular to a splitting and combining method for the weight matrices of YOLO-series neural networks in image target detection.

Background Technique

[0002] The concept of deep learning originates from research on artificial neural networks; it is a branch of machine learning and an algorithm that uses artificial neural networks as a framework to perform representation learning on data. Deep learning combines low-level features to form more abstract high-level representations of attribute categories or features, thereby discovering distributed feature representations of data. Among related applications, object detection is a computer technology, related to computer vision and image processing, that involves detecting instances of semantic objects of a specific class (such as people, buildings, or cars) in digital images and videos. ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/32; G06K9/62; G06N3/04; G06F16/583
CPC: G06F16/583; G06V10/25; G06N3/045; G06F18/214
Inventors: 邓春华, 刘子威, 林云汉, 朱子奇, 丁胜
Owner: WUHAN UNIV OF SCI & TECH