Miniature neural network model

A miniature neural network model, applied in the field of convolution groups, that addresses the prior-art problems of insufficient convolution depth, too many convolution kernels, and too many output feature map channels, with the effect of reducing hardware computing power and memory requirements.

Pending Publication Date: 2020-11-03
NORTH CHINA ELECTRIC POWER UNIV (BAODING)

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to address the shortcomings of the prior art, namely insufficient convolution depth and an excessive number of convolution kernels, which result in too many output feature map channels, and to propose a miniature neural network model.



Examples


Embodiment 1

[0025] S1. The Core convolution kernels perform preliminary extraction and generate multiple feature maps. The Core convolution group in S1 contains 3×3 convolution kernels; compared with a normal convolution group, both the number of 3×3 convolution kernels in the Core convolution group and the number of channels of the input feature map seen by each 3×3 convolution kernel are reduced. Reducing the number of 3×3 convolution kernels and the number of input feature map channels effectively avoids the traditional problems of insufficient convolution depth and too many convolution kernels, which lead to too many output feature map channels and a large number of parameters, thereby lowering the hardware computing power and memory requirements. The Core convolution group consists of 10 Core convolution kernels in total, respectively Core4...
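The embodiment does not spell out the exact layer configuration, but the idea of a reduced 3×3 convolution group can be sketched as follows. This is a minimal PyTorch sketch; the class name CoreConvGroup, the 1×1 channel-reduction layer, and all channel counts are illustrative assumptions rather than the patented configuration.

```python
import torch
import torch.nn as nn

class CoreConvGroup(nn.Module):
    """Sketch of a 'Core' convolution group: a 1x1 reduction keeps the channel
    count seen by the 3x3 kernels small, and the number of 3x3 kernels is kept
    below that of a normal VGG-style block. All sizes are illustrative."""
    def __init__(self, in_channels, reduced_channels, num_3x3_kernels):
        super().__init__()
        # 1x1 convolution shrinks the channel count fed to the 3x3 kernels
        self.reduce = nn.Conv2d(in_channels, reduced_channels, kernel_size=1)
        # a small number of 3x3 kernels keeps the output channel count low
        self.core = nn.Conv2d(reduced_channels, num_3x3_kernels,
                              kernel_size=3, padding=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.act(self.reduce(x))
        return self.act(self.core(x))

# Example: 256 input channels reduced to 32 before ten 3x3 kernels are applied
if __name__ == "__main__":
    block = CoreConvGroup(in_channels=256, reduced_channels=32, num_3x3_kernels=10)
    feature_maps = block(torch.randn(1, 256, 38, 38))
    print(feature_maps.shape)  # torch.Size([1, 10, 38, 38])
```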

Embodiment 2

[0031] S1. The Core convolution kernels perform preliminary extraction and generate multiple feature maps. The Core convolution group in S1 contains 3×3 convolution kernels; compared with a normal convolution group, both the number of 3×3 convolution kernels in the Core convolution group and the number of channels of the input feature map seen by each 3×3 convolution kernel are reduced. Reducing the number of 3×3 convolution kernels and the number of input feature map channels effectively avoids the traditional problems of insufficient convolution depth and too many convolution kernels, which lead to too many output feature map channels and a large number of parameters, thereby lowering the hardware computing power and memory requirements. The Core convolution group consists of 10 Core convolution kernels in total, respectively Core8...

Embodiment 3

[0037] S1. The Core convolution kernels perform preliminary extraction and generate multiple feature maps. The Core convolution group in S1 contains 3×3 convolution kernels; compared with a normal convolution group, both the number of 3×3 convolution kernels in the Core convolution group and the number of channels of the input feature map seen by each 3×3 convolution kernel are reduced. Reducing the number of 3×3 convolution kernels and the number of input feature map channels effectively avoids the traditional problems of insufficient convolution depth and too many convolution kernels, which lead to too many output feature map channels and a large number of parameters, thereby lowering the hardware computing power and memory requirements. The Core convolution group consists of 10 Core convolution kernels in total, respectively Core9...



Abstract

The invention discloses a miniature neural network model comprising a highly optimized Core convolution kernel sub-network and an SSD-based auxiliary feature extraction sub-network. The Core convolution kernel sub-network operates as follows: S1, the Core convolution kernels perform preliminary extraction to generate a plurality of feature maps; S2, specific feature maps are selected and transmitted to the auxiliary feature extraction sub-network for prediction; S3, the model is minimized while the target detection precision is guaranteed, with the Core convolution groups in S1 comprising 3×3 convolution kernels whose number is smaller than in normal convolution groups. Because the Core convolution group is composed of low-dimensional convolution kernels, the number of neural network computation parameters is greatly compressed; this facilitates deployment of the neural network model on embedded edge computing devices, and the smaller Core convolution group network structure achieves more efficient feature extraction.
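The two-sub-network structure and steps S1 to S3 in the abstract can be illustrated with a high-level sketch. This is an assumption-laden PyTorch outline, not the patented architecture: the class name MiniatureDetector, the number of stages, all channel counts, the anchor count, and the choice of which feature maps feed the SSD-style heads are hypothetical.

```python
import torch
import torch.nn as nn

class MiniatureDetector(nn.Module):
    """Sketch of the two sub-networks from the abstract: a small Core
    convolution backbone (S1) whose selected feature maps (S2) are passed to
    SSD-style auxiliary prediction heads (S3 keeps the model small while
    preserving detection precision). Layer sizes are illustrative only."""
    def __init__(self, num_classes=21):
        super().__init__()
        # S1: backbone built from small 3x3 convolution groups
        self.stage1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(True))
        self.stage2 = nn.Sequential(nn.MaxPool2d(2),
                                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(True))
        # SSD-style auxiliary extraction layer producing an extra, coarser map
        self.extra = nn.Sequential(nn.MaxPool2d(2),
                                   nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(True))
        # one predictor per selected feature map (4 box offsets + class scores per anchor)
        anchors = 4
        self.heads = nn.ModuleList([
            nn.Conv2d(c, anchors * (4 + num_classes), 3, padding=1)
            for c in (32, 64)
        ])

    def forward(self, x):
        f1 = self.stage2(self.stage1(x))   # S1/S2: selected backbone feature map
        f2 = self.extra(f1)                # auxiliary feature map
        return [head(f) for head, f in zip(self.heads, (f1, f2))]

# Example forward pass on a 300x300 image (a common SSD input size)
out = MiniatureDetector()(torch.randn(1, 3, 300, 300))
print([o.shape for o in out])
```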

Description

Technical field

[0001] The invention relates to the technical field of convolution groups, and in particular to a miniature neural network model.

Background technique

[0002] Group convolution divides the input feature map into several groups along the channel direction, convolves each group separately, and then concatenates the results, which reduces the number of parameters and improves the operation speed (a rough parameter-count comparison is sketched after this description).

[0003] Most traditional target detection algorithms use VGG16 or similar models for low-level feature extraction. Such models mostly use 3×3 convolution kernels as filters for the feature maps, which has two disadvantages: on the one hand, the convolution depth is insufficient; on the other hand, too many convolution kernels lead to too many channels in the output feature map, resulting in a large number of parameters and high demands on hardware computing power and memory.

Contents of the invention

[0004] The purpose of the present invention is to sol...
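Paragraph [0002] defines group convolution as splitting the input feature map into groups along the channel direction, convolving each group, and concatenating the results. The following sketch, using arbitrary example channel sizes not taken from the patent, shows roughly how much this cuts the parameter count of a 3×3 layer:

```python
import torch.nn as nn

def n_params(m):
    # total number of learnable weights and biases in a module
    return sum(p.numel() for p in m.parameters())

# Standard 3x3 convolution: every kernel sees all 256 input channels
standard = nn.Conv2d(256, 256, kernel_size=3, padding=1)

# Group convolution: the input channels are split into 8 groups along the
# channel axis, each group is convolved separately, and the outputs are
# concatenated, so each kernel only sees 256/8 = 32 channels
grouped = nn.Conv2d(256, 256, kernel_size=3, padding=1, groups=8)

print(n_params(standard))  # 590080  (3*3*256*256 weights + 256 biases)
print(n_params(grouped))   # 73984   (3*3*32*256 weights + 256 biases), about 1/8 as many
```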

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/04G06N3/08
CPCG06N3/08G06N3/045
Inventor 律方成金潮伟王胜辉
Owner NORTH CHINA ELECTRIC POWER UNIV (BAODING)