Convolution implementation method and convolution implementation device of neural network, and terminal equipment

A neural network technology and implementation method, applied in the field of neural networks, which addresses the problems that neural network algorithms are complex and variable, computing resources cannot be fully utilized, and computing efficiency is low.

Active Publication Date: 2020-05-19
SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD

AI Technical Summary

Problems solved by technology

Neural network algorithms are complex and variable, requiring massive computation and massive data access. The current mainstream architecture of neural network processors is a Central Processing Unit (CPU)-like architecture, which faces the "storage wall" problem: the convolutional layers of a neural network must repeatedly read data from memory when performing convolution calculations, so computing resources cannot be fully utilized and computing efficiency is low.
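
As a rough, purely illustrative comparison (the feature-map sizes and layer count below are assumed for illustration, not taken from the patent), the following sketch estimates external-memory traffic for a group of convolution layers when every intermediate feature map is written back and re-read, versus when intermediates stay on chip and only the group's input and final output touch memory:

```python
# Illustrative back-of-the-envelope estimate (assumed sizes, not from the patent):
# a naive layer-by-layer schedule writes and re-reads every intermediate feature
# map, while a blocked/fused schedule keeps intermediates on chip and touches
# external memory only for the group's input and final output.
H, W, C, L = 56, 56, 64, 3           # assumed feature-map size and number of layers in the group
feature_map_bytes = H * W * C * 2    # 16-bit activations, equal size assumed for every layer

naive_traffic = feature_map_bytes * (1 + 2 * (L - 1) + 1)  # read input + write/re-read each intermediate + write output
blocked_traffic = feature_map_bytes * 2                    # read input once + write final output once

print(f"naive:   {naive_traffic / 1e6:.1f} MB")   # ~2.4 MB of external-memory traffic
print(f"blocked: {blocked_traffic / 1e6:.1f} MB") # ~0.8 MB of external-memory traffic
```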




Detailed Description of the Embodiments

[0031] In the following description, specific details such as specific system structures and technologies are presented for the purpose of illustration rather than limitation, so as to thoroughly understand the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.

[0032] It should be understood that, when used in this specification and the appended claims, the term "comprising" indicates the presence of the described features, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or collections thereof.

[0033] It shou...

Abstract

The invention is applicable to the technical field of neural networks, and provides a convolution implementation method and a convolution implementation device of a neural network, terminal equipment, and a computer-readable storage medium. The method comprises the steps of: inputting a to-be-processed image into the neural network; obtaining a to-be-partitioned convolution group from all convolution layers of the neural network; obtaining all input channel data of the first to-be-partitioned convolution layer in the to-be-partitioned convolution group according to the to-be-processed image; partitioning all input channel data of the first to-be-partitioned convolution layer into blocks; obtaining, from all blocks of all input channel data of the first to-be-partitioned convolution layer, the output result of the last to-be-partitioned convolution layer in the to-be-partitioned convolution group, and storing the output result of the last to-be-partitioned convolution layer into a memory; and inputting the output result of the last to-be-partitioned convolution layer into a specified network of the neural network. According to the method and the device, data access to the memory during the convolution calculation can be reduced, so that the convolution calculation efficiency is improved.
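
The sketch below is a minimal, illustrative rendering of the blocked convolution-group idea described in the abstract, not the patented implementation: the input channel data of the first layer in a group is split into blocks, each block is carried through every layer of the group using only a short-lived local buffer (standing in for on-chip storage), and only the last layer's output is written back to "memory". The function and variable names (run_conv_group_blocked, conv1x1, block_h) are hypothetical, and 1x1 kernels are assumed so that blocks need no overlapping halo; larger kernels would require overlapping blocks, which this sketch omits.

```python
import numpy as np

def conv1x1(x, w):
    """Pointwise (1x1) convolution: x is (C_in, H, W), w is (C_out, C_in)."""
    c_in, h, wid = x.shape
    return np.tensordot(w, x.reshape(c_in, -1), axes=1).reshape(w.shape[0], h, wid)

def run_conv_group_blocked(image, weights, block_h=8):
    """Run a group of 1x1 conv layers over `image`, one horizontal block at a time.

    Only the final layer's output blocks are stored in `output` (standing in for
    external memory); intermediate results live in the short-lived `buf`
    (standing in for on-chip storage) and are never written back.
    """
    c_out = weights[-1].shape[0]
    _, h, w = image.shape
    output = np.empty((c_out, h, w), dtype=image.dtype)   # "external memory"
    for top in range(0, h, block_h):                      # partition input rows into blocks
        buf = image[:, top:top + block_h, :]              # one block of all input channels
        for w_k in weights:                               # all layers of the group, kept local
            buf = np.maximum(conv1x1(buf, w_k), 0.0)      # conv + ReLU, result stays in buf
        output[:, top:top + block_h, :] = buf             # store only the last layer's output
    return output

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.standard_normal((3, 32, 32)).astype(np.float32)     # to-be-processed image
    ws = [rng.standard_normal((8, 3)).astype(np.float32),         # layer 1: 3 -> 8 channels
          rng.standard_normal((4, 8)).astype(np.float32)]         # layer 2: 8 -> 4 channels
    out = run_conv_group_blocked(img, ws)
    print(out.shape)  # (4, 32, 32): ready to be fed into the next part of the network
```

In this sketch the choice of `block_h` plays the role of the partition size: a smaller block keeps the working set within the local buffer, while memory is touched once for the group's input and once for the last layer's output, mirroring the reduction in memory access that the abstract attributes to the method.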

Description

Technical Field

[0001] The present application belongs to the technical field of neural networks, and in particular relates to a convolution implementation method and a convolution implementation device of a neural network, terminal equipment, and a computer-readable storage medium.

Background

[0002] A neural network is an operational model consisting of a large number of interconnected nodes (or neurons). Neural network algorithms are complex and variable, requiring massive computation and massive data access. The current mainstream architecture of neural network processors is a Central Processing Unit (CPU)-like architecture, which faces the "storage wall" problem: the convolutional layers of the neural network must repeatedly read data from memory when performing convolution calculations, so computing resources cannot be fully utilized and computing efficiency is low.

Contents of the Invention

[0003] The present application provides a n...


Application Information

IPC(8): G06N3/04
CPC: G06N3/04; G06N3/045
Inventors: 曹庆新, 李炜
Owner: SHENZHEN INTELLIFUSION TECHNOLOGIES CO LTD