
Accelerated operation method and server for convolutional neural network, and storage medium

A convolutional neural network acceleration technology, applied to an accelerated computing method, server, and storage medium for convolutional neural networks. It addresses the problems that convolutional neural networks involve a large amount of computation and that hardware resources are difficult to exploit fully for acceleration, with the effect of reducing interaction time and improving computing speed and resource utilization efficiency.

Active Publication Date: 2017-12-08
深圳市自行科技有限公司

AI Technical Summary

Problems solved by technology

[0006] The main purpose of the present invention is to propose an accelerated computing method, server, and storage medium for a convolutional neural network, aiming to solve the technical problem in the prior art that convolutional neural networks involve a large amount of computation and that it is difficult to make full use of hardware resources to accelerate the computation.




Embodiment Construction

[0048] It should be understood that the specific embodiments described herein are only intended to explain the present invention, not to limit it.

[0049] The solution of the embodiment of the present invention is mainly to obtain the map to be split, perform convolution and pooling operations on it, split the processed map into a preset number of sub-maps, and obtain the position information of each sub-map. A cross-layer operation is then performed independently on each sub-map: the intermediate result of each layer's operation is stored in the internal memory and extracted from the internal memory to participate in the next layer's operation, until the final operation result for that sub-map is obtained. The final operation results are spliced together according to the position information, and the splicing result is used in subsequent network operations. The technical solution thus makes full use of the internal memory, reuses resources, shortens the interaction time with the external memory, and greatly improves the operation speed and resource utilization efficiency of the CNN, so that the CNN can run efficiently and at high speed on an embedded terminal.
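
For illustration only, the following Python sketch (not the patented implementation) shows the general idea described in [0049]: a feature map is split into sub-maps whose positions are recorded, several layers are run on each sub-map while the per-layer intermediates stay in a local buffer standing in for the internal memory, and the sub-map outputs are spliced back together by position. The helper names (split_into_tiles, run_layers_on_tile, splice_by_position) and the toy element-wise "layers" are assumptions made for the sketch.

import numpy as np

def split_into_tiles(fmap, tiles_h, tiles_w):
    # Split an (H, W) feature map into tiles_h x tiles_w sub-maps and record
    # the (row, col) offset of each one so the results can be spliced later.
    H, W = fmap.shape
    h_step, w_step = H // tiles_h, W // tiles_w
    tiles = []
    for i in range(tiles_h):
        for j in range(tiles_w):
            r0, c0 = i * h_step, j * w_step
            tiles.append((fmap[r0:r0 + h_step, c0:c0 + w_step], (r0, c0)))
    return tiles

def run_layers_on_tile(tile, layers):
    # Cross-layer operation on one sub-map: every intermediate result stays in
    # a local buffer (standing in for internal memory) and feeds the next layer.
    internal_buffer = tile
    for layer in layers:
        internal_buffer = layer(internal_buffer)
    return internal_buffer

def splice_by_position(tile_results, out_shape):
    # Reassemble the per-tile outputs into one map using the recorded positions.
    out = np.zeros(out_shape, dtype=tile_results[0][0].dtype)
    for result, (r0, c0) in tile_results:
        h, w = result.shape
        out[r0:r0 + h, c0:c0 + w] = result
    return out

# Toy element-wise "layers" that preserve spatial size, so tile positions
# carry over unchanged.
layers = [lambda x: np.maximum(x, 0.0), lambda x: x * 0.5]

fmap = np.random.rand(8, 8).astype(np.float32)
tile_results = [(run_layers_on_tile(tile, layers), pos)
                for tile, pos in split_into_tiles(fmap, 2, 2)]
spliced = splice_by_position(tile_results, fmap.shape)
assert np.allclose(spliced, layers[1](layers[0](fmap)))

Real convolution and pooling layers change the spatial size and require overlapping tile borders, which the sketch deliberately ignores; it only illustrates the split / cross-layer / splice flow.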



Abstract

The invention discloses an accelerated operation method and server for a convolutional neural network, and a storage medium. A map to be split is divided into a preset quantity of sub-maps, and the position information of each sub-map is obtained; each sub-map is independently subjected to a cross-layer operation, in which the intermediate result of each layer's operation is stored in an internal memory and then extracted from the internal memory to participate in the next layer's operation. The internal memory is thereby fully utilized, resources are reused, and the interaction time with the external memory is shortened, so the operation speed and resource use efficiency of the CNN (Convolutional Neural Network) are greatly improved and the CNN can operate efficiently and at high speed in an embedded terminal.
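
For intuition about why keeping intermediate results in internal memory helps, here is a back-of-the-envelope comparison in Python with hypothetical sizes (the numbers are illustrative assumptions, not taken from the patent):

# External-memory traffic of a 3-layer stage on a 56x56x64 float32 feature
# map, assuming every intermediate result has the same size (hypothetical).
fmap_bytes = 56 * 56 * 64 * 4          # bytes in one feature map
io_bytes = 2 * fmap_bytes              # read the stage input, write its output
intermediates = 2                      # maps produced between the 3 layers

# Layer-by-layer: each intermediate is written to and read back from
# external memory.
layerwise = io_bytes + 2 * intermediates * fmap_bytes

# Cross-layer (tiled): intermediates stay in internal memory, so external
# memory only sees the stage input and output.
crosslayer = io_bytes

print(f"layer-by-layer: {layerwise / 1e6:.1f} MB, cross-layer: {crosslayer / 1e6:.1f} MB")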

Description

Technical field

[0001] The invention relates to the field of computer applications, and in particular to an accelerated operation method, server and storage medium for a convolutional neural network.

Background technique

[0002] Machine Learning (ML) and Artificial Intelligence (AI) study how to make computers perform tasks that would otherwise require human intelligence, and provide the basic theory, methods and techniques for using computer software and hardware to simulate certain human behaviors and thought processes. AI is a multidisciplinary science at the intersection of the natural and social sciences, drawing on philosophy, cognitive science, mathematics, neurophysiology, psychology, computer science, information science, cybernetics and many other disciplines. Theoretical research and applied technology in the AI field therefore have very high barriers, involving issues such as multidisciplinary collaboration and implementation difficulty.

[0003] In re...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/04
CPCG06N3/045
Inventor 谌璟孙庆新
Owner 深圳市自行科技有限公司