
Acceleration method and device of data operation, terminal and readable storage medium

A data and data-group technology, applied in the field of artificial intelligence, which solves the problems that the operation process of a feedforward neural network is slow and affects the normal operation of the terminal, and achieves the effects of making full use of parallel computing capability, improving operation speed, and reducing the number of data reads.

Active Publication Date: 2018-10-19
TENCENT TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0004] The main purpose of the embodiments of the present invention is to provide an acceleration method and device for data operation, a terminal, and a readable storage medium, so as to solve the technical problem in the prior art that the operation process of a feedforward neural network is relatively slow and affects the normal operation of the terminal.



Examples


Embodiment Construction

[0027] In order to make the purpose, features, and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0028] In the prior art, due to the structural characteristics of the CPU itself, when the calculation of each layer of the feedforward neural network is implemented on the CPU, the calculation speed is slow and the CPU occupancy is high.

[0029] In order to solve the above problems, an embodiment of the present invention provides an acceleration method for data operation: the data group to be operated in each storage unit is read, the read data groups are buffered to a GPU, and the GPU is called to perform parallel operation on the buffered data groups, where each storage unit stores a data group formed by the data elements at the same position in each piece of matrix data to be calculated.
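As a rough illustration of the storage-unit layout described above, the sketch below packs K matrices into position-wise data groups on the host and uploads them to GPU memory with the CUDA runtime. The patent does not publish source code or name a GPU API, so the function names (pack_data_groups, upload_to_gpu), the use of CUDA, and the float element type are all assumptions made for illustration.

```cuda
// Minimal host-side sketch, not the patented implementation.
#include <cuda_runtime.h>
#include <vector>

// Pack K matrices (each with N elements, stored row-major) into N "data groups":
// group i holds the element at flat position i from every matrix, so one
// contiguous read supplies all operands for that position.
std::vector<float> pack_data_groups(const std::vector<const float*>& mats, int N) {
    const int K = static_cast<int>(mats.size());
    std::vector<float> groups(static_cast<size_t>(N) * K);
    for (int i = 0; i < N; ++i)          // position inside each matrix
        for (int k = 0; k < K; ++k)      // which matrix it comes from
            groups[static_cast<size_t>(i) * K + k] = mats[k][i];
    return groups;
}

// Copy the packed groups into GPU memory so a kernel can consume them.
float* upload_to_gpu(const std::vector<float>& groups) {
    float* d_groups = nullptr;
    cudaMalloc(&d_groups, groups.size() * sizeof(float));
    cudaMemcpy(d_groups, groups.data(), groups.size() * sizeof(float),
               cudaMemcpyHostToDevice);
    return d_groups;
}
```

Because the K operands for a given position are adjacent after packing, a single contiguous read supplies everything needed for that position, which is the source of the reduced number of data reads claimed above.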



Abstract

The embodiment of the invention discloses an acceleration method and device for data operation, a terminal, and a readable storage medium. The method comprises the steps of reading the data group to be operated in each storage unit, buffering the read data groups to a GPU, and calling the GPU to carry out parallel operation on the buffered data groups, wherein each storage unit stores a data group formed by the data elements at the same position in each piece of matrix data to be calculated. Compared with the prior art, the operation process of the embodiment reduces the number of data reads and, at the same time, effectively improves the operation speed on the data to be calculated. In addition, since the operation process is transferred from the CPU to the GPU, the occupancy rate of the CPU is also reduced and the normal operation of the terminal is ensured.
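The abstract leaves the per-group operation open, so the following CUDA kernel is only a minimal sketch of "calling the GPU to carry out parallel operation on the buffered data groups": one thread handles one data group, and a per-position sum stands in for whatever layer operation is actually performed. The kernel name, block size, and the sum itself are illustrative assumptions, not part of the patent.

```cuda
// Illustrative kernel: one thread per data group (per matrix position).
__global__ void reduce_groups(const float* groups, float* out, int N, int K) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // index of the data group
    if (i >= N) return;
    float acc = 0.0f;
    for (int k = 0; k < K; ++k)
        acc += groups[i * K + k];   // all K operands for position i are adjacent
    out[i] = acc;
}

// Launch so that all N data groups are processed in parallel:
// reduce_groups<<<(N + 255) / 256, 256>>>(d_groups, d_out, N, K);
```

With one thread per position, the data groups are processed concurrently on the GPU while the CPU remains free for other terminal tasks, which is the mechanism behind the improved speed and reduced CPU occupancy described above.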

Description

Technical field

[0001] The present invention relates to the technical field of artificial intelligence, and in particular to an acceleration method and device for data operation, a terminal, and a readable storage medium.

Background technique

[0002] With the rapid development of AI (Artificial Intelligence) technology, many AI projects have begun to land on terminals. At present, most AI projects are based on feedforward neural networks. A feedforward neural network has an input layer and an output layer; starting from the input layer, data passes through a series of calculation layers and finally produces the output. In the calculation process, each neuron is connected only to the neurons of the previous layer: it receives the output of the previous layer and outputs to the next layer, with no feedback between layers. The performance bottleneck of the feedforward neural network is therefore concentrated in the operation of each layer.

[0003] At present, the operation of each layer of the feedforward neural network is generally implemented on the CPU, which, owing to the structural characteristics of the CPU, results in slow operation and high CPU occupancy.
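To make the per-layer cost described in [0002] concrete, here is a minimal CPU reference of one fully connected feedforward layer: every output neuron reads only the previous layer's output, and there is no feedback between layers. The dense_layer name, the row-major weight layout, and the ReLU activation are illustrative assumptions and are not taken from the patent.

```cpp
// Minimal CPU reference of one feedforward (fully connected) layer.
#include <algorithm>
#include <vector>

std::vector<float> dense_layer(const std::vector<float>& x,  // previous layer output, size in_dim
                               const std::vector<float>& W,  // weights, out_dim x in_dim, row-major
                               const std::vector<float>& b,  // biases, size out_dim
                               int in_dim, int out_dim) {
    std::vector<float> y(out_dim);
    for (int o = 0; o < out_dim; ++o) {
        float acc = b[o];
        for (int i = 0; i < in_dim; ++i)
            acc += W[o * in_dim + i] * x[i];   // each neuron reads only the previous layer
        y[o] = std::max(0.0f, acc);            // ReLU; output feeds the next layer only
    }
    return y;
}
```

The nested loop over out_dim x in_dim elements is repeated for every layer, which is why the per-layer matrix operations dominate the runtime and are the target of the GPU acceleration described in this patent.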


Application Information

IPC(8): G06N3/063; G06T1/20
CPC: G06N3/063; G06T1/20
Inventor: 尚海豹, 李昊沅, 左小祥, 周蔚, 李峰, 程君
Owner: TENCENT TECH (SHENZHEN) CO LTD