
A data storage method for speech deep neural network operation

A technology relating to deep neural networks and data storage, applied to biological neural network models, electrical digital data processing, and the input/output processes of data processing. It addresses problems such as continuity and selectivity that cannot be guaranteed and timeliness that cannot meet requirements, achieving the effects of guaranteed effectiveness and timeliness, reduced data transmission, and a shorter effective duration of peripheral data storage.

Active Publication Date: 2021-06-22
成都启英泰伦科技有限公司

AI Technical Summary

Problems solved by technology

[0005] In the prior art, one of the most commonly used implementations stores data on a peripheral storage device and reads it back through a general-purpose CPU. This method is inefficient and cannot satisfy the continuity, timeliness and selectivity requirements: additional computation is needed to determine the starting position of each frame, so continuity and selectivity cannot be guaranteed, and the serial read path forms a bandwidth bottleneck that lengthens the computation time.
[0006] Another existing technique uses a graphics processing unit (GPU) or a DSP. It performs the computation through register files and general-purpose SIMD units, but the limited internal storage forces the GPU to access peripheral storage devices frequently, so timeliness still cannot meet the requirements. Because it remains a general-purpose processing unit, continuity and selectivity must still be handled in software during the computation and therefore cannot be guaranteed either.

Method used




Detailed Description of the Embodiments

[0043] Specific embodiments of the present invention will be further described in detail below.

[0044] The storage management method for a speech deep neural network computing chip of the present invention comprises the following steps:

[0045] Step 1. The user determines the configuration parameters, specifically:

[0046] Determine the total number of frames, the number of skipped frames, the number of output channels and the number of single-channel output frames of the feature data to be computed; these are defined by the user according to the requirements of the current computation;

[0047] Determine the number of data items per unit frame required for the deep neural network operation; this is likewise defined by the user according to the requirements of the current computation, but it must satisfy the following constraint: the unit-memory data depth of the feature storage array in the feature storage device ≥ the number of data items per unit frame required for the deep neural network operation;

[...
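The following is a minimal sketch, in C, of the step-1 configuration parameters described in paragraphs [0045] to [0047] and of the depth constraint they must satisfy. All type and field names here are assumptions introduced for illustration; the patent does not define a concrete data structure.

/* Hypothetical container for the user-defined configuration parameters of
 * step 1; the field names are assumptions, not taken from the patent. */
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    uint32_t total_frames;        /* total number of feature frames to compute */
    uint32_t skipped_frames;      /* number of skipped frames                  */
    uint32_t output_channels;     /* number of output channels                 */
    uint32_t frames_per_channel;  /* number of single-channel output frames    */
    uint32_t data_per_frame;      /* number of data items per unit frame       */
} dnn_storage_config;

/* The configuration is only valid when the unit-memory data depth of the
 * feature storage array is at least the number of data items per unit frame. */
static bool dnn_storage_config_valid(const dnn_storage_config *cfg,
                                     uint32_t unit_memory_depth)
{
    return unit_memory_depth >= cfg->data_per_frame;
}

A caller would fill in such a structure before step 2 and reject any configuration for which the check returns false.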



Abstract

A data storage method for speech deep neural network operation comprises the following steps: Step 1, the user determines the configuration parameters; Step 2, configure the peripheral storage access interface; Step 3, configure the multi-transmission interface of the feature storage array; Step 4, the CPU stores the data to be computed in the storage space between the start address and the end address of the feature storage space of the peripheral storage device; Step 5, the CPU enables the peripheral storage access interface, which transfers the data to the feature memory array; Step 6, the CPU enables the multi-transmission interface of the feature storage array, which begins to transfer data in parallel according to the configured requirements; Step 7, the CPU judges whether all the data has been stored. The present invention ensures the continuity of the feature data while taking into account the format requirements of both the peripheral storage device and the deep neural network data. Compared with traditional methods, the data transmission time is reduced, and the effective duration of peripheral data storage is also reduced.
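The seven steps of the abstract can be read as a simple driver sequence. The sketch below, in C, only illustrates that ordering; every function name is a placeholder invented here, since the patent does not specify an API, and the stub bodies stand in for the chip's memory-mapped register accesses.

/* Placeholder routines standing in for the chip's register accesses. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

static void configure_peripheral_access_interface(void)  { /* step 2 */ }
static void configure_feature_array_multi_transfer(void) { /* step 3 */ }
static void write_features_to_peripheral(const int16_t *data, size_t len)
{ (void)data; (void)len; /* step 4 */ }
static void enable_peripheral_access_interface(void)     { /* step 5 */ }
static void enable_feature_array_multi_transfer(void)    { /* step 6 */ }
static bool all_feature_data_stored(void)                { return true; /* step 7 */ }

/* Step 1, the user-defined configuration parameters, is assumed to have been
 * applied before this routine is called. */
void store_features_for_dnn(const int16_t *features, size_t len)
{
    configure_peripheral_access_interface();      /* step 2 */
    configure_feature_array_multi_transfer();     /* step 3 */
    write_features_to_peripheral(features, len);  /* step 4: CPU writes the data
                                                     between the start and end
                                                     addresses of the feature
                                                     storage space             */
    enable_peripheral_access_interface();         /* step 5: move the data into
                                                     the feature memory array  */
    enable_feature_array_multi_transfer();        /* step 6: parallel transfer
                                                     per the configuration     */
    while (!all_feature_data_stored()) {          /* step 7: wait until all the
                                                     data has been stored      */
    }
}

On real hardware the polling loop in step 7 would typically be replaced by an interrupt or a timeout; the ordering of the seven steps is the point of the sketch.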

Description

Technical field

[0001] The invention belongs to the technical field of speech neural networks and relates to a data storage management technology for speech neural networks, in particular to a data storage method for speech deep neural network operation.

Background technique

[0002] The storage requirement of the features used in speech-related deep neural network computation lies in the continuity of the data: speech features are packaged in frames, each frame contains the same amount of continuous data, and the data must therefore be stored frame by frame.

[0003] The feature storage requirement of speech-related deep neural network operations also lies in the timeliness of the data: the feature data held in the speech feature storage unit must be delivered to the operation unit within a limited time, usually for the matrix operations of the deep neural network.

[0004] The feature storage requirement of speech-related deep neu...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/06; G06N3/063
CPC: G06F3/0626; G06F3/0611; G06F3/0629; G06F3/0685; G06N3/063; G06F3/061; G06F3/0655; G06F3/0679
Inventor: 邱兆强, 张来, 王福君, 田伟, 杨应斌, 裴阳洋
Owner: 成都启英泰伦科技有限公司