
AI chip-based data processing method and device

A data processing and chip technology, applied in the field of AI chip-based data processing methods and devices, which can solve problems such as wasted computing resources and low AI chip data processing efficiency, and achieve the effects of improving efficiency, realizing parallel computing, and reducing mutual waiting time.

Active Publication Date: 2020-08-04
吉林省华庆云科技集团有限公司

AI Technical Summary

Problems solved by technology

[0005] The present invention provides an AI chip-based data processing method and device, which are used to solve the prior-art problems that the CPU and the NPU wait for each other, wasting computing resources and lowering the data processing efficiency of the AI chip.



Examples


Embodiment 1

[0034] Figure 1 is a schematic diagram of data processing in an existing AI chip provided by Embodiment 1 of the present invention, and Figure 2 is a schematic diagram of the pipeline structure of AI chip data processing provided by Embodiment 1 of the present invention. This embodiment provides an AI chip-based data processing method to solve the prior-art problems that the CPU and the NPU wait for each other, wasting computing resources and lowering the data processing efficiency of the AI chip. The method in this embodiment is applied to an AI chip. The AI chip includes at least a first processor, a second processor, and a third processor. The third processor is used to perform neural network model calculations and includes multiple cores.

[0035] The third processor is a processor for calculating the neural network model, and it includes multiple cores. For example, the third proces...
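Because the third processor has multiple cores, model calculations for several frames can proceed at once. Below is a minimal Python sketch of that idea; it is not from the patent: NUM_CORES, run_model_on_core, and process_batch are hypothetical names, and threads merely stand in for dispatching work to separate hardware cores.

# Minimal sketch (not from the patent): spreading neural network model
# calculations over the multiple cores of the third processor (e.g. an NPU).
# NUM_CORES, run_model_on_core and process_batch are hypothetical names;
# threads merely stand in for dispatching work to separate hardware cores.
from concurrent.futures import ThreadPoolExecutor

NUM_CORES = 4  # assumed core count of the third processor


def run_model_on_core(core_id, first_data):
    """Hypothetical per-core inference call for one piece of first data."""
    return {"core": core_id, "result": f"inference({first_data})"}


def process_batch(first_data_items):
    """Run model calculations for several frames in parallel, one core each."""
    with ThreadPoolExecutor(max_workers=NUM_CORES) as pool:
        futures = [
            pool.submit(run_model_on_core, i % NUM_CORES, item)
            for i, item in enumerate(first_data_items)
        ]
        return [f.result() for f in futures]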

Embodiment 2

[0047] Figure 3 is a flow chart of the AI chip-based data processing method provided by Embodiment 2 of the present invention. On the basis of Embodiment 1 above, this embodiment describes in detail the specific process in which the first processor, the second processor, and the third processor perform the three stages of processing simultaneously. As shown in Figure 3, the specific steps of the method are as follows:

[0048] Step S301: the first processor acquires a data frame, preprocesses the data frame to obtain first data corresponding to the data frame, and stores the first data in a first queue.

[0049] In this embodiment, after the first processor preprocesses the data frame, it obtains first data corresponding to the data frame; this first data is the processing result of performing the first-stage processing on the data frame. The first data corre...
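As a rough illustration of step S301, the first processor behaves like a producer that keeps pushing preprocessed first data into the first queue. The following is my own minimal Python sketch, not the patented implementation: acquire_frame and preprocess are hypothetical stand-ins for the sensor read and the image preprocessing step.

# Minimal sketch of step S301 (my own illustration, not the patented code):
# the first processor acquires a data frame, preprocesses it into "first data",
# and stores the first data in a first queue. acquire_frame and preprocess are
# hypothetical stand-ins for the sensor read and the image preprocessing step.
import queue

first_queue = queue.Queue(maxsize=8)  # the "first queue" between stage 1 and stage 2


def acquire_frame(frame_id):
    """Hypothetical frame acquisition from the camera sensor."""
    return {"frame_id": frame_id, "pixels": b"raw-bytes"}


def preprocess(frame):
    """Hypothetical preprocessing (resize, normalize, ...) producing first data."""
    return {"frame_id": frame["frame_id"], "tensor": frame["pixels"]}


def first_processor_loop(num_frames):
    """Stage 1: acquire and preprocess each frame, then enqueue its first data."""
    for frame_id in range(num_frames):
        frame = acquire_frame(frame_id)
        first_data = preprocess(frame)
        first_queue.put(first_data)  # blocks if the next stage falls behind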

Embodiment 3

[0080] Figure 4 is a schematic structural diagram of the AI chip provided by Embodiment 3 of the present invention. The AI chip provided in this embodiment can execute the processing flow provided in the embodiments of the AI chip-based data processing method. As shown in Figure 4, the AI chip 40 includes: a first processor 401, a second processor 402, a third processor 403, a memory 404, and computer programs stored in the memory.

[0081] The AI chip data processing pipeline is divided into the following three stages: data acquisition and preprocessing, neural network model processing, and neural network model post-processing; the three stages form a parallel pipeline structure.

[0082] When the first processor 401, the second processor 402, and the third processor 403 run the computer program, the AI chip-based data processing method provided in any one of the above method embodiments is imp...
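To make the parallel pipeline structure concrete, here is a minimal Python sketch of my own, not the chip's firmware: three threads stand in for the first, third, and second processors (handling acquisition/preprocessing, model processing, and post-processing respectively), connected by two bounded queues so that different frames occupy different stages at the same time.

# Minimal sketch of the parallel pipeline structure (my own illustration, not
# the chip's firmware): three threads stand in for the first, third and second
# processors, connected by two bounded queues, so different frames occupy
# different stages at the same time instead of the processors waiting idle.
import queue
import threading

STOP = object()                        # sentinel used to shut the pipeline down
first_queue = queue.Queue(maxsize=8)   # stage 1 -> stage 2 (model processing)
second_queue = queue.Queue(maxsize=8)  # stage 2 -> stage 3 (post-processing)


def stage1_acquire_and_preprocess(num_frames):
    for frame_id in range(num_frames):
        first_queue.put({"frame_id": frame_id})  # hypothetical first data
    first_queue.put(STOP)


def stage2_model_processing():
    while (item := first_queue.get()) is not STOP:
        item["logits"] = f"model(frame {item['frame_id']})"  # hypothetical inference
        second_queue.put(item)
    second_queue.put(STOP)


def stage3_postprocess_and_transmit():
    while (item := second_queue.get()) is not STOP:
        print("frame", item["frame_id"], "->", item["logits"])


workers = [
    threading.Thread(target=stage1_acquire_and_preprocess, args=(5,)),
    threading.Thread(target=stage2_model_processing),
    threading.Thread(target=stage3_postprocess_and_transmit),
]
for w in workers:
    w.start()
for w in workers:
    w.join()

With this wiring, while the model-processing worker is busy with frame N, the acquisition worker can already prepare frame N+1 and the post-processing worker can finish frame N-1, which is the reduction in mutual waiting that the embodiments describe.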



Abstract

The present invention provides an AI chip-based data processing method and device. The method divides the AI chip data processing pipeline into the following three stages: data acquisition and preprocessing, neural network model processing, and neural network model post-processing; the three stages form a parallel pipeline structure. The AI chip includes at least a first processor, a second processor, and a third processor: the first processor is used for data acquisition and preprocessing, the third processor is used for neural network model processing, and the second processor is used for neural network model post-processing. The first processor, the second processor, and the third processor perform the three stages of processing at the same time, which reduces the waiting time of the processors, realizes parallel computing among the processors to the greatest extent, and improves the efficiency of AI chip data processing, so the frame rate of the AI chip can be increased.

Description

Technical Field

[0001] The present invention relates to the technical field of AI chips, and in particular to an AI chip-based data processing method and device.

Background Technique

[0002] Xeye is an artificial intelligence camera; it includes a sensor for collecting images and an AI chip for recognizing and processing the images. An AI chip generally includes an embedded neural network processor (Neural-network Processing Unit, NPU for short) for calculating a neural network model and at least two CPUs, wherein the NPU includes multiple cores.

[0003] The existing AI chip recognizes and processes the collected images frame by frame. The AI chip's processing of a data frame includes the following four modules: image acquisition, image preprocessing, neural network model processing, and neural network model post-processing and data transmission. The CPU is used to run the first and second modules, the NPU is used to run the third module, and the CPU is al...
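The frame-by-frame flow described in this background is what the pipelined embodiments above replace. The following is a minimal Python sketch of my own (the helper callables are hypothetical) showing that sequential flow, in which each frame must pass through every module before the next frame starts, so the CPU idles during model processing and the NPU idles during the CPU-side stages.

# Minimal sketch (hypothetical helper callables, not the prior-art code) of the
# sequential frame-by-frame flow described above: every module finishes before
# the next frame starts, so the CPU sits idle while the NPU runs the model and
# the NPU sits idle during the CPU-side stages.
def process_frames_sequentially(frames, acquire, preprocess, run_model, postprocess):
    results = []
    for frame in frames:
        raw = acquire(frame)                  # CPU: image acquisition
        data = preprocess(raw)                # CPU: image preprocessing (NPU idle)
        logits = run_model(data)              # NPU: model processing (CPU idle)
        results.append(postprocess(logits))   # CPU: post-processing and transmission
    return results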


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/063, G06K9/00, G06K9/62
CPC: G06N3/063, G06V20/10, G06F18/2413
Inventor: 王奎澎, 寇浩锋, 包英泽, 付鹏, 范彦文, 周强, 周仁义, 胡跃祥
Owner: 吉林省华庆云科技集团有限公司