
Data processing method and device based on AI chip

A data processing and chip technology, applied in the field of AI chip-based data processing methods and devices, which solves the prior-art problems of wasted computing resources and low data processing efficiency in AI chips, and achieves the effects of improved efficiency, parallel computation, and reduced mutual waiting time between processors.

Active Publication Date: 2018-12-11
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

[0005] The present invention provides a data processing method and device based on an AI chip, which are used to solve the prior-art problems that the CPU and the NPU wait for each other, wasting computing resources and lowering the data processing efficiency of the AI chip.



Examples


Embodiment 1

[0034] Figure 1 is a schematic diagram of data processing in an existing AI chip provided by Embodiment 1 of the present invention, and Figure 2 is a schematic diagram of the pipeline structure of AI chip data processing provided by Embodiment 1 of the present invention. The embodiment of the present invention provides a data processing method based on an AI chip to solve the prior-art problems that the CPU and the NPU wait for each other, which wastes computing resources and lowers the data processing efficiency of the AI chip. The method in this embodiment is applied to an AI chip. The AI chip includes at least a first processor, a second processor, and a third processor. The third processor is used to perform neural network model calculations and includes multiple cores.

[0035] Wherein, the third processor is a processor for calculating the neural network model, and the third processor includes multiple cores. For example, the third processor ma...
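To make the pipeline of Embodiment 1 concrete, the following is a minimal sketch of a three-stage parallel pipeline of the kind described above; the function names, queue layout, and use of Python threads are illustrative assumptions made here, not code from the patent.

```python
# Minimal sketch (assumed names/structure) of the three-stage parallel pipeline:
# stage 1 on the first processor, stage 2 on the multi-core third processor,
# stage 3 on the second processor. Stage bodies are placeholder stubs.
import queue
import threading

def preprocess(frame):          # stage 1 stub: data acquisition and preprocessing
    return frame * 2

def run_model(first_data):      # stage 2 stub: neural network model processing
    return first_data + 1

def postprocess(second_data):   # stage 3 stub: neural network model post-processing
    return second_data

first_queue = queue.Queue()     # first data: stage 1 -> stage 2
second_queue = queue.Queue()    # second data: stage 2 -> stage 3

def first_processor(frames):
    for frame in frames:
        first_queue.put(preprocess(frame))
    first_queue.put(None)       # end-of-stream marker

def third_processor():
    while (item := first_queue.get()) is not None:
        second_queue.put(run_model(item))
    second_queue.put(None)

def second_processor(results):
    while (item := second_queue.get()) is not None:
        results.append(postprocess(item))

results = []
workers = [
    threading.Thread(target=first_processor, args=(range(8),)),
    threading.Thread(target=third_processor),
    threading.Thread(target=second_processor, args=(results,)),
]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(results)  # each stage works on a different frame at the same time
```

Because each stage only blocks on its input queue, a processor starts on the next frame as soon as it has handed off the previous one, which is the waiting-time reduction the embodiment describes.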

Embodiment 2

[0047] Figure 3 is a flow chart of the AI chip-based data processing method provided by Embodiment 2 of the present invention. On the basis of Embodiment 1 above, this embodiment describes in detail the specific process by which the first processor, the second processor, and the third processor perform the three stages of processing simultaneously. As shown in Figure 3, the specific steps of the method are as follows:

[0048] Step S301, the first processor acquires a data frame and preprocesses the data frame to obtain first data corresponding to the data frame, and stores the first data corresponding to the data frame in a first queue.

[0049] In this embodiment, after the first processor performs preprocessing on the data frame, first data corresponding to the data frame is obtained, and the first data corresponding to the data frame is a processing result obtained by performing the first-stage processing on the data frame. The first data correspo...
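As a toy illustration of step S301 only (the queue layout and field names are assumptions made here for clarity, not taken from the patent), the first processor's hand-off into the first queue might look like this:

```python
import queue

first_queue = queue.Queue()   # holds first data produced by the first stage

def preprocess(frame):
    # Placeholder preprocessing; the real first-stage processing is
    # whatever the method defines (e.g. decode, resize, normalize).
    return {"frame_id": frame["id"], "tensor": [p / 255 for p in frame["pixels"]]}

def first_processor_step(frame):
    # Step S301: preprocess the data frame to obtain its first data, then
    # enqueue it so the third processor can consume it independently.
    first_data = preprocess(frame)
    first_queue.put(first_data)

first_processor_step({"id": 0, "pixels": [0, 128, 255]})
print(first_queue.get())      # first data corresponding to frame 0
```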

Embodiment 3

[0080] Figure 4 is a schematic structural diagram of the AI chip provided by Embodiment 3 of the present invention. The AI chip provided in the embodiment of the present invention can execute the processing flow provided in the embodiments of the AI chip-based data processing method. As shown in Figure 4, the AI chip 40 includes: a first processor 401, a second processor 402, a third processor 403, a memory 404, and a computer program stored in the memory.

[0081] The AI chip data processing pipeline is divided into the following three stages of processing: data acquisition and preprocessing, neural network model processing, and neural network model post-processing; the three stages are processed in a parallel pipeline structure.

[0082] When the first processor 401, the second processor 402, and the third processor 403 run the computer program, the AI chip-based data processing method provided in any one of the above method embodiments is implem...
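A small structural sketch mirroring the composition described for Figure 4; the class and attribute names are illustrative assumptions rather than the patent's own definitions.

```python
from dataclasses import dataclass, field

@dataclass
class AIChip:
    # Mirrors AI chip 40: first processor 401, second processor 402,
    # third processor 403, and memory 404 holding the computer program.
    first_processor: str = "processor 401: data acquisition and preprocessing"
    second_processor: str = "processor 402: neural network model post-processing"
    third_processor: str = "processor 403: multi-core neural network model processing"
    memory: dict = field(default_factory=dict)

chip = AIChip(memory={"program": "pipelined data processing method"})
print(chip.third_processor)
```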



Abstract

The invention provides a data processing method and device based on an AI chip. The method of the invention divides the data processing pipeline of the AI chip into the following three stages of processing: data acquisition and preprocessing, neural network model processing, and neural network model post-processing. The processing of the three stages follows a parallel pipeline structure. The AI chip comprises at least a first processor, a second processor, and a third processor. The first processor is used for data acquisition and preprocessing, the third processor is used for neural network model processing, and the second processor is used for neural network model post-processing. The first processor, the second processor, and the third processor simultaneously carry out the processing of the three stages, thereby reducing the mutual waiting time of the processors, maximizing the parallel computation of each processor, improving the data processing efficiency of the AI chip, and thus improving the frame rate of the AI chip.

Description

Technical field

[0001] The present invention relates to the technical field of AI chips, and in particular to a data processing method and device based on an AI chip.

Background technique

[0002] Xeye is an artificial intelligence camera; it includes a sensor for collecting images and an AI chip for recognizing and processing the images. An AI chip generally includes an embedded neural network processor (Neural-network Processing Unit, NPU for short) for calculating a neural network model, and at least two CPUs, wherein the NPU includes multiple cores.

[0003] The existing AI chip recognizes and processes the collected images frame by frame. The processing of a data frame by the AI chip includes the following four modules: image acquisition, image preprocessing, neural network model processing, and neural network model post-processing and data transmission. The CPU is used to run the first and second modules, the NPU is used to run the third module, and the CPU is al...
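For contrast with the pipelined structure claimed by the invention, here is a minimal sketch of the sequential frame-by-frame flow described in the background; the function names are placeholders assumed for illustration, not from the patent.

```python
def acquire_and_preprocess(frame):          # CPU: modules 1-2 (stub)
    return frame * 2

def run_neural_network(first_data):         # NPU: module 3 (stub)
    return first_data + 1

def postprocess_and_transmit(second_data):  # CPU: module 4 (stub)
    return second_data

def process_frames_sequentially(frames):
    # Prior-art flow: each frame passes through every module before the next
    # frame starts, so the CPU idles while the NPU runs and vice versa.
    results = []
    for frame in frames:
        first_data = acquire_and_preprocess(frame)
        second_data = run_neural_network(first_data)
        results.append(postprocess_and_transmit(second_data))
    return results

print(process_frames_sequentially(range(4)))  # [1, 3, 5, 7]
```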

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N 3/063; G06K 9/00; G06K 9/62
CPC: G06N 3/063; G06V 20/10; G06F 18/2413
Inventors: 王奎澎, 寇浩锋, 包英泽, 付鹏, 范彦文, 周强, 周仁义, 胡跃祥
Owner: BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD