
A data interaction method and computing device between a main CPU and an NPU

A computing device technology, applied in the fields of computing, reasoning methods, and electrical digital data processing, which can solve problems such as high time overhead.

Active Publication Date: 2021-10-01
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

In some practical application scenarios (for example, autonomous driving, which has stringent real-time requirements), the time overhead on the command path and response path of such AI model inference is very large.

Method used



Examples


Embodiment Construction

[0086] Embodiments of the present application are described below in conjunction with the accompanying drawings. The described embodiments are only some of the embodiments of the present application, not all of them. Those of ordinary skill in the art will appreciate that, with the development of technology and the emergence of new scenarios, the technical solutions provided in the embodiments of the present application are also applicable to similar technical problems.

[0087] The terms "first", "second" and the like in the specification, claims, and drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments described herein can be practiced in orders other than those illustrated or described herein. Furthermore, the terms "comprising" ...



Abstract

The embodiments of the present application disclose a data interaction method between a main CPU and an NPU, and a computing device, which can be applied in the field of artificial intelligence. The method is applied to a computing device that includes a main CPU and an NPU. The main CPU runs a target APP and includes an AI Service and an NPU driver; the NPU includes an NPU controller, an arithmetic logic unit, and N registers. The method includes: after the main CPU loads the AI model onto the NPU, the NPU allocates a register for the AI model; after the main CPU receives the physical address of that register from the NPU, it maps the physical address to a virtual address in the virtual memory of the target APP. The main CPU can then read/write the corresponding register on the NPU directly through the target APP, which is equivalent to a direct connection between the target APP and the NPU. When the main CPU sends the execution command of the AI model to the NPU through the target APP and obtains the execution result, the computation path bypasses the AI Service and the NPU driver, leaving only the overhead of register read/write, which improves the real-time performance of AI model inference.
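To make the data path in the abstract concrete, the following is a minimal C sketch of how a user-space application could map an NPU-allocated register into its own address space and trigger inference by direct register read/write, bypassing the AI Service and NPU driver on the command/response path. The device node /dev/npu_app, the ioctl command NPU_IOCTL_GET_REG_PHYS, and the register layout are hypothetical placeholders introduced for illustration only; they are not defined by the patent, which describes the mechanism, not a specific API.

```c
/* Minimal sketch (hypothetical device node, ioctl command, and register layout):
 * after the AI model is loaded, the NPU-assigned register is mapped into the
 * target APP's virtual memory so execution commands and results travel over
 * plain register reads/writes instead of the AI Service / NPU driver path. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

#define NPU_IOCTL_GET_REG_PHYS _IOR('N', 1, uint64_t) /* hypothetical ioctl    */
#define REG_CMD_OFFSET    0x0  /* write: start execution of the loaded model   */
#define REG_STATUS_OFFSET 0x4  /* read: non-zero when the result is ready      */

int main(void)
{
    int fd = open("/dev/npu_app", O_RDWR);          /* hypothetical device node */
    if (fd < 0) { perror("open"); return 1; }

    /* Physical address of the register the NPU allocated at model-load time. */
    uint64_t reg_phys = 0;
    if (ioctl(fd, NPU_IOCTL_GET_REG_PHYS, &reg_phys) < 0) {
        perror("ioctl"); close(fd); return 1;
    }

    /* Map the register page into the APP's virtual memory. */
    void *map = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED,
                     fd, (off_t)reg_phys);
    if (map == MAP_FAILED) { perror("mmap"); close(fd); return 1; }
    volatile uint32_t *regs = map;

    regs[REG_CMD_OFFSET / 4] = 1;            /* issue the execution command     */
    while (regs[REG_STATUS_OFFSET / 4] == 0) /* poll for the execution result   */
        ;                                    /* no driver call on this path     */

    printf("inference done, status = %u\n", regs[REG_STATUS_OFFSET / 4]);
    munmap(map, 4096);
    close(fd);
    return 0;
}
```

In this sketch, the only per-inference cost on the command and response paths is the register write and the polling read, which is the overhead reduction the abstract attributes to the direct APP-to-NPU register mapping.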

Description

Technical Field

[0001] The present application relates to the technical field of artificial intelligence (AI), and in particular to a data interaction method between a main CPU and an NPU, and a computing device.

Background

[0002] With the wide application of AI, deep learning has become the mainstream method of current AI research and application. Faced with parallel computing over massive data, AI's demand for computing power keeps increasing, which places higher requirements on the computing speed and power consumption of hardware. In addition to the general-purpose central processing unit (CPU), chip processors such as the hardware-accelerated graphics processing unit (GPU), the neural-network processing unit (NPU), and the field programmable gate array (FPGA) each play to their respective strengths in different applications of deep learning. A...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F 9/50; G06N 5/04
CPC: G06F 9/5027; G06N 5/04; G06F 15/163; G06N 3/063
Inventor: 朱湘毅
Owner: HUAWEI TECH CO LTD