
AI server computing unit architecture and implementation method

A computing unit architecture and implementation method applied to combinations of digital computers, computing, and related fields. It addresses problems such as the limited variety of topology switching options, the large switching restrictions that result, and the complex circuit design of computing-unit interconnection topology switching modules, achieving optimized whole-machine power consumption and more uniform heat dissipation.

Active Publication Date: 2020-10-02
INSPUR SUZHOU INTELLIGENT TECH CO LTD
Cites: 2 · Cited by: 9

AI Technical Summary

Problems solved by technology

[0006] A single architecture design gives the server poor flexibility across applicable application scenarios; the complex line design of the computing-unit interconnection topology switching module degrades high-speed signal quality; and the limited variety of topology switching imposes large switching restrictions, so that the number of individual computing units participating in a calculation cannot be adjusted flexibly when the interconnection topology is switched as a whole. To address these problems, the present invention provides an AI server computing unit architecture and implementation method.




Embodiment Construction

[0060] To enable those skilled in the art to better understand the technical solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on these embodiments, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention. Key terms appearing in the present invention are explained below.

[0061] As shown in Figure 1, the embodiment of the present invention provides an AI server computing unit architecture, including a power consumption acquisition module, a control module, and PCIe Switch chips; the PCIe Switch chips include a first PCIe Switch chip...
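The control flow the embodiment describes can be sketched as follows. This is a minimal, illustrative-only model: the class and function names, the power threshold, and the decision rule are assumptions for exposition, not the patent's actual implementation. It shows only the idea that a control module consumes per-GPU power data and emits port on/off commands toward a PCIe Switch chip to adjust how many GPUs participate in computation.

```python
# Hypothetical sketch of the described control flow: the control module reads
# CPU/GPU power data from the power consumption acquisition module, decides
# which GPUs should participate, and plans port enable/disable commands for
# the PCIe Switch chip. All names and the 30 W threshold are illustrative.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class PowerSample:
    cpu_watts: float
    gpu_watts: Dict[int, float]  # per-GPU power draw, keyed by GPU index


@dataclass
class SwitchCommand:
    port: int       # PCIe Switch port assumed to map 1:1 to a GPU
    enabled: bool   # whether the port (and thus the GPU) stays active


def plan_port_commands(sample: PowerSample,
                       idle_threshold: float = 30.0) -> List[SwitchCommand]:
    """Disable PCIe Switch ports of GPUs drawing less than the threshold.

    Mirrors the abstract's idea: adjust the number of GPUs participating
    in computation according to observed power consumption.
    """
    commands = []
    for gpu_index, watts in sorted(sample.gpu_watts.items()):
        commands.append(SwitchCommand(port=gpu_index,
                                      enabled=watts >= idle_threshold))
    return commands
```

In a real system the commands would be encoded into the control instruction packet the abstract mentions; here they are left as plain objects since the packet format is not disclosed in this excerpt.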



Abstract

The invention provides an AI server computing unit architecture and an implementation method. The architecture comprises a power consumption acquisition module, a control module, a first PCIe Switch chip, and a second PCIe Switch chip; the control module is communicatively connected to the first and second PCIe Switch chips. The power consumption acquisition module acquires power consumption data of the CPU and the GPUs. The control module obtains this data from the acquisition module, analyzes and processes it, and sends a control instruction packet to the PCIe Switch chips according to the processing result. By controlling the on/off state and the uplink/downlink attributes of the PCIe Switch chip ports, the architecture adjusts the number of GPUs participating in computation and the interconnection relationship between the GPUs and the CPU through the PCIe Switch chips.

Description

Technical Field

[0001] The present invention relates to the technical field of server computing unit architecture design, and in particular to an AI server computing unit architecture and an implementation method.

Background Technique

[0002] To meet the collection and organization of diverse data in the fields of big data, cloud computing, and artificial intelligence, AI servers in various heterogeneous forms have been widely adopted. Large numbers of computing units satisfy the server's need for intensive data operations.

[0003] CPU+GPU is a combination of computing units commonly used in AI servers. NVIDIA has introduced three basic PCIe topologies for CPU+GPU interconnection: Balance Mode, Common Mode, and Cascade Mode. In most application scenarios the computing capabilities of Balance Mode and Common Mode are similar, so the topology model cannot be well chosen from the application scenario alone; the P2P p...
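The three fixed topologies named in the background can be contrasted with a small sketch. The selection heuristic below is purely an assumption for exposition (the patent's point is precisely that a statically chosen topology cannot adapt per scenario); only the three mode names come from the text.

```python
# Illustrative-only mapping from workload traits to NVIDIA's three basic
# CPU+GPU PCIe topologies named in the background. The decision rule is an
# assumed heuristic, not a disclosed part of the invention.
from enum import Enum


class Topology(Enum):
    BALANCE = "balance"   # GPUs split evenly across the CPU root ports
    COMMON = "common"     # GPUs grouped behind a switch under one CPU
    CASCADE = "cascade"   # switches chained to favor GPU-to-GPU P2P traffic


def choose_topology(p2p_heavy: bool, multi_socket: bool) -> Topology:
    """Pick a static topology from coarse workload traits (assumed rule)."""
    if p2p_heavy:
        return Topology.CASCADE  # peer-to-peer transfers dominate
    return Topology.BALANCE if multi_socket else Topology.COMMON
```

A static choice like this is made once at design time; the invention instead reconfigures the PCIe Switch ports at runtime based on measured power consumption.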

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F15/16; G06F13/42; G06F15/173
CPC: G06F15/161; G06F13/4282; G06F15/173; Y02D10/00
Inventors: 孙珑玲, 于泉泉, 王鹏, 王焕超, 刘闻禹, 闫玉婕
Owner INSPUR SUZHOU INTELLIGENT TECH CO LTD