Floating-point multiplier and floating-point multiplication for neural network processor

A floating-point multiplier and floating-point multiplication technology, applied in the field of neural network processors, which addresses problems such as the inability of fixed-point hardware to match floating-point training data, reduced acceleration efficiency, and obstacles to the application of neural network processors, and achieves high performance and improved working accuracy.

Active Publication Date: 2017-10-24
INST OF COMPUTING TECHNOLOGY - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

Multiplication and addition are important operations in neural network computation. To reduce design complexity and improve operational efficiency, most dedicated hardware accelerators use fixed-point multipliers for multiplication, whereas most of the weight data obtained from training are computed in a floating-point environment. This mismatch between the data storage and computation formats of the training environment and the hardware acceleration environment leads to a large difference between the hardware acceleration results and the training results.



Examples


Example Embodiment

[0035] In order to make the objectives, technical solutions and advantages of the present invention clearer, the following further describes the present invention in detail through specific embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described here are only used to explain the present invention, but not to limit the present invention.

[0036] Figure 1 is a schematic structural diagram of a floating-point multiplier according to an embodiment of the present invention. The floating-point multiplier includes a sign-bit operation unit, an exponent operation unit, a mantissa operation unit and a normalization unit. As shown in Figure 1, the floating-point multiplier receives two operands A and B to be multiplied, and outputs their product (denoted C). The operands A and B and their product are all floating-point numbers, and each floating-point number in the machine is stored and expressed in the...
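To make the four-unit split concrete, the following software sketch mimics a conventional (exact) floating-point multiply decomposed into the sign-bit, exponent, mantissa, and normalization units named above. The excerpt does not specify the number format, so an IEEE-754-like single-precision layout (1 sign bit, 8-bit exponent with bias 127, 23-bit mantissa with a hidden leading one, normal numbers only, truncation instead of rounding) is an illustrative assumption:

```python
# Illustrative sketch, not the patent's implementation: exact FP multiply
# split into the four units of Figure 1, for an assumed binary32-like format.

BIAS = 127        # assumed exponent bias
MANT_BITS = 23    # assumed mantissa width

def fp_fields(sign, exp, mant):
    """Value represented by a (sign, biased exponent, mantissa) triple."""
    return (-1) ** sign * (1 + mant / 2 ** MANT_BITS) * 2.0 ** (exp - BIAS)

def fp_multiply(a, b):
    """Multiply two normal floating-point numbers given as field triples."""
    sa, ea, ma = a
    sb, eb, mb = b

    # Sign-bit operation unit: XOR of the two sign bits.
    sc = sa ^ sb

    # Exponent operation unit: add biased exponents, subtract one bias.
    ec = ea + eb - BIAS

    # Mantissa operation unit: multiply the hidden-one-extended mantissas.
    prod = (ma | 1 << MANT_BITS) * (mb | 1 << MANT_BITS)

    # Normalization unit: the product of two values in [1, 2) lies in
    # [1, 4); when it is >= 2, shift right one bit and bump the exponent.
    if prod >> (2 * MANT_BITS + 1):
        ec += 1
        prod >>= 1
    mc = (prod >> MANT_BITS) & ((1 << MANT_BITS) - 1)  # truncate, no rounding
    return (sc, ec, mc)
```

For example, multiplying 3.0 = (0, 128, 1&lt;&lt;22) by 2.0 = (0, 128, 0) yields (0, 129, 1&lt;&lt;22), i.e. 6.0.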



Abstract

The present invention discloses a floating-point multiplier and a floating-point multiplication method for a neural network processor. The floating-point multiplier compares the mantissas of the two operands to be multiplied and selects among different operation modes to obtain the mantissa of the product. When the mantissas of the two operands match in their upper four bits, the mantissa of one of the operands is output directly. When the mantissas match in their upper three bits, part of the bits of each mantissa are first truncated, a one is set at the high end of the truncated values, and the shortened values are then multiplied and the result output. If neither condition is satisfied, the full mantissas of the two operands are multiplied to obtain the mantissa of the product. By combining approximate and exact calculation in this way, and substituting lower-energy operations such as data replacement and partial-bit multiplication, the disclosed floating-point multiplier improves the efficiency of the multiplication operation without sacrificing significant working precision, so that the neural network processing system can perform more efficiently.
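Read literally, the abstract describes a three-way mode selection on the operand mantissas. The sketch below transcribes that selection in software. The mantissa width, the number of bits kept by the partial-bit multiply, and the rescaling of the short product are not given in this excerpt, so the parameters below (23-bit mantissa, 8 kept bits) and the final shifts are illustrative assumptions, and normalization carries into the exponent are ignored:

```python
# Illustrative transcription of the abstract's three operation modes;
# widths, shifts, and the mode-2 rescaling are assumptions, not the
# patent's actual parameters.

MANT_BITS = 23    # assumed mantissa width
TRUNC_BITS = 8    # assumed number of high mantissa bits kept in mode 2

def approx_mantissa_product(ma, mb):
    """Approximate product mantissa of two MANT_BITS-bit mantissas."""
    if ma >> (MANT_BITS - 4) == mb >> (MANT_BITS - 4):
        # Mode 1: upper four bits match -- data replacement: pass one
        # operand's mantissa through, with no multiplier activity.
        return ma
    if ma >> (MANT_BITS - 3) == mb >> (MANT_BITS - 3):
        # Mode 2: upper three bits match -- truncate each mantissa to its
        # high TRUNC_BITS bits, set a one above the kept bits, and
        # multiply the short values (partial-bit multiplication).
        ta = (1 << TRUNC_BITS) | (ma >> (MANT_BITS - TRUNC_BITS))
        tb = (1 << TRUNC_BITS) | (mb >> (MANT_BITS - TRUNC_BITS))
        short_prod = ta * tb
        # Rescale the short product to a MANT_BITS-bit fraction; any
        # carry into the exponent is ignored in this sketch.
        return (short_prod << (MANT_BITS - 2 * TRUNC_BITS)) & ((1 << MANT_BITS) - 1)
    # Mode 3: no match -- exact full-width multiply with the hidden
    # leading ones restored; the product is truncated, not rounded.
    full_prod = ((1 << MANT_BITS) | ma) * ((1 << MANT_BITS) | mb)
    return (full_prod >> MANT_BITS) & ((1 << MANT_BITS) - 1)
```

Note that a four-bit match implies a three-bit match, so the checks must be ordered as the abstract lists them, with the cheapest mode (data replacement) tried first.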

Description

Technical field

[0001] The present invention relates to neural network processors, and more particularly to multiplication operations within neural network processors.

Background technique

[0002] At present, neural network processors usually use trained weight data as input signals to perform calculation operations on neural network models. Multiplication and addition are important operations in neural network computation. To reduce design complexity and improve operational efficiency, most dedicated hardware accelerators use fixed-point multipliers for multiplication, whereas most of the weight data obtained from training are computed in a floating-point environment. This mismatch between the data storage and computation formats of the training environment and the hardware acceleration environment leads to a large difference between the hardware acceleration processing results and the training results. However, if the traditional floating-point mult...


Application Information

IPC(8): G06F7/57; G06N3/02
CPC: G06F7/57; G06N3/02
Inventor: 韩银和, 许浩博, 王颖
Owner: INST OF COMPUTING TECHNOLOGY - CHINESE ACAD OF SCI