
Neural network optimization method based on floating number operation inline function library

A technology involving inline functions and neural networks, applied in the field of pattern recognition. It addresses problems such as the large amount of floating-point and nonlinear computation in the BP neural network, which hinders execution efficiency on embedded processors, and achieves the effect of improved recognition efficiency.

Publication Date: 2013-03-20 (Inactive)
天津市天祥世联网络科技有限公司
Cites: 0 | Cited by: 64

AI Technical Summary

Problems solved by technology

The large amount of floating-point computation and nonlinear computation in the BP neural network greatly hinders its execution efficiency on embedded processors.




Embodiment Construction

[0035] The present invention will be described in detail below with reference to the drawings and examples.

[0036] In the neural network optimization method of the present invention, based on the floating-point operation inline function library, the model of the neural unit is Y = 1/(1 + exp(-Σ w_i × x_i)), where i ranges from 1 to n and n is the number of neural units. The floating-point operation inline function library is the function library built into the dual-core chip, namely the IQmath Library, which includes: format conversion functions, for mutual conversion between calibrated floating-point numbers and integers; arithmetic functions, for multiplication and division of calibrated floating-point numbers; trigonometric functions, for sine, cosine and tangent operations on calibrated floating-point numbers; and mathematical functions, for root, exponential, logarithmic and power operations on calibrated floating-point...
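To make the shape of the computation concrete, the following is a minimal C sketch of the neuron model above evaluated with TI-style IQmath calls. It is an illustration under stated assumptions, not the patented implementation: the header name IQmathLib.h, the GLOBAL_Q setting, N_INPUTS and the function name neuron_output are hypothetical, and only the idea that the conversion _IQ(x[i]) is the sole step left inside the loop follows the method described here.

/* Hedged sketch: fixed-point evaluation of Y = 1/(1 + exp(-sum(w_i * x_i)))
 * with TI-style IQmath calls. Assumes the toolchain provides IQmathLib.h
 * and that GLOBAL_Q has already been chosen by the project (the decimal
 * calibration that fixes precision and range). */
#include "IQmathLib.h"

#define N_INPUTS 16                    /* hypothetical input count */

float neuron_output(const float x[N_INPUTS], const _iq w[N_INPUTS])
{
    _iq acc = _IQ(0.0);

    /* Only the conversion _IQ(x[i]) and the multiply-accumulate remain
     * inside the loop; every other step is hoisted out of the loop body. */
    for (int i = 0; i < N_INPUTS; i++) {
        acc += _IQmpy(w[i], _IQ(x[i]));
    }

    /* The nonlinear part (exp and division) is evaluated once, outside
     * the accumulation loop. */
    _iq y = _IQdiv(_IQ(1.0), _IQ(1.0) + _IQexp(-acc));

    return _IQtoF(y);                  /* convert back to float for the caller */
}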



Abstract

Provided is a neural network optimization method based on a floating-point operation inline function library, wherein the model of the neural unit is Y = 1/(1 + exp(-Σ w_i × x_i)), where i ranges from 1 to n and n is the number of neural units. The floating-point operation inline function library is built into a dual-core chip, namely the IQmath Library. In the method, except for the step _IQ(x[i]), which is carried out inside the loop, all remaining steps are carried out outside the loop body, so the overall execution efficiency is greatly improved compared with floating-point arithmetic. The method optimizes the porting of the back-propagation (BP) neural network onto the TMS3206464T, and the accuracy of the result is determined by the initial decimal calibration. On the premise that the accuracy of the result does not affect the recognition rate, the recognition efficiency of the BP network is greatly improved.
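As a rough, non-authoritative illustration of why the initial decimal calibration determines the accuracy of the result: a 32-bit IQ number with q fractional bits has a resolution of 2^-q and a range of roughly ±2^(31-q), so the choice of q trades precision against dynamic range. The snippet below simply prints those two figures for a hypothetical Q24 calibration; the Q value is an assumption, not taken from the patent.

#include <math.h>
#include <stdio.h>

int main(void)
{
    int q = 24;                              /* hypothetical calibration */
    double resolution = ldexp(1.0, -q);      /* 2^-q, about 6.0e-8 */
    double range      = ldexp(1.0, 31 - q);  /* 2^(31-q) = 128.0 */
    printf("Q%d: resolution %.3g, range +/- %.1f\n", q, resolution, range);
    return 0;
}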

Description

Technical Field

[0001] The invention relates to the technical field of pattern recognition, and in particular to a neural network optimization method based on an inline function library for floating-point operations.

Background

[0002] Pattern recognition technology is used more and more widely in today's intelligent products, and the error back-propagation neural network (BP network), as a mainstream classifier for classification and recognition, is widely applied to character, fingerprint and face recognition. Traditional recognition algorithms run on PCs, which is not conducive to the portability, integration and front-end deployment of products.

[0003] A neural network is an array of mathematical models of the structure and function of biomimetic neurons (processing units). These processing units are arranged linearly in groups called layers. Each processing unit has a number of inputs, and each input has an associated weight. The processing unit weights and sums ...
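For contrast with the fixed-point sketch given earlier, the conventional floating-point processing unit described above (weight each input, sum, apply the sigmoid activation) can be written in a few lines of C. This is a generic baseline, not code from the patent; the per-input float multiplies and the expf() call are exactly the operations that are costly on embedded processors without a floating-point unit.

#include <math.h>

/* Baseline floating-point neuron: weighted sum of n inputs followed by
 * the sigmoid activation Y = 1/(1 + exp(-s)). Illustrative only. */
float fp_neuron(const float *x, const float *w, int n)
{
    float s = 0.0f;
    for (int i = 0; i < n; i++) {
        s += w[i] * x[i];                 /* weight and sum the inputs */
    }
    return 1.0f / (1.0f + expf(-s));      /* sigmoid activation */
}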


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F 9/44; G06N 3/02
Inventors: 谢晓霞, 靳璐
Owner: 天津市天祥世联网络科技有限公司