
Neural-network-model compression method, system and device and readable storage medium

The application relates to the field of neural networks, specifically to neural network model compression methods, systems, devices, and computer-readable storage media. It addresses problems such as reduced computing performance and reduced model accuracy, with the effects of reducing the size of the model while ensuring no loss of accuracy, and of resolving excessive resource consumption.

Inactive Publication Date: 2018-06-29
ZHENGZHOU YUNHAI INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0007] Although conventional model compression methods reduce the size of the model by storing the model parameters as a sparse matrix, the accuracy of the model inevitably decreases. In addition, there are compression methods that retrain the compressed model, which reduces the loss of model accuracy, but the computing performance of the model's inference prediction is significantly reduced.
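To illustrate the sparse-matrix storage mentioned above (a minimal sketch, not code from the patent), a pruned weight matrix can be stored in CSR form, keeping only the non-zero entries so that storage shrinks roughly in proportion to the pruning ratio:

```python
def to_csr(dense):
    """Convert a dense 2-D list (mostly zeros after pruning) to CSR form.

    Returns (values, col_indices, row_ptr); only non-zero entries are kept.
    """
    values, col_indices, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                values.append(v)
                col_indices.append(j)
        row_ptr.append(len(values))  # prefix count of non-zeros per row
    return values, col_indices, row_ptr


# A 3x4 weight matrix after aggressive pruning (most entries zeroed).
pruned = [
    [0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.0, -1.2],
    [0.3, 0.0, 0.0, 0.0],
]
vals, cols, ptr = to_csr(pruned)
print(vals)  # [0.5, -1.2, 0.3]
print(cols)  # [1, 3, 0]
print(ptr)   # [0, 1, 2, 3]
```

Note that the sparse format itself is lossless; the accuracy loss discussed above comes from the pruning that produces the zeros, not from the storage scheme.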




Embodiment Construction

[0047] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative efforts fall within the protection scope of the present invention.

[0048] The embodiment of the invention discloses a neural network model compression method, system, device and computer-readable storage medium, so as to ensure the accuracy of the neural network model while compressing the neural network model.

[0049] Referring to FIG. 1, a neural network model compression method provided in an embodiment of the present invention specifically comprises:

[0050] S101. Using a neural network clipping method, clip the to-be-clipped neural network model to obtain a to-be-quantized neural network model.
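A common clipping criterion, shown here as a sketch rather than the specific method claimed in this application, is magnitude-based pruning: the smallest-magnitude fraction of weights is set to zero:

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights.

    `weights` is a flat list of floats; `sparsity` is the fraction to remove.
    Returns a new list with the smallest-|w| entries set to 0.0.
    """
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest |w|.
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]


w = [0.01, -0.8, 0.05, 0.9, -0.02, 0.4]
print(magnitude_prune(w, sparsity=0.5))  # [0.0, -0.8, 0.0, 0.9, 0.0, 0.4]
```

In practice the pruned model is what the abstract calls the "to-be-quantized" model: the surviving weights are then handed to the quantization step.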



Abstract

The invention discloses a neural network model compression method, system, device, and computer-readable storage medium. The method includes: using a neural network clipping method to clip a to-be-clipped neural network model to obtain a to-be-quantized neural network model; using the INQ (Incremental Network Quantization) algorithm to quantize the to-be-quantized neural network model to obtain a to-be-stored neural network model; and storing the to-be-stored neural network model in a compression format. Because the neural network model is first clipped and then quantized with the INQ algorithm, the model size can be reduced while effectively guaranteeing no loss of model accuracy after compression; the problem of excessive resource consumption is thereby solved, and computation is accelerated.
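The INQ algorithm referenced in the abstract constrains each weight to zero or a power of two, so that multiplications can be replaced by bit shifts. The core constraint can be sketched as follows (a simplified illustration; the full INQ algorithm additionally quantizes the weights incrementally in groups and retrains the not-yet-quantized weights after each step, which this sketch omits):

```python
import math

def quantize_pow2(w, min_exp=-4, max_exp=0):
    """Quantize one weight to +/-2^k (or 0), the weight constraint used by INQ.

    The exponent range [min_exp, max_exp] plays the role of the bit budget;
    these particular defaults are illustrative, not taken from the patent.
    """
    if w == 0.0:
        return 0.0
    # Weights too small to represent within the budget are clipped to zero.
    if abs(w) < 2 ** min_exp / 1.5:
        return 0.0
    exp = round(math.log2(abs(w)))          # nearest power-of-two exponent
    exp = max(min_exp, min(max_exp, exp))   # clamp to the allowed range
    return math.copysign(2 ** exp, w)


print(quantize_pow2(0.3))    # 0.25  (2^-2)
print(quantize_pow2(0.9))    # 1.0   (2^0)
print(quantize_pow2(-0.03))  # 0.0   (below the representable range)
```

Storing only the sign and the small integer exponent per surviving weight is what lets the quantized, pruned model fit a compact storage format.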

Description

Technical Field

[0001] The present invention relates to the field of artificial intelligence, and more specifically, to a neural network model compression method, system, device, and computer-readable storage medium.

Background

[0002] In today's era, whether in daily life or on the Internet, one term cannot be avoided: AI (Artificial Intelligence). AI applications have penetrated many areas, such as face recognition, speech recognition, text processing, the game of Go, video-game playing, autonomous driving, picture beautification, lip reading, and even the simulation of stratum fractures. In many of these areas its accuracy and problem-solving ability have surpassed those of humans, so it has very broad application prospects. Among the algorithm technologies in the field of AI, deep learning has attracted widespread attention from academia and industry since it won the ImageNet competition wi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08, G06N3/04
CPC: G06N3/082, G06N3/045
Inventors: 谢启凯, 吴韶华
Owner ZHENGZHOU YUNHAI INFORMATION TECH CO LTD