
Method, device and storage medium for improving prediction performance of deep learning network

A technology in the field of deep learning network prediction performance, applied to neural learning methods, biological neural network models, instruments, and similar areas. It addresses the problems of long training time and long prediction time, with the effects of shortening the development cycle, saving training time, and improving network prediction performance.

Active Publication Date: 2020-12-18
SUZHOU KEDA TECH

AI Technical Summary

Problems solved by technology

[0005] The present application provides a method, device and storage medium for improving the prediction performance of a deep learning network, which can solve the problems of long training time and long prediction time that arise when the prediction performance of a deep learning network is improved through single-model fusion or multi-model fusion.

Detailed Description of the Embodiments

[0048] The specific embodiments of the present application are described in further detail below with reference to the drawings and examples. The following examples are used to illustrate the present application, but not to limit its scope.

[0049] First, some terms involved in this application are explained.

[0050] Neural Network (NN) model: a complex network system formed by the interconnection of a large number of simple processing units (called neurons). It reflects many basic characteristics of human brain function and is a highly complex nonlinear deep learning system. Deep learning neural network models offer large-scale parallelism, distributed storage and processing, self-organization, self-adaptation and self-learning capabilities, and can be used for network prediction, for example detecting whether a driver is on the phone while driving, face recognition, and other network prediction scenarios such as iden...
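
For orientation only, a minimal sketch of such a neural network model in PyTorch. The patent does not specify a framework or an architecture; the class name, layer sizes and number of output classes below are hypothetical placeholders.

# Minimal illustrative model only; framework, architecture and sizes are
# assumptions, not taken from the patent text.
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    """A small feed-forward network standing in for the 'first neural
    network model' referred to in this application."""
    def __init__(self, in_features=128, hidden=64, num_classes=2):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        # x: (batch, in_features) -> logits: (batch, num_classes)
        return self.layers(x)

# Example usage:
# model = SimpleNet()
# logits = model(torch.randn(4, 128))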

Abstract

The present application relates to a method, device and storage medium for improving the prediction performance of a deep learning network, and belongs to the field of artificial intelligence and computer technology. The method includes: performing m rounds of iterative training on a first neural network model using a preset training set; verifying the trained neural network models with a preset validation set; determining the corresponding performance indicator according to the scenario in which the trained neural network model is applied; selecting, from the neural network models obtained in the m rounds of iterative training, multiple neural network models whose performance indicators reach the fusion standard; obtaining the network parameters of the selected neural network models and fusing them; and assigning the fused network parameters to a second neural network model to obtain a neural network model with fused network parameters. The method does not increase model training time, improves the efficiency of network prediction, and can meet the differing needs of application scenarios.
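
As an illustrative sketch of the selection-and-fusion step described above, assuming PyTorch-style state dicts and element-wise averaging of parameters; the abstract does not fix the exact fusion operation, and fuse_parameters is a hypothetical helper name.

# Hedged sketch: average the parameters of the checkpoints whose validation
# metric reached the fusion standard, then load the averaged parameters into
# a fresh 'second' model of the same architecture.
import copy
import torch

def fuse_parameters(selected_state_dicts):
    """Element-wise average of several checkpoints' parameters.
    `selected_state_dicts`: state_dicts of the models that met the
    performance criterion; simple averaging is one plausible fusion rule."""
    fused = copy.deepcopy(selected_state_dicts[0])
    for key, ref in fused.items():
        stacked = torch.stack([sd[key].float() for sd in selected_state_dicts])
        fused[key] = stacked.mean(dim=0).to(ref.dtype)
    return fused

# Example usage (SimpleNet as sketched earlier):
# checkpoints = [torch.load(p) for p in selected_checkpoint_paths]
# second_model = SimpleNet()
# second_model.load_state_dict(fuse_parameters(checkpoints))

Because only the fused parameters are loaded into a single second model, prediction still requires just one forward pass per input, unlike output-level ensembling.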

Description

Technical Field
[0001] The present application relates to a method, device and storage medium for improving the prediction performance of a deep learning network, and belongs to the field of artificial intelligence and computer technology.
Background Technology
[0002] Deep learning refers to a collection of algorithms that use various machine learning algorithms to solve problems such as image and text processing on multi-layer neural networks. The core of deep learning is feature learning, which aims to obtain hierarchical feature information through layered networks. To improve the prediction accuracy of a neural network model, neural network models obtained in various situations can be fused.
[0003] Model fusion includes single-model fusion and multi-model fusion. Single-model fusion includes multi-layer feature fusion and network snapshot fusion. Multi-layer feature fusion fuses features that carry complementary information from different layer...
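
For contrast, a rough sketch of conventional multi-model fusion at the output level, which is what causes the long prediction time noted above: every fused model must run a forward pass on each input. The helper name and the softmax-averaging rule are illustrative assumptions, not taken from the patent.

# Output-level fusion: average the predicted distributions of several models.
import torch

def ensemble_predict(models, x):
    """Average the softmax outputs of several trained models (hypothetical helper)."""
    with torch.no_grad():
        probs = [torch.softmax(m(x), dim=-1) for m in models]
    return torch.stack(probs).mean(dim=0)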

Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V20/597; G06N3/045; G06F18/24; G06F18/25
Inventors: 刘通, 牛群遥, 朱林, 孙茂芬, 章勇, 曹李军, 吴仁良, 杨浩
Owner: SUZHOU KEDA TECH