
Method and device for improving the prediction performance of a deep learning network, and a storage medium

A deep learning network performance technology, applied in neural learning methods, biological neural network models, instruments, etc. It addresses the problems of long training time and long prediction time, thereby shortening the development cycle, reducing training time, and improving network prediction performance.

Active Publication Date: 2019-02-22
SUZHOU KEDA TECH

AI Technical Summary

Problems solved by technology

[0005] The present application provides a method, device and storage medium for improving the prediction performance of a deep learning network, which can solve the problems of long training time and long prediction time that arise when the prediction performance of a deep learning network is improved through single-model fusion or multi-model fusion.




Embodiment Construction

[0048] The specific embodiments of the present application are described in further detail below with reference to the drawings and examples. The following examples illustrate the present application but do not limit its scope.

[0049] First, some terms involved in this application are explained.

[0050] Neural Network (NN) model: a complex network system formed by the interconnection of a large number of simple processing units (neurons). It reflects many basic characteristics of human brain function and is a highly complex, nonlinear deep learning system. A deep learning neural network model offers large-scale parallelism, distributed storage and processing, and self-organizing, self-adapting and self-learning capabilities, and can be used for network prediction, for example predicting whether a driver is on the phone while driving, face recognition, and other network prediction scenarios such as iden...
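As an illustration only (not taken from the patent text), a minimal sketch of such a neural network model for one of the prediction scenarios mentioned above, e.g. classifying whether a driver is on the phone, might look as follows in PyTorch; the class name, layer sizes and input dimension are assumptions:

    import torch
    import torch.nn as nn

    class PhoneUseClassifier(nn.Module):
        """Hypothetical small model built from interconnected simple processing units (neurons)."""
        def __init__(self, in_features=128, hidden=64):
            super().__init__()
            self.layers = nn.Sequential(
                nn.Linear(in_features, hidden),  # layer of simple processing units
                nn.ReLU(),                       # nonlinearity
                nn.Linear(hidden, 2),            # two classes: on the phone / not
            )

        def forward(self, x):
            return self.layers(x)

    model = PhoneUseClassifier()
    logits = model(torch.randn(1, 128))          # one network prediction from a dummy input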



Abstract

The invention relates to a method, a device and a storage medium for improving the prediction performance of a deep learning network, belonging to the field of artificial intelligence and computer technology. The method comprises the following steps: iteratively training a first neural network model for m cycles using a preset training set; verifying the trained neural network models with a preset verification set; determining the corresponding performance indexes based on the scenario of the neural network model; selecting, from the neural network models obtained in the m iterations of training, multiple neural network models whose performance indexes reach the fusion standard; acquiring the network parameters of the selected neural network models and fusing them; and assigning the fused network parameters to a second neural network model to obtain a neural network model with the fused network parameters. The method does not increase the training time of the model, improves network prediction efficiency, and can meet the different needs of application scenarios.
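The abstract describes the core procedure: train one model for m cycles, keep the checkpoints whose validation performance reaches the fusion standard, fuse their network parameters, and load the fused parameters into a second model of the same structure. A minimal sketch of the parameter-fusion step, assuming PyTorch state_dicts and a plain arithmetic mean as the fusion rule (the exact fusion formula is not given in this excerpt), could be:

    import copy
    import torch

    def fuse_parameters(selected_state_dicts):
        # Average the network parameters of several checkpoints of the same architecture.
        fused = copy.deepcopy(selected_state_dicts[0])
        for name in fused:
            if fused[name].is_floating_point():
                stacked = torch.stack([sd[name].float() for sd in selected_state_dicts])
                fused[name] = stacked.mean(dim=0)
        return fused

    # selected_state_dicts: checkpoints whose performance index reached the fusion standard
    # second_model: a network with the same structure as the first neural network model
    # second_model.load_state_dict(fuse_parameters(selected_state_dicts))

In this sketch every selected checkpoint is weighted equally; a weighted fusion (for example by validation score) would only change the mean to a weighted sum. Because the fusion happens on parameters rather than predictions, only one model runs at inference time, which is consistent with the claimed gain in prediction efficiency.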

Description

Technical field

[0001] The present application relates to a method, device and storage medium for improving the prediction performance of a deep learning network, and belongs to the field of artificial intelligence and computer technology.

Background technique

[0002] Deep learning refers to a collection of algorithms that use various machine learning (Machine Learning) methods to solve problems such as images and text on multi-layer neural networks. The core of deep learning is feature learning, which aims to obtain hierarchical feature information through layered networks. In order to improve the prediction accuracy of a neural network model, neural network models from various situations can be fused.

[0003] Model fusion includes single-model fusion and multi-model fusion. Single-model fusion includes multi-layer feature fusion and network snapshot fusion. Multi-layer feature fusion fuses features with complementary information from different layer...
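For contrast with the parameter fusion sketched above, multi-model fusion and network snapshot fusion are usually realized by running every fused model at prediction time and combining their outputs, which is where the long prediction time mentioned in this application comes from. A rough illustration only (the model list, softmax outputs and averaging rule are assumptions, not taken from the patent):

    import torch

    def ensemble_predict(models, x):
        # Output-level fusion: every fused model runs its own forward pass,
        # so prediction cost grows with the number of models being fused.
        with torch.no_grad():
            probs = [torch.softmax(m(x), dim=-1) for m in models]
        return torch.stack(probs).mean(dim=0)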


Application Information

Patent Type & Authority Applications(China)
IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V20/597, G06N3/045, G06F18/24, G06F18/25
Inventors: 刘通, 牛群遥, 朱林, 孙茂芬, 章勇, 曹李军, 吴仁良, 杨浩
Owner: SUZHOU KEDA TECH