
Deep neural network training method and system and electronic equipment

A deep neural network training technology, applied in the field of deep neural network training methods, systems, and electronic equipment, which can solve problems such as performance degradation and the inability to improve network performance by simply adding layers.

Active Publication Date: 2018-06-12
BEIJING SENSETIME TECH DEV CO LTD
9 Cites · 18 Cited by

AI Technical Summary

Problems solved by technology

In addition to the computational cost, once a network is already deep, continuing to increase the number of network layers does not improve network performance and may instead degrade it.
In addition, because of problems such as vanishing gradients, how to train a deep neural network effectively has long been a difficult problem.

Method used




Detailed Description of the Embodiments

[0099] Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless specifically stated otherwise, the relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present invention.

[0100] At the same time, it should be understood that, for ease of description, the sizes of the various parts shown in the drawings are not drawn to scale.

[0101] The following description of at least one exemplary embodiment is merely illustrative and in no way limits the present invention or its application or use.

[0102] Technologies, methods, and equipment known to those of ordinary skill in the relevant fields may not be discussed in detail, but, where appropriate, such technologies, methods, and equipment should be regarded as ...



Abstract

Embodiments of the invention disclose a deep neural network training method and system and electronic equipment. The method comprises the following steps: in a forward propagation process, carrying out scene analysis detection on a sample image by utilizing a deep neural network model, so as to obtain a first scene analysis prediction result output by a middle network layer and a second scene analysis prediction result output by a tail network layer; determining a first difference between the first scene analysis prediction result and scene analysis labeling information of the sample image, and a second difference between the second scene analysis prediction result and the scene analysis labeling information of the sample image; and, in a back-propagation process, adjusting parameters of a first network layer according to the second difference and adjusting parameters of a second network layer according to the first difference and the second difference, wherein the first network layer comprises at least one network layer between the middle network layer and the tail network layer, and the second network layer comprises the other network layers except the first network layer. With the method, the system, and the electronic equipment, better network model optimization results can be obtained.
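
The parameter-update rule summarized above can be illustrated with a short training sketch. The following is a minimal PyTorch-style example, assuming the two differences are cross-entropy losses combined as a weighted sum; the module names, layer sizes, the 0.4 auxiliary weight, and the toy data shapes are illustrative assumptions rather than values taken from the patent.

```python
import torch
import torch.nn as nn

class ToyParsingNet(nn.Module):
    """Toy scene-analysis network with an intermediate (auxiliary) prediction head."""

    def __init__(self, num_classes=21):
        super().__init__()
        # "Second network layer": layers up to and including the middle output.
        self.early = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.aux_head = nn.Conv2d(64, num_classes, 1)   # first scene-analysis prediction
        # "First network layer": layers between the middle output and the tail layer.
        self.late = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        )
        self.main_head = nn.Conv2d(64, num_classes, 1)  # second scene-analysis prediction

    def forward(self, x):
        feat = self.early(x)
        aux_pred = self.aux_head(feat)                  # output of the middle network layer
        main_pred = self.main_head(self.late(feat))     # output of the tail network layer
        return aux_pred, main_pred

model = ToyParsingNet()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

image = torch.randn(2, 3, 64, 64)               # sample images (toy shapes)
labels = torch.randint(0, 21, (2, 64, 64))      # scene-analysis labeling information

aux_pred, main_pred = model(image)              # forward propagation
loss_aux = criterion(aux_pred, labels)          # first difference
loss_main = criterion(main_pred, labels)        # second difference

# Back propagation: layers after the middle output receive gradients only from
# loss_main, while the earlier layers receive gradients from both terms.
total_loss = loss_main + 0.4 * loss_aux
optimizer.zero_grad()
total_loss.backward()
optimizer.step()
```

Because the auxiliary prediction is taken before the later layers, backpropagating the summed loss reproduces the behaviour described in the abstract: the layers between the middle and tail outputs are updated only by the second difference, while the earlier layers are updated by both differences.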

Description

Technical field
[0001] The invention relates to computer vision technology, and in particular to a deep neural network training method, system, and electronic equipment.
Background technique
[0002] For neural networks, it can clearly be observed that the expressive power and performance of a network increase as its depth increases. However, a network is not better simply because it is deeper. In addition to the computational cost, when a network is already deep, continuing to increase the number of layers does not improve performance and may instead degrade it. In addition, for deep neural networks, problems such as vanishing gradients have long made training difficult.
Summary of the invention
[0003] The embodiments of the present invention provide a technical solution for deep neural network training.
[0004] According to an aspect of the embodiments of the present invention, a neural network train...
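
Read together with the Abstract above, one way to write the combined training objective is the following; the weight $\lambda$ and the notation are assumptions introduced here for exposition, not symbols taken from the patent.

$$
\mathcal{L}(\theta) \;=\; \mathrm{Diff}_2\big(f_{\mathrm{tail}}(x;\theta),\, y\big) \;+\; \lambda\,\mathrm{Diff}_1\big(f_{\mathrm{mid}}(x;\theta_{\mathrm{pre}}),\, y\big)
$$

where $f_{\mathrm{mid}}$ and $f_{\mathrm{tail}}$ are the predictions output by the middle and tail network layers, $y$ is the scene analysis labeling information, and $\theta_{\mathrm{pre}}$ denotes the parameters up to the middle layer. Because $\mathrm{Diff}_1$ does not depend on the layers between the middle and tail layers, those layers receive gradients only from $\mathrm{Diff}_2$, while the earlier layers receive gradients from both terms; the extra intermediate supervision also shortens the gradient path to the early layers, which helps with the vanishing-gradient problem mentioned in the background.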

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/08
Inventor: 石建萍 (Shi Jianping), 赵恒爽 (Zhao Hengshuang)
Owner: BEIJING SENSETIME TECH DEV CO LTD