
Training method and apparatus of neural network, and object identification method and apparatus

A neural network training technology, applied in the fields of neural learning methods, biological neural network models, and character and pattern recognition. It solves the problems that existing models cannot run on a conventional central processing unit, occupy a large amount of memory, and exceed the small memory space of mobile devices, achieving a small memory footprint, fewer parameters, and improved adaptability.

Status: Inactive. Publication Date: 2018-08-17
SHANGHAI XPARTNER ROBOTICS

AI Technical Summary

Problems solved by technology

[0003] However, current neural network models require a corresponding computing framework, rely on multiple high-performance graphics processing units (GPU, Graphics Processing Unit), and occupy a large amount of memory. They therefore cannot be applied to mobile terminals, such as smartphones or tablets, that use only a conventional central processing unit with a small memory space.

Method used

Figure 1 is a flow chart of the neural network training method provided by the present invention; figure 2 is a flow chart of the object recognition method; image 3 is a schematic structural diagram of the neural network training device.


Examples


Embodiment 1

[0053] Figure 1 is a flow chart of a neural network training method provided by Embodiment 1 of the present invention. This embodiment is applicable to establishing and training a neural network model with a small memory footprint and a fast calculation speed. The method may be executed by a server, and the server may be implemented in software and/or hardware. The method specifically includes:

[0054] S110. Establish an initial neural network model, where the initial neural network model is an ultra-lightweight network SqueezeNet model.

[0055] The ultra-lightweight SqueezeNet network model has few parameters and a small memory footprint. Compared with neural network models such as AlexNet, at the same recognition accuracy the ultra-lightweight SqueezeNet network model occupies nearly 50 times less memory. In this embodiment, an ultra-lightweight network SqueezeNet model is established to reduce the...
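Since the embodiment builds on SqueezeNet, a minimal sketch of its core "Fire" module may help illustrate why the model has so few parameters. The patent names only the architecture; the use of PyTorch/torchvision and the 10-class example below are illustrative assumptions, not part of the disclosure.

```python
# Sketch of a SqueezeNet-style "Fire" module: a 1x1 "squeeze" convolution
# followed by parallel 1x1 and 3x3 "expand" convolutions, which is what
# keeps the parameter count small. Framework choice (PyTorch) is assumed.
import torch
import torch.nn as nn

class Fire(nn.Module):
    def __init__(self, in_ch, squeeze_ch, expand1x1_ch, expand3x3_ch):
        super().__init__()
        self.squeeze = nn.Conv2d(in_ch, squeeze_ch, kernel_size=1)
        self.expand1x1 = nn.Conv2d(squeeze_ch, expand1x1_ch, kernel_size=1)
        self.expand3x3 = nn.Conv2d(squeeze_ch, expand3x3_ch, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Squeeze to few channels, then expand along two parallel paths
        # and concatenate the results on the channel dimension.
        x = self.relu(self.squeeze(x))
        return torch.cat([
            self.relu(self.expand1x1(x)),
            self.relu(self.expand3x3(x)),
        ], dim=1)

# torchvision also ships a reference SqueezeNet implementation:
from torchvision.models import squeezenet1_1
model = squeezenet1_1(num_classes=10)  # hypothetical 10-class recognition task
```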

Embodiment 2

[0078] Figure 2 is a flow chart of an object recognition method provided by Embodiment 2 of the present invention. This embodiment is applicable to a mobile terminal equipped with a neural network model with a small memory footprint that quickly recognizes objects. The method may be executed by a mobile terminal, where the mobile terminal may be a smartphone, a tablet computer, or a robot provided with a conventional central processing unit, and may be implemented in software and/or hardware. The method specifically includes:

[0079] S210. Acquire an image of the object to be recognized.

[0080] The image may be obtained by photographing the object to be recognized, or acquired from network resources or cloud data.

[0081] S220. Perform preprocessing on the image of the object to be recognized according to the preprocessing parameters of the target neural network model.

[0082] In this embodiment, after ac...
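Steps S210 and S220 can be sketched as follows. The 227x227 input size and ImageNet normalization statistics are conventional SqueezeNet defaults assumed here for illustration; the patent excerpt does not state the actual preprocessing parameters.

```python
# Sketch of S210 (acquire an image) and S220 (preprocess it according to
# the target model's preprocessing parameters), plus the implied inference
# step. Input size and normalization values are assumptions.
from PIL import Image
import torch
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(227),                       # common SqueezeNet input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics (assumed)
                         std=[0.229, 0.224, 0.225]),
])

def recognize(model, image_path):
    """Run the target neural network model on one image of the object."""
    img = Image.open(image_path).convert("RGB")       # S210: acquire the image
    batch = preprocess(img).unsqueeze(0)              # S220: preprocess, add batch dim
    model.eval()
    with torch.no_grad():
        logits = model(batch)
    return logits.argmax(dim=1).item()                # predicted class index
```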

Embodiment 3

[0094] Image 3 is a schematic structural diagram of a neural network training device provided in Embodiment 3 of the present invention. The device includes:

[0095] The model building module 310 is used to set up an initial neural network model, wherein the initial neural network model is an ultra-lightweight network SqueezeNet model;

[0096] The first preprocessing module 320 is used to acquire training samples and preprocess the training samples;

[0097] The sample input module 330 is used to input the preprocessed training samples into the initial neural network model, determine a first output according to the initial neural network model, and determine a loss value between the first output and a preset expected output;

[0098] The model training module 340 is configured to adjust the network parameters of the initial neural network model according to the loss value to generate a target neural network model.

[0099] Optionally, the initial neural network model includes a preset number of st...
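How the four modules (310-340) compose into a training procedure can be sketched as below: build the model, feed preprocessed samples, compute the loss between the first output and the expected output, and adjust the network parameters. The optimizer, learning rate, loss function, and class count are assumptions; the excerpt does not specify them.

```python
# Hedged sketch of the training flow implied by modules 310-340.
# Hyperparameters and the SGD/cross-entropy choices are assumptions.
import torch
import torch.nn as nn
from torchvision.models import squeezenet1_1

def train(loader, num_classes=10, epochs=5, lr=1e-3):
    model = squeezenet1_1(num_classes=num_classes)    # model building module 310
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for samples, expected in loader:              # samples already preprocessed (320)
            output = model(samples)                   # first output (module 330)
            loss = criterion(output, expected)        # loss vs. expected output (330)
            optimizer.zero_grad()
            loss.backward()                           # back-propagate the loss
            optimizer.step()                          # adjust parameters (module 340)
    return model                                      # the target neural network model
```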



Abstract

The embodiment of the invention discloses a training method and apparatus of a neural network, and an object identification method and apparatus. The training method includes the steps of: establishing an initial neural network model, wherein the initial neural network model is an ultra-lightweight SqueezeNet network model; acquiring a training sample and preprocessing the training sample; inputting the preprocessed training sample into the initial neural network model, determining a first output according to the initial neural network model, and determining a loss value between the first output and a preset expected output; and adjusting the network parameters of the initial neural network model according to the loss value to generate a target neural network model. The training method establishes a high-precision target neural network with few parameters and a small memory footprint, so that the trained target neural network model can be applied to a mobile terminal without depending on a GPU (Graphics Processing Unit), thus improving the adaptability of the target neural network model.

Description

Technical field

[0001] Embodiments of the present invention relate to image processing technology, and in particular to a neural network training method, an object recognition method, and corresponding devices.

Background technique

[0002] With the rapid development of deep learning technology, deep convolutional neural networks have achieved major breakthroughs and rapid progress in object recognition, single-object localization, multi-object detection, and image semantic and instance segmentation. In top international large-scale object recognition competitions, the recognition accuracy of object recognition technology based on deep convolutional neural networks has surpassed human performance.

[0003] However, the current neural network model requires a corresponding computing framework, relies on multiple high-performance graphics processors (GPU, Graphics Processing Unit), and occupies a large amount of memory; it can...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N 3/08, G06K 9/62
CPC: G06N 3/084, G06F 18/214
Inventors: 恽为民, 任翰驰, 夏晓斌, 庞作伟
Owner: SHANGHAI XPARTNER ROBOTICS