
Method for reducing transmission consumption of neural network model update on mobile device

A neural network model and mobile device technology, applied in the field of mobile device sharing, which solves problems such as the loss of re-learning ability and achieves the effects of restoring the ability to retrain and reducing transmission consumption.

Active Publication Date: 2020-10-02
CHONGQING UNIV

AI Technical Summary

Problems solved by technology

[0004] Although a mobile terminal can execute a neural network, it still needs a powerful cloud server to complete the training of that network; as a result, a neural network model transplanted to the mobile terminal is frozen and loses the ability to re-learn.




Detailed Description of the Embodiments

[0026] The present invention is further described below in conjunction with the accompanying drawings and embodiments:

[0027] The operating environment of the present invention consists of an embedded mobile device, on which a neural network can be deployed to perform intelligent recognition tasks, and a cloud server, which can train the neural network.

[0028] As shown in Figure 1, the present invention comprises the following steps:

[0029] Step 1. The mobile device selects data with low prediction confidence and uploads it to the cloud;

[0030] When the neural network makes a prediction on input data (for example, when identifying image information), it outputs a probability for each category the data may belong to. These probabilities represent the confidence in each category, and the category with the highest probability is selected as the final predicted category. When all of the output category probabilities are small, the neural network is very uncertain about...
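The patent text does not give concrete code for this selection step, so the following is only a minimal sketch of the idea, assuming a PyTorch classifier; the names select_low_confidence and CONFIDENCE_THRESHOLD and the 0.6 cut-off are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical sketch of Step 1: on-device selection of low-confidence samples.
import torch
import torch.nn.functional as F

CONFIDENCE_THRESHOLD = 0.6  # assumed cut-off; the patent does not fix a value

def select_low_confidence(model, samples):
    """Return the samples whose highest class probability falls below the threshold."""
    selected = []
    model.eval()
    with torch.no_grad():
        for x in samples:
            logits = model(x.unsqueeze(0))       # add a batch dimension
            probs = F.softmax(logits, dim=1)     # per-category confidences
            top_prob = probs.max().item()        # confidence of the predicted category
            if top_prob < CONFIDENCE_THRESHOLD:  # the network is uncertain about this sample
                selected.append(x)
    return selected  # only these samples are uploaded to the cloud
```

In this reading, samples on which the network is already confident never leave the device, which is what keeps the uplink cost low without reducing the benefit of relearning.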



Abstract

The invention discloses a method for reducing the transmission consumption of neural network model updates on the mobile device side. The method comprises the following steps. Step 1: the mobile device selects predicted picture information and uploads it to the cloud for relearning, which reduces the cost of uploading data without affecting the performance improvement brought by relearning. Step 2: at the cloud, the new data uploaded by the mobile device is mixed with the old data set to form a larger training set, and the neural network model is retrained on it to learn the new data, which avoids the catastrophic forgetting problem of incremental learning in neural networks. Step 3: the cloud extracts a small amount of the weights of the new model and transmits them to the mobile device, where the old model already deployed on the device is updated so that it reaches the recognition performance of the new model, which reduces the data transmission cost incurred when updating old neural network models on mobile devices. The disclosed method therefore has the technical effect of effectively reducing the transmission consumption cost of neural network model updates on the mobile device side.
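The abstract does not specify how the "small amount of weight" is chosen or transmitted, so the following is only a minimal sketch of Steps 2 and 3, assuming a PyTorch model and assuming that the transmitted weights are the ones that changed the most during retraining; the names build_mixed_loader, extract_weight_update, apply_weight_update and the 5% fraction are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of Steps 2-3: retrain on mixed data in the cloud, then send
# only a small weight update down to the mobile device.
import torch
from torch.utils.data import ConcatDataset, DataLoader

TOP_FRACTION = 0.05  # assumed share of weights sent back; not specified in the patent

def build_mixed_loader(old_dataset, new_dataset, batch_size=64):
    """Step 2: mix newly uploaded data with the old data set before retraining,
    so the retrained model does not catastrophically forget the old data."""
    return DataLoader(ConcatDataset([old_dataset, new_dataset]),
                      batch_size=batch_size, shuffle=True)

def extract_weight_update(old_model, new_model, top_fraction=TOP_FRACTION):
    """Step 3 (cloud side): keep only the most-changed weights per parameter tensor."""
    update = {}
    new_params = dict(new_model.named_parameters())
    for name, old_p in old_model.named_parameters():
        delta = (new_params[name].detach() - old_p.detach()).abs().flatten()
        k = max(1, int(delta.numel() * top_fraction))
        _, idx = delta.topk(k)  # indices of the largest changes
        update[name] = (idx, new_params[name].detach().flatten()[idx])
    return update  # this small dictionary is what gets transmitted to the device

def apply_weight_update(deployed_model, update):
    """Step 3 (device side): patch the old deployed model with the received weights."""
    with torch.no_grad():
        for name, param in deployed_model.named_parameters():
            if name in update:
                idx, values = update[name]
                param.data.view(-1)[idx] = values
    return deployed_model
```

Under these assumptions the downlink payload is roughly top_fraction of the full model size plus the indices, which is where the claimed reduction in update transmission cost would come from.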

Description

Technical field

[0001] The invention belongs to the technical field of mobile device sharing, and in particular relates to a method for reducing the transmission consumption of neural network model updates on the mobile device side.

Background technique

[0002] Neural networks include convolutional neural networks, recurrent neural networks, and others. At present, neural networks have been applied to different large-scale machine learning problems such as speech recognition, image recognition, and natural language processing, and applying them on the mobile Internet has become an inevitable trend. However, because of the large amount of computation and the large memory consumption of neural networks, mobile terminals cannot undertake the use and training of neural networks; instead, the task is sent to the cloud for execution and the result is returned to the mobile terminal. However, this approach requires the device to upload data to the cloud and download results from the cloud in real time, so there is network ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L29/08; G06N3/08
CPC: G06N3/08; H04L67/06; H04L67/10
Inventor: 刘铎, 李世明, 向超能, 梁靓
Owner: CHONGQING UNIV