
Neural network model training data processing method, device, electronic equipment and storage medium

A neural network model and training data technology, applied in the field of machine learning, addressing problems such as the high hardware and data-storage costs of training on large data sets, the complexity of training data processing, and the difficulty of deploying training data, with the effects of enhancing generalization ability, enabling large-scale application, and reducing overfitting.

Pending Publication Date: 2021-10-29
TENCENT TECH (SHENZHEN) CO LTD
Cites: 0 · Cited by: 6

AI Technical Summary

Problems solved by technology

However, it is very difficult to deploy such large-scale training data on resource-constrained embedded systems, and the performance of a neural network obtained directly from small-scale training data is much lower than that of a network trained on larger-scale data.
Related technologies can achieve data compression through data distillation when processing training data, but they cannot distill unlabeled data, which makes the processing of training data complicated; at the same time, the large amount of training data makes the hardware and data-storage costs of training neural network models too high, which is not conducive to large-scale adoption.




Detailed Description of the Embodiments

[0061] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention will be further described below in connection with the accompanying drawings. The described embodiments are not to be considered as limiting the invention; all other embodiments obtained by those of ordinary skill in the art without creative labor fall within the scope of the invention.

[0062] In the following description, "some embodiments" describes a subset of all possible embodiments. It can be understood that "some embodiments" may refer to the same subset or to different subsets of all possible embodiments, and that these may be combined with each other where no conflict arises.

[0063] The nouns and terms involved in the embodiments of the present invention are described with reference to those embodiments, and are applicable to the following expl...



Abstract

The invention provides a neural network model training data processing method, device, equipment, and storage medium. The method comprises: performing rotation self-supervision processing on a first training data set to form a corresponding second training data set; determining an initial training data set corresponding to the second training data set; determining, through a first training data processing network and a second training data processing network, gradient parameters matched with the initial training data set according to the second training data set and the initial training data set; and updating the initial training data set to determine a target training data set matched with the target neural network model. Thus, while reducing the total amount of training data, the accuracy of training the neural network model with a small-scale data set can be stably improved, the generalization ability of the target neural network model is enhanced, the trained target neural network model can be conveniently deployed on mobile terminals, and large-scale application of the target neural network model is realized.
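The abstract's first step, rotation self-supervision, is a standard pretext task in which each image is copied at 0/90/180/270 degrees and labeled with its rotation index, so a supervisory signal exists even for unlabeled data. The patent text does not disclose its exact implementation; the following is a minimal numpy sketch of that general technique, with illustrative names.

```python
import numpy as np

def rotation_self_supervision(images):
    """Expand a batch of images into four rotated copies (0, 90, 180, 270
    degrees), each labeled with its rotation index 0..3. This is the common
    rotation pretext task, not the patent's specific procedure.
    `images` has shape (N, H, W); returns (4N, H, W) images and (4N,) labels."""
    rotated, labels = [], []
    for k in range(4):  # k quarter-turns counter-clockwise
        rotated.append(np.rot90(images, k=k, axes=(1, 2)))
        labels.append(np.full(len(images), k))
    return np.concatenate(rotated), np.concatenate(labels)

# tiny demo batch: 2 images of size 4x4
batch = np.arange(32, dtype=float).reshape(2, 4, 4)
aug, lab = rotation_self_supervision(batch)
print(aug.shape, lab.shape)  # (8, 4, 4) (8,)
```

A model trained to predict these rotation labels learns useful features without any human annotations, which is why this kind of self-supervision sidesteps the unlabeled-data limitation the background section attributes to plain data distillation.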

Description

Technical field

[0001] The present invention relates to machine learning technology, and in particular to a training data processing method, device, electronic equipment, and storage medium for neural network models.

Background technique

[0002] In artificial intelligence, deep neural networks have achieved very good performance on many computer vision tasks. In general, the larger the training data set, the better the performance of the trained neural network model. However, deploying such large-scale training data on resource-constrained embedded systems is very difficult, and the performance of a neural network obtained directly from small-scale training data is much lower than that of a network trained on larger-scale data. Related technologies can compress data through data distillation when processing training data, but they cannot distill unlabeled data, so the processing of training data is complex; at the same time, the amount of training data is...
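The data-distillation idea referenced in the background (and the "gradient parameters matched with the initial training data set" step in the abstract) is generally realized by optimizing a small synthetic data set so that the model gradient it induces matches the gradient induced by the real data. The patent does not publish its implementation; below is a minimal gradient-matching sketch on a linear regression model, using numpy and numerical differentiation, with all names and sizes chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def model_grad(w, X, y):
    # gradient of the mean squared error 0.5 * mean((Xw - y)^2) w.r.t. w
    return X.T @ (X @ w - y) / len(X)

def matching_loss(X_syn, y_syn, w, g_real):
    # squared distance between the gradient on synthetic data and on real data
    diff = model_grad(w, X_syn, y_syn) - g_real
    return float(diff @ diff)

# toy "real" set (20 points, 3 features) and a 2-point synthetic set
X_real = rng.normal(size=(20, 3))
y_real = X_real @ np.array([1.0, -2.0, 0.5])
X_syn = rng.normal(size=(2, 3))
y_syn = rng.normal(size=2)
w = rng.normal(size=3)
g_real = model_grad(w, X_real, y_real)

# optimize the synthetic *data* (not the model) by gradient descent on the
# matching loss; gradients w.r.t. X_syn are estimated by finite differences
lr, eps = 0.01, 1e-5
before = matching_loss(X_syn, y_syn, w, g_real)
for _ in range(300):
    G = np.zeros_like(X_syn)
    base = matching_loss(X_syn, y_syn, w, g_real)
    for idx in np.ndindex(X_syn.shape):
        X_p = X_syn.copy()
        X_p[idx] += eps
        G[idx] = (matching_loss(X_p, y_syn, w, g_real) - base) / eps
    X_syn -= lr * G
after = matching_loss(X_syn, y_syn, w, g_real)
print(before, "->", after)
```

In practice this is done with a deep network and automatic differentiation over many model parameter states, but the sketch shows the core mechanism: the compressed data set is the optimization variable, and the objective is agreement between real and synthetic gradients.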

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/62; G06N 3/08
CPC: G06N 3/08; G06F 18/214
Inventor 牛帅程吴家祥谭明奎
Owner TENCENT TECH (SHENZHEN) CO LTD