
Electronic equipment, task processing method and neural network training method

A neural-network task-processing technology, applied to biological neural network models, neural architectures, neural learning methods, etc. It addresses the problems of the high computational and storage cost of neural networks and the high complexity of task processing, achieving a reduction in computational cost, storage cost, and complexity.

Pending Publication Date: 2020-02-04
BEIJING SAMSUNG TELECOM R&D CENT +1

AI Technical Summary

Problems solved by technology

For a given task, the size of the corresponding neural network model is the sum of the model sizes of its subtasks. In addition, a task often needs to support multiple languages; for such tasks, the model size and the amount of computation are multiplied by the number of supported languages. This results in high computational and storage costs for the neural network, which in turn leads to high task-processing complexity.


Examples


Embodiment 1

[0168] An embodiment of the present application provides a neural-network-based task processing method. The method includes: obtaining the input data corresponding to the task, and extracting from it the feature information corresponding to each neural network base (for example, the feature information corresponding to neural network base 1, neural network base 2, ..., neural network base n). The extracted feature information is input into the corresponding neural network base, and the outputs of the bases are then weighted and summed according to the weight information each base has for the task, or for each subtask of the task, to obtain the processing result of the task or subtask. For example, when the task contains m subtasks, the output results of each neural network base are based on S...
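A minimal sketch of this weighted combination, assuming each base is an arbitrary callable and `weights` holds one weight vector per subtask (the names and shapes here are illustrative, not taken from the patent):

```python
import numpy as np

def process_task(input_data, extractors, bases, weights):
    """Run each neural network base on its own extracted features, then
    combine the base outputs with per-subtask weight vectors (weighted sum)."""
    # One output vector per base: base_i(feature_extractor_i(input))
    outputs = np.stack([base(extract(input_data))
                        for base, extract in zip(bases, extractors)])
    # weights: (n_subtasks, n_bases); result: one output vector per subtask
    return weights @ outputs

# Toy usage: two bases, one subtask that averages their outputs
extractors = [lambda x: x, lambda x: x]
bases = [lambda f: f * 1.0, lambda f: f + 1.0]
weights = np.array([[0.5, 0.5]])
result = process_task(np.array([2.0]), extractors, bases, weights)
```

In a real system the extractors and bases would be trained networks; the point of the sketch is only that the bases are shared while each task or subtask keeps its own small weight vector.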

Embodiment 2

[0224] Another possible implementation of the embodiments of the present application also includes, on the basis of the first embodiment, the operations shown in the second embodiment, wherein:

[0225] This embodiment introduces at least the process of online adaptive learning (online updating). The process includes updating the neural network bases and/or updating the neural network weight information corresponding to the task. If the task contains subtasks, updating the weight information corresponding to the task includes updating the neural network base weight information corresponding to at least one subtask. Specifically, this comprises step S406 (not marked in the figure), which can be performed before, after, or simultaneously with any of steps S401 to S405.
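As one way to picture updating the per-task weights online while the shared bases stay frozen, here is a single gradient step on a squared-error loss (the loss and optimizer are my assumptions; the patent only states that the weight information is updated):

```python
import numpy as np

def online_update_weights(weights, base_outputs, target, lr=0.1):
    """One online-adaptation step: adjust only the base weights for a task,
    leaving the neural network bases themselves unchanged."""
    # weights: (n_bases,); base_outputs: (n_bases, d); target: (d,)
    pred = weights @ base_outputs            # combined prediction, shape (d,)
    grad = base_outputs @ (pred - target)    # d/dw of 0.5 * ||pred - target||^2
    return weights - lr * grad

# Toy usage: both bases output 1.0, the target is 1.0, weights start at zero
w = online_update_weights(np.zeros(2), np.ones((2, 1)), np.array([1.0]))
```

Because only the small weight vector changes, such an update is cheap enough to run on-device, which matches the stated goal of reducing computation and storage cost.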

[0...

Embodiment 3

[0273] This embodiment can be implemented on the basis of either of Embodiments 1 and 2, and can also be implemented independently of them. If this embodiment is implemented on the basis of Embodiments 1 to 2, then steps S501 to S504 of this embodiment can be carried out before step S401, as shown in Figure 5a. The operations performed in steps S505 to S509 are similar to those performed in steps S401 to S405 and are not repeated here; the execution sequence is as shown in Figure 5b, wherein:

[0274] Embodiment 3 is the process of training the neural network bases offline, as shown in Figure 6. It mainly includes collecting training data, extracting features from the training data, training the constructed neural network bases on the extracted features, and then training the weights of the neural network bases corresponding to the task, as follows:

[0275] Step S501, constructing each neural networ...
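The second stage of this offline pipeline, fitting the per-task combination weights once the bases are trained and frozen, could look like the following; least squares is an illustrative choice here, since the patent does not name a particular optimizer:

```python
import numpy as np

def train_task_weights(base_outputs, targets):
    """Fit the per-task base weights on training data, with the neural
    network bases frozen after the first training stage."""
    # base_outputs: (n_samples, n_bases) -- stacked base predictions
    # targets: (n_samples,) -- desired task outputs
    w, *_ = np.linalg.lstsq(base_outputs, targets, rcond=None)
    return w

# Toy usage: two samples, two bases; the exact-fit weights are [2, 3]
w = train_task_weights(np.array([[1.0, 0.0], [0.0, 1.0]]),
                       np.array([2.0, 3.0]))
```

Adding a new task then only requires solving for a new weight vector over the existing bases, rather than training a full new model per task.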



Abstract

The embodiments of the invention provide an electronic device, a task processing method, and a neural network training method. The method comprises: obtaining input data corresponding to a task; performing corresponding processing on the input data through each neural network corresponding to the task; and then determining the processing result of the task based on the processing result of each neural network and the neural network weight information corresponding to the task. According to the embodiments of the invention, the computational cost and storage cost of the neural network are reduced, which in turn reduces the task-processing complexity.

Description

technical field

[0001] The present application relates to the technical field of machine learning, and in particular to an electronic device, a method for task processing, and a method for training a neural network.

Background technique

[0002] With the development of information technology, the field of machine learning has also developed. In this field, neural network methods have attracted increasing attention from researchers and are ever more widely used. In machine learning, a task is usually decomposed into several subtasks. For example, a complete voice wake-up task can be decomposed into an automatic anti-spoofing speech recognition subtask (optional), a voice activity point detection subtask, a language recognition subtask (optional), a keyword detection subtask, and a speaker recognition subtask; its processing flow is as shown in Figure 1. In the prior art, a corresponding neural...


Application Information

IPC(8): G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045
Inventors: 王宪亮, 楼晓雁, 韩文静, 朱璇, 王仁宇, 童颖, 宋黎明
Owner: BEIJING SAMSUNG TELECOM R&D CENT