
Condition calculation for continuous learning

A technique for computing devices and functional components, applied in the field of conditional computation for continual learning, addressing problems such as data from an earlier task being unavailable when learning the next task.

Pending Publication Date: 2022-06-21
QUALCOMM INC

AI Technical Summary

Problems solved by technology

Because traditional continual learning occurs sequentially, once the first task has been learned, the data from that first task is no longer available when learning the next task.

Method used



Examples


Example 1

[0132] Example 1. A method for learning in a neural network, comprising: receiving, by a processor in a computing device, an input at a layer in the neural network, the layer including a plurality of filters; determining, by the processor, based on the received input and a first task, a first set of filters of the plurality of filters to be applied to the received input; and applying, by the processor, the first set of filters to the received input to generate an activation for the first task.
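The task-conditioned filter selection in Example 1 can be sketched as follows. This is a minimal illustration, not the patented method: the gating here is a fixed pseudo-random subset per task, standing in for whatever learned selection mechanism the claims cover, and all names (`ConditionalLayer`, `select_filters`) are hypothetical.

```python
import random

class ConditionalLayer:
    """Toy layer whose filters are gated per task (illustrative sketch)."""

    def __init__(self, num_filters, input_size, seed=0):
        rng = random.Random(seed)
        # each "filter" is a weight vector applied by dot product
        self.filters = [[rng.uniform(-1, 1) for _ in range(input_size)]
                        for _ in range(num_filters)]
        # task id -> indices of the filters selected for that task
        self.task_gates = {}

    def select_filters(self, task_id, k=2):
        # stand-in for a learned gating network: fix a subset per task
        if task_id not in self.task_gates:
            rng = random.Random(task_id)
            self.task_gates[task_id] = sorted(
                rng.sample(range(len(self.filters)), k))
        return self.task_gates[task_id]

    def forward(self, x, task_id):
        # apply only the filters selected for this task to the input
        selected = self.select_filters(task_id)
        return [sum(w * v for w, v in zip(self.filters[i], x))
                for i in selected]

layer = ConditionalLayer(num_filters=4, input_size=3)
act = layer.forward([1.0, 0.5, -0.5], task_id=0)
```

Only the gated subset of filters contributes compute, which is the point of conditional computation: per-task cost scales with the selected filters, not the full layer.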

Example 2

[0133] Example 2. The method of Example 1, further comprising: determining, by the processor, upon completion of the first task, a first set of important filters among the first set of filters; and fixing, by the processor, the weight parameters of the first set of important filters such that the weight parameters of the first set of important filters cannot be updated during execution of tasks other than the first task.
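The freezing step in Example 2 can be sketched as a filter bank that rejects updates to protected indices. This is an illustrative assumption of one way to implement "fixing" weight parameters; the class and method names are hypothetical.

```python
class FilterBank:
    """Filter weights plus a frozen set that blocks further updates."""

    def __init__(self, num_filters):
        self.weights = [0.0] * num_filters
        self.frozen = set()  # indices whose weights may not change

    def mark_important(self, indices):
        # freeze the filters deemed important for the completed task
        self.frozen.update(indices)

    def update(self, index, new_weight):
        # weight updates are silently rejected for frozen filters
        if index in self.frozen:
            return False
        self.weights[index] = new_weight
        return True

bank = FilterBank(4)
bank.weights = [0.9, 0.1, 0.8, 0.2]
# after the first task, filters 0 and 2 are judged important
bank.mark_important([0, 2])
bank.update(0, 0.0)   # rejected: filter 0 is frozen
bank.update(1, 0.5)   # applied: filter 1 is free
```

In a framework such as PyTorch, the same effect is commonly achieved by setting `requires_grad = False` on the protected parameters so later tasks cannot overwrite knowledge from the first task.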

Example 3

[0134] Example 3. The method of Example 2, further comprising: reinitializing, by the processor, before performing the next task, the weight parameters of all filters of the plurality of filters not included in the first set of important filters.
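The reinitialization step in Example 3 can be sketched in a few lines. This is a minimal illustration under assumed details (a uniform reinitialization range and a seeded generator); the function name is hypothetical.

```python
import random

def reinitialize_free_filters(weights, important, seed=0):
    """Reinitialize every filter weight not protected as important,
    freeing that capacity for the next task."""
    rng = random.Random(seed)
    return [w if i in important else rng.uniform(-0.1, 0.1)
            for i, w in enumerate(weights)]

old = [0.9, 0.1, 0.8, 0.2]
new = reinitialize_free_filters(old, important={0, 2})
```

The important filters keep their fixed values (Example 2), while the remaining filters are reset so the next task starts from fresh capacity rather than stale, task-specific weights.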



Abstract

Aspects provide methods for learning, such as continual learning, that support task-incremental learning using a multi-head classification architecture. Various aspects may enable conditional computation to support multi-head classification. Aspects also provide methods for learning, such as continual learning, that support class-incremental learning using a single-head classification architecture. Various aspects may enable conditional computation to support single-head classification by predicting the task associated with a given test input and selecting an associated classification head based at least in part on the task prediction.
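The single-head routing described in the abstract (predict the task, then select the matching classification head) can be sketched as below. The nearest-prototype task predictor is an assumption standing in for whatever learned predictor the patent covers, and all names (`predict_task`, `classify`, the task/class labels) are hypothetical.

```python
def predict_task(x, task_prototypes):
    """Predict the task whose prototype lies closest to the input."""
    def sq_dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(task_prototypes, key=lambda t: sq_dist(x, task_prototypes[t]))

def classify(x, task_prototypes, heads):
    # route the input to the classification head of the predicted task
    task = predict_task(x, task_prototypes)
    return task, heads[task](x)

prototypes = {"task_a": [0.0, 0.0], "task_b": [1.0, 1.0]}
heads = {
    "task_a": lambda x: "class_a" if x[0] < 0.5 else "class_b",
    "task_b": lambda x: "class_c" if x[1] < 1.5 else "class_d",
}
task, label = classify([0.9, 1.1], prototypes, heads)  # → ("task_b", "class_c")
```

At test time no task label is given; the predicted task supplies the missing routing signal that a multi-head architecture would otherwise require.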

Description

[0001] Related applications [0002] This application claims the benefit of priority to U.S. Provisional Application No. 62/935,147, filed November 14, 2019, entitled "Conditional Computation For Continual Learning," the entire contents of which are hereby incorporated herein by reference for all purposes. Background [0003] Deep neural networks are heavily used in computing devices for a variety of tasks, including scene detection, facial recognition, image classification, and labeling. These networks use a multi-layered architecture, in which each layer receives an input, performs a computation on that input, and generates an output or "activation". The outputs or activations of the nodes in the first layer become the inputs of the nodes in the second layer, the activations of the nodes in the second layer become the inputs of the nodes in the third layer, and so on. Therefore, the computation in the deep neural network is distributed over the entire processing nodes tha...
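The layer chaining described in the background (each layer's activations become the next layer's inputs) can be sketched as a minimal forward pass. This is a generic illustration of the standard pattern, not the patented architecture; the function names are hypothetical.

```python
def relu(v):
    """Elementwise rectified linear activation."""
    return [max(0.0, x) for x in v]

def dense(weights, inputs):
    # one layer: each weight row produces one pre-activation value
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

def forward(layers, inputs):
    # the activations of each layer become the inputs of the next
    a = inputs
    for w in layers:
        a = relu(dense(w, a))
    return a

layers = [
    [[1.0, -1.0], [0.5, 0.5]],   # layer 1: 2 inputs -> 2 activations
    [[1.0, 1.0]],                # layer 2: 2 inputs -> 1 activation
]
out = forward(layers, [2.0, 1.0])  # → [2.5]
```

Layer 1 maps [2.0, 1.0] to [1.0, 1.5]; layer 2 sums those activations to 2.5, showing how computation is distributed across the chain of layers.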

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: G06N3/084, G06N3/048, G06N3/045, G06F18/2414, G06N3/08, G06N3/063, G06F18/285, G06N3/0464, G06N3/0495, G06N20/10
Inventor: D. Abati, B. Ehteshami Bejnordi, J. M. Tomczak, T. P. F. Blankevoort
Owner QUALCOMM INC