Improved artificial neural network method, electronic device for language modeling and prediction

A technology relating to artificial neural networks and electronic devices, applied in the field of improved artificial neural networks and electronic devices for language modeling and prediction.

Active Publication Date: 2022-04-01
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

The limited resources available on mobile devices not only prevent large and complex applications that include ANNs from performing at an acceptable level; the large size of such applications also prevents end users from installing them on devices with limited storage.



Examples


Embodiment Construction

[0038] Figure 1 depicts a simple ANN 100 according to the prior art. In essence, an artificial neural network such as ANN 100 is a chain of mathematical functions organized in direction-dependent layers, such as input layer 101, hidden layer 102, and output layer 103, each layer comprising a plurality of units or nodes 110-131. The ANN 100 is called a "feed-forward neural network" because the output of each layer 101-103 is used as the input to the next layer (or as the output of the ANN 100 in the case of the output layer 103), and there is no backward step or loop. It should be understood that the number of units 110-131 depicted in Figure 1 is exemplary, and that a typical ANN includes many more units in each layer 101-103.
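By way of illustration only, the following Python sketch shows a three-layer feed-forward network of the kind described in paragraph [0038]: each layer's output is passed forward to the next layer and there are no loops. The layer sizes, the random weights, and the sigmoid activation are assumptions made for the example and are not taken from the patent.

# Minimal sketch of a feed-forward ANN like the prior-art network of Figure 1.
# Layer sizes, weights, and the sigmoid activation are illustrative assumptions,
# not values taken from the patent.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Input layer 101 (3 units), hidden layer 102 (4 units), output layer 103 (2 units).
W_hidden = rng.normal(size=(4, 3))   # weights from input layer to hidden layer
W_output = rng.normal(size=(2, 4))   # weights from hidden layer to output layer

def forward(x):
    # Each layer's output feeds the next layer; there is no backward step or
    # loop, which is what makes the network "feed-forward".
    hidden = sigmoid(W_hidden @ x)
    return sigmoid(W_output @ hidden)

print(forward(np.array([1.0, 0.0, 0.0])))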

[0039] In operation of the ANN 100, an input is provided at the input layer 101. This typically involves mapping the real-world input into a discrete form suitable for the input layer 101 (i.e., each unit 110-112 that can be input to the input layer 10...
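The mapping of a real-world input onto discrete input units can be pictured with a simple one-hot scheme, sketched below in Python. The three-word vocabulary and the one-hot encoding itself are illustrative assumptions; the patent does not prescribe this particular mapping.

# Sketch of mapping a real-world input onto discrete input units, assuming a
# one-hot scheme in which each input-layer unit corresponds to one vocabulary
# word. The vocabulary and the one-hot choice are illustrative assumptions.
import numpy as np

vocab = ["the", "cat", "sat"]          # one input unit per word (e.g. units 110-112)
index = {word: i for i, word in enumerate(vocab)}

def to_input_vector(word):
    x = np.zeros(len(vocab))
    x[index[word]] = 1.0               # activate only the unit for this word
    return x

print(to_input_vector("cat"))          # [0. 1. 0.]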



Abstract

The present invention relates to an improved artificial neural network for predicting one or more next items in a sequence of items based on an input sequence of items. The improved artificial neural network dramatically reduces memory requirements, making it suitable for use in electronic devices such as mobile phones and tablets. The invention includes an electronic device on which an improved artificial neural network operates, and a method of using the improved artificial neural network to predict one or more next items in a sequence.

Description

Background technique

[0001] Modern mobile electronic devices, such as mobile phones and tablet computers, typically receive typed user input via soft keyboards, which include various additional functions beyond simply receiving keyboard input. One of these additional functions is the ability to predict the next word the user will enter via the keyboard, given the word or words entered previously. The predictions are typically generated using n-gram based predictive language models such as those described in detail in European Patent No. 2414915.

[0002] One of the often criticized shortcomings of n-gram based predictive language models is that they rely only on the statistical dependencies of the previous few words. In contrast, artificial neural network (ANN) and recurrent neural network (RNN) language models have been shown in the art to perform better than n-gram models in language prediction (Recurrent Neural Network Based Language Model, Mikolov et al., 2010; RNNLM-Recur...
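To make the n-gram shortcoming concrete, the following toy Python bigram predictor chooses the next word using only counts of which words followed the immediately preceding word. The training text and the bigram order are assumptions made for illustration; they do not reproduce the model of European Patent No. 2414915 or the patented method.

# Toy bigram next-word predictor illustrating the n-gram approach described
# above: the prediction depends only on the immediately preceding word.
from collections import Counter, defaultdict

text = "the cat sat on the mat the cat ran".split()

bigram_counts = defaultdict(Counter)
for prev, nxt in zip(text, text[1:]):
    bigram_counts[prev][nxt] += 1

def predict_next(prev_word, k=2):
    # Return the k words most frequently observed after prev_word.
    return [w for w, _ in bigram_counts[prev_word].most_common(k)]

print(predict_next("the"))   # ['cat', 'mat']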

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06N3/04
CPC: G06N3/084; G06N3/044; G06N3/04; G06N3/088
Inventors: M. Rei; M. J. Wilson
Owner: MICROSOFT TECH LICENSING LLC