Lip-reading recognition method and mobile terminal

A lip-reading recognition method and mobile terminal, applied in speech recognition, neural learning methods, and character and pattern recognition. The invention addresses problems such as voice calls being unsuitable in certain settings, failing to protect user privacy, and disturbing the normal activities of people nearby; its stated effects include improved training accuracy, reduced training time, and reduced impact on surroundings.

Active Publication Date: 2018-06-22
BOE TECH GRP CO LTD
Cites: 7 · Cited by: 11

AI Technical Summary

Problems solved by technology

[0003] The inventors found that in actual use, on the one hand, the content of mobile phone communication is often private, and making a voice call involving private content cannot protect the user's privacy; on the other hand, many occasions are not suitable for answering calls by voice, for example during a meeting or in a library, where a voice call would inevitably disturb the normal activities of people nearby.

Method used



Examples


Embodiment 1

[0055] Figure 1 is a flowchart of the lip-reading recognition method provided by an embodiment of the present invention. As shown in Figure 1, the method is applied in a mobile terminal in which a voiced mode and a silent mode are set, and it includes the following steps:

[0056] Step 100: in the voiced mode, train the deep neural network.

[0057] Specifically, the voiced mode refers to the state in which the user is making a voice call.

[0058] As a first alternative, step 100 includes: collecting lip images for training and the corresponding voice data; obtaining corresponding image data from the lip images for training; and training the deep neural network based on the image data and the voice data.

[0059] As a second alternative, step 100 includes: collecting lip images for training and the corresponding voice data; obtaining corresponding image data according...
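The voiced-mode training step described above can be sketched in code. This is a minimal illustration, not the patent's implementation: it assumes lip images are reduced to fixed-length feature vectors and that the concurrent voice data is decoded into integer phoneme labels that serve as supervision targets, and it substitutes a single-layer softmax classifier for the deep neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SAMPLES, N_FEATURES, N_PHONEMES = 200, 16, 4

# Simulated lip-image features and phoneme labels derived from voice data.
# (In the patent's setting, features would come from the camera and labels
# from recognizing the user's simultaneous speech.)
lip_features = rng.normal(size=(N_SAMPLES, N_FEATURES))
true_w = rng.normal(size=(N_FEATURES, N_PHONEMES))
phoneme_labels = np.argmax(lip_features @ true_w, axis=1)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Train the stand-in classifier by gradient descent on cross-entropy loss.
w = np.zeros((N_FEATURES, N_PHONEMES))
for _ in range(300):
    probs = softmax(lip_features @ w)
    onehot = np.eye(N_PHONEMES)[phoneme_labels]
    grad = lip_features.T @ (probs - onehot) / N_SAMPLES
    w -= 0.5 * grad

accuracy = np.mean(np.argmax(lip_features @ w, axis=1) == phoneme_labels)
```

Because the labels here are generated by a linear model, the classifier fits them closely; in practice a deep network and real audio-derived labels would replace both stand-ins.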

Embodiment 2

[0078] Based on the inventive concept of the foregoing embodiments, Figure 2 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present invention. As shown in Figure 2, the mobile terminal is provided with a voiced mode and a silent mode, and includes an acquisition module 10 and a processing module 20.

[0079] Specifically, in the silent mode, the acquisition module 10 is configured to collect lip images of the user; the processing module 20 is communicatively connected to the acquisition module 10 and is configured to recognize the content corresponding to the lip images according to the deep neural network.

[0080] Here, the deep neural network is the one established in the voiced mode.

[0081] It should be noted that the silent mode is started by a lip-reading start command input by the user, for example by tapping a preset virtual button on the ...
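The silent-mode path described for the two modules can be sketched as follows. This is an illustrative assumption, not the patent's code: the function names, the phoneme inventory, and the use of a single weight matrix in place of the trained deep network are all hypothetical stand-ins.

```python
import numpy as np

# Illustrative phoneme inventory; a real system would use a proper
# lexicon and a language model to produce text.
PHONEMES = ["a", "o", "e", "i"]

rng = np.random.default_rng(1)
# Stand-in for the deep neural network trained in the voiced mode.
trained_w = rng.normal(size=(16, len(PHONEMES)))

def acquire_lip_frames(n_frames=5, n_features=16):
    """Stub for the acquisition module (10): yields lip-image feature frames."""
    return rng.normal(size=(n_frames, n_features))

def recognize(frames, weights):
    """Stub for the processing module (20): frame-wise phoneme decoding."""
    idx = np.argmax(frames @ weights, axis=1)
    return "".join(PHONEMES[i] for i in idx)

frames = acquire_lip_frames()
text = recognize(frames, trained_w)
```

The design point mirrors the patent's split: acquisition and recognition are separate components connected by a plain data interface, so the recognizer can be swapped for the network trained in voiced mode without touching the capture path.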



Abstract

An embodiment of the invention provides a lip-reading recognition method and a mobile terminal. The method is applied to a mobile terminal in which a voiced mode and a silent mode are set. The deep neural network is trained in the voiced mode. In the silent mode, the method includes: starting the silent mode; acquiring lip images of the user; and recognizing, according to the deep neural network established in the voiced mode, the content corresponding to the lip images. By training the deep neural network in the voiced mode and using it to recognize lip content in the silent mode, the technical scheme solves the prior-art problems that voice conversations cannot protect user privacy and disturb surrounding people; it protects the privacy of users, reduces the influence on people nearby, saves training time, and improves training accuracy.

Description

Technical field

[0001] Embodiments of the present invention relate to the technical field of mobile communication, and in particular to a lip-reading recognition method and a mobile terminal.

Background

[0002] At present, mobile terminals with a calling function, such as mobile phones and tablet computers, all require the local user to make voice calls during actual calls.

[0003] The inventors found that in actual use, on the one hand, the content of mobile phone communication is often private, and making a voice call involving private content cannot protect the user's privacy; on the other hand, many occasions are not suitable for answering calls by voice, for example during a meeting or in a library, where a voice call would inevitably disturb the normal activities of people nearby.

Contents of the invention

[0004] In order to solve the above technical problems, the embodiments of the present invention provide a lip-reading recognition method and a mo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/08, G06V10/764
CPC: G06N3/08, G06V40/20, G06V40/176, G06V10/82, G06V10/764, G06F3/165, G10L13/0335, G10L15/02, G10L15/063, G10L15/16, G10L15/22, G10L15/25, H04R1/08, H04R1/10, H04R2499/11, G06V40/171, G06F18/214
Inventors: 耿立华, 马希通, 张治国
Owner: BOE TECH GRP CO LTD