Information processing method and electronic equipment

An information processing method and an electronic device, applied in the field of electronics, solve the problem that personal pronouns in voice commands cannot be correctly recognized, and achieve the effect of improving the user experience.

Active Publication Date: 2014-08-13
LENOVO (BEIJING) CO LTD
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The present application provides an information processing method and an electronic device to solve the technical problem in the prior art that an electronic device cannot correctly recognize personal pronouns in a voice command, thereby improving the electronic device's ability to recognize and execute voice commands and improving the user experience.



Examples


Example 1

[0086] Example 1, see Figure 4, including the following steps:

[0087] Step 201: The electronic device obtains the user's input voice: "search my photos";

[0088] Step 202: Recognize the input voice with the voice recognition engine;

[0089] Step 203: Recognize that the input voice contains the first type of personal pronoun "I", load the voiceprint extraction module, and extract the voiceprint data from the input voice through the voiceprint extraction module;

[0090] Step 204: The voiceprint recognition module compares the extracted voiceprint data with the voiceprint feature database and recognizes that the corresponding identity is "Li Ming";

[0091] Step 205: Determine that "Li Ming" is the referent of "I";

[0092] Step 206: Execute the execution instruction corresponding to the input voice, and search the image library for photos associated with "Li Ming"; wherein the generation of the execution instruction can occur at any ...
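A minimal Python sketch of the Example 1 flow (Steps 203 to 206), assuming a cosine-similarity match against an enrolled voiceprint database; the helper names, feature vectors, and threshold are illustrative assumptions and are not specified by the patent:

```python
# Sketch of the Example 1 flow (Steps 203-206): resolve "I"/"my" in a voice
# command via voiceprint matching. All names, vectors, and the similarity
# threshold are illustrative assumptions, not an implementation from the patent.
from math import sqrt

FIRST_PERSON_PRONOUNS = {"i", "my", "me", "mine"}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def resolve_speaker(transcript, voiceprint, enrolled_voiceprints, threshold=0.8):
    """Return the enrolled identity the first-person pronoun refers to, or None."""
    words = {w.strip('.,!?').lower() for w in transcript.split()}
    if not (words & FIRST_PERSON_PRONOUNS):
        return None  # Step 203: no first-type personal pronoun, nothing to resolve
    # Step 204: compare the extracted voiceprint against the feature database
    best_id, best_score = None, 0.0
    for identity, enrolled in enrolled_voiceprints.items():
        score = cosine(voiceprint, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

# Usage: the transcript and voiceprint would come from the speech-recognition
# and voiceprint-extraction modules of Steps 202-203 (stubbed with fixed data).
enrolled = {"Li Ming": [0.9, 0.1, 0.3], "Wang Ke": [0.1, 0.8, 0.5]}
speaker = resolve_speaker("search my photos", [0.88, 0.12, 0.29], enrolled)
print(speaker)  # -> "Li Ming"; Step 206 would then search photos tagged with this identity
```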

Example 2

[0093] Example 2, see Figure 5, including the following steps:

[0094] After the above Step 202, execute Step 207: recognize that the input voice contains the first type of personal pronoun "I", load the image recognition module, and obtain a first image containing the current user;

[0095] Step 208: Determine that the first image is the referent of "I";

[0096] Step 209: Execute the execution instruction corresponding to the input voice, extract the facial features from the first image, compare them with the facial features of each photo in the picture library, and search for photos with matching facial features; wherein the generation of the execution instruction can occur at any moment after Step 202 and before Step 209.
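A minimal sketch of the Example 2 matching step (Step 209), assuming facial features are compared by Euclidean distance against feature vectors stored for each photo; the embeddings and the distance threshold are placeholders, not part of the patent text:

```python
# Sketch of the Example 2 flow (Steps 207-209): treat a freshly captured image
# of the current user as the referent of "I" and search the picture library by
# face similarity. Vectors and threshold below are assumed placeholders; the
# patent does not prescribe a specific face-matching method.
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def photos_of_current_user(query_face, library, max_distance=0.6):
    """Return photo ids whose stored facial features match the query face (Step 209)."""
    return [photo_id for photo_id, face in library.items()
            if euclidean(query_face, face) <= max_distance]

# Usage: query_face would be extracted from the first image captured in Step 207.
library = {"IMG_001": [0.2, 0.7, 0.1], "IMG_002": [0.9, 0.2, 0.4]}
print(photos_of_current_user([0.22, 0.68, 0.12], library))  # -> ["IMG_001"]
```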

Example 3

[0097] Example 3, see Figure 6, including the following steps:

[0098] Step 301: When the electronic device is playing local music, obtain the user's input voice: "send the song to Mr. Li";

[0099] Step 302: Recognize the input voice with the voice recognition engine;

[0100] Step 303: Recognize that "Mr. Li" is included in the input voice, and start the image acquisition module;

[0101] Step 304: Obtain N pictures through the image acquisition module, wherein the N pictures can be N frames of images in a video;

[0102] Step 305: Determine a directional pointing gesture from the N pictures, wherein the directional pointing gesture can be determined according to the direction of the user's gesture or the direction of finger movement in the N pictures;

[0103] Step 306: Determine the acquisition position of the image acquisition module according to the directional pointing gesture, and acquire an image at the determined position; this image is the first image...
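A rough sketch of Steps 305 and 306, assuming the pointing gesture is estimated from fingertip positions tracked across the N frames and then mapped to a camera pan angle; the coordinates and the camera model are assumptions for illustration only:

```python
# Sketch of the Example 3 flow (Steps 304-306): estimate a pointing direction
# from fingertip positions in N frames and turn it into a capture angle for the
# image acquisition module. The fingertip coordinates and the pan-angle mapping
# are illustrative assumptions, not part of the patent text.
import math

def pointing_direction(fingertips):
    """Average frame-to-frame fingertip motion as a 2D direction vector (Step 305)."""
    dx = sum(b[0] - a[0] for a, b in zip(fingertips, fingertips[1:]))
    dy = sum(b[1] - a[1] for a, b in zip(fingertips, fingertips[1:]))
    n = len(fingertips) - 1
    return (dx / n, dy / n)

def capture_angle(direction):
    """Map the pointing direction to a pan angle for the camera (Step 306)."""
    return math.degrees(math.atan2(direction[1], direction[0]))

# Usage: fingertip positions detected in N frames of the acquired video.
frames = [(10, 40), (18, 42), (27, 45), (35, 47)]
print(round(capture_angle(pointing_direction(frames)), 1))  # pan angle in degrees
```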



Abstract

The invention discloses an information processing method and electronic equipment, and solves the technical problem that existing electronic equipment cannot correctly recognize personal pronouns in a voice instruction. The method is applied to the electronic equipment and comprises the following steps: input voice is obtained; the input voice is recognized by a voice recognition engine; when it is recognized that the input voice contains a personal pronoun, first data are obtained; the referent denoted by the personal pronoun is determined on the basis of the first data; and an operation instruction is executed on the basis of the referent, wherein the operation instruction is the instruction corresponding to the input voice, obtained after the voice recognition engine recognizes the input voice.
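A condensed sketch of the claimed flow as stated in the abstract, with the voiceprint and image modules of the examples abstracted into pluggable callables; all function names here are assumptions for illustration, not the patent's own API:

```python
# Condensed sketch of the abstract's flow: obtain input voice, recognize it,
# and when a personal pronoun is found, obtain first data, resolve the referent,
# then execute the corresponding instruction on that referent. The resolver
# callables stand in for the voiceprint/image modules of the examples.
def process_voice_command(audio, recognize, find_pronoun, obtain_first_data,
                          resolve_referent, execute):
    transcript, instruction = recognize(audio)       # recognition by the voice engine
    pronoun = find_pronoun(transcript)               # e.g. "I", "my"
    referent = None
    if pronoun is not None:
        first_data = obtain_first_data(audio, pronoun)    # voiceprint, image, ...
        referent = resolve_referent(pronoun, first_data)  # e.g. "Li Ming"
    return execute(instruction, referent)            # run the instruction on the referent

# Minimal usage with trivial stand-ins:
result = process_voice_command(
    audio=b"...",
    recognize=lambda a: ("search my photos", "SEARCH_PHOTOS"),
    find_pronoun=lambda t: "my" if "my" in t.split() else None,
    obtain_first_data=lambda a, p: [0.9, 0.1, 0.3],          # e.g. a voiceprint
    resolve_referent=lambda p, d: "Li Ming",
    execute=lambda instr, ref: f"{instr} for {ref}",
)
print(result)  # -> "SEARCH_PHOTOS for Li Ming"
```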

Description

Technical Field

[0001] The invention relates to the field of electronic technology, and in particular to an information processing method and electronic equipment.

Background Art

[0002] Currently, smart electronic devices such as tablet computers, smart phones, and smart watches can recognize and execute users' voice commands, which enriches the ways users interact with electronic devices and brings them convenience.

[0003] However, the inventors of the present application have found that the above-mentioned prior art has at least the following technical problem:

[0004] The voice command acquired by the electronic device may contain personal pronouns, and it is difficult for the electronic device to determine the referent of each personal pronoun, so the user's voice command cannot be executed correctly.

Summary of the Invention

[0005] The present application provides an information processing method and an electronic device, which ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
Inventor: 杨振奕, 王科, 徐琳
Owner: LENOVO (BEIJING) CO LTD