
Method for identifying user action and intelligent mobile terminal

A user-action recognition technology in the field of human-computer interaction, addressing problems such as poor user experience, a limited range of user actions, and high requirements on device posture

Active Publication Date: 2016-01-13
GOERTEK INC
Cites: 3 · Cited by: 36

AI Technical Summary

Problems solved by technology

Existing approaches generally require the user to operate on a given smart terminal or a fixed plane. This limits the range of user actions and places high demands on device posture, causing great inconvenience to the user and a poor user experience.

Method used




Embodiment Construction

[0087] The main concept of the present invention is as follows: to address the problems of existing sensor-based user-action recognition schemes, the embodiment of the present invention collects user action data in advance for training, obtaining feature extraction parameters and template symbol sequences. The feature extraction parameters are then used to reduce the dimensionality of the test data sequence (for example, reducing three-dimensional acceleration data to one dimension). Compared with existing schemes that operate directly on the collected high-dimensional data for recognition, this removes noise, reduces computational complexity, and relaxes the requirements on device posture while the user performs an action. Furthermore, by symbolizing the reduced low-dimensional data sequence into a character string, noise in the data sequence can be further removed, the amount of calculation reduced, and the recognition accuracy improved. Finally, the character string sequence is matched against the template symbol sequence, and a successful match indicates that the corresponding user action has occurred.
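The paragraph above describes two processing steps: learning feature extraction parameters that project the sensor data down to one dimension, and symbolizing the reduced sequence into a character string. The sketch below is one possible illustration of those steps, assuming a PCA-style 3D-to-1D projection and a SAX-style quantization; the function names, alphabet, and breakpoints are illustrative assumptions, not the procedure specified by the patent.

```python
import numpy as np

def learn_projection(training_windows):
    """Learn a 3D -> 1D projection from a list of (M, 3) acceleration windows."""
    data = np.vstack(training_windows)            # all training samples, shape (N, 3)
    data = data - data.mean(axis=0)               # center each axis
    # Principal axis of the centered data stands in for the "feature extraction parameter".
    _, _, vt = np.linalg.svd(data, full_matrices=False)
    return vt[0]                                  # unit vector, shape (3,)

def reduce_dimension(window, projection):
    """Project an (M, 3) acceleration window onto the learned 1D axis."""
    centered = window - window.mean(axis=0)
    return centered @ projection                  # 1D series, shape (M,)

def symbolize(series, n_segments=8, alphabet="abcd"):
    """Quantize a 1D series into a short character string (SAX-style)."""
    series = (series - series.mean()) / (series.std() + 1e-9)   # normalize
    segments = np.array_split(series, n_segments)               # piecewise aggregation
    means = np.array([seg.mean() for seg in segments])
    # Equal-width breakpoints, assumed for illustration only.
    bins = np.linspace(-1.5, 1.5, num=len(alphabet) - 1)
    codes = np.digitize(means, bins)
    return "".join(alphabet[c] for c in codes)
```

In this reading, `learn_projection` plays the role of the training stage that produces the feature extraction parameter, and `symbolize` produces the character string that is later compared with the template symbol sequence.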



Abstract

The present invention discloses a method for identifying a user action and an intelligent mobile terminal. The method comprises: acquiring user action data and training on it to obtain a feature extraction parameter and a template symbol sequence; during user action identification, collecting the data on which identification is to be performed, to obtain an original data sequence; performing feature extraction on the original data sequence by using the feature extraction parameter to obtain a dimension-reduced test data sequence; converting the test data sequence into a discrete character string to obtain a symbol sequence of the test data sequence; and matching the symbol sequence of the test data sequence against the template symbol sequence, and when the matching succeeds, determining that the user action corresponding to the template symbol sequence has occurred. According to the method provided by an embodiment of the present invention, dimension reduction is performed on the original data sequence by using the feature extraction parameter, and the dimension-reduced data sequence is symbolized and then matched against the template symbol sequence, so that computational complexity is lowered, recognition efficiency is improved, and a good user experience is achieved.
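As a companion to the abstract's final step, the following sketch shows how a symbolized test sequence might be matched against stored template symbol sequences. The abstract does not specify a matching criterion; a Levenshtein (edit-distance) threshold and the template names used here are assumptions made purely for illustration.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance between two symbol strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost))
        prev = curr
    return prev[-1]

def match_action(test_symbols: str, templates: dict[str, str], max_dist: int = 2):
    """Return the action whose template is closest to the test sequence, if close enough."""
    best_action, best_dist = None, max_dist + 1
    for action, template in templates.items():
        d = edit_distance(test_symbols, template)
        if d < best_dist:
            best_action, best_dist = action, d
    return best_action  # None means no template matched, i.e. no recognized action

# Hypothetical usage:
# templates = {"raise_hand": "aabccdd", "shake": "cacacac"}
# match_action("aabccdc", templates)  -> "raise_hand"
```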

Description

Technical Field

[0001] The invention relates to the technical field of human-computer interaction, and in particular to a method for recognizing user actions and an intelligent mobile terminal.

Background Technique

[0002] Gestures are a natural and intuitive way of interacting, and simple gestures can express various meanings depending on the context. Applying gestures to human-computer interaction can effectively improve interaction efficiency and user experience. For example, gestures can be applied to smart terminal devices such as smart watches and smart bracelets: when the user raises a hand, the system automatically detects the motion and triggers the corresponding operation (such as waking the screen of a smart watch when the hand is raised), realizing intelligent interactive operation.

[0003] At present, gesture recognition schemes in human-computer interaction systems can be divided mainly into two categories: vision-based schemes and sensor-based schemes.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06K9/00
Inventor: 苏鹏程
Owner: GOERTEK INC