
Human body action recognition method and mobile intelligent terminal

A human-action recognition technology, applied in the field of action recognition. It addresses problems such as strict device-posture requirements, a limited range of permitted user actions, and poor user experience, and achieves the effects of low power consumption, noise removal, and reduced computational complexity.

Active Publication Date: 2015-12-23
GOERTEK INC

AI Technical Summary

Problems solved by technology

[0010] (3) Existing technologies generally require the smart terminal to be held in a given posture or operated on a fixed plane. This limits the range of user actions, imposes strict requirements on device posture, causes great inconvenience to the user, and results in a poor user experience.

Method used




Embodiment Construction

[0056] The main idea of the embodiments of the present invention is as follows. In view of the problems in existing sensor-based human-action recognition schemes, the embodiments collect human-motion data in advance for training, obtaining feature-extraction parameters and template data sequences. The feature-extraction parameters are then used to reduce the data dimension of the test data sequence; compared with existing schemes that operate directly on the collected high-dimensional data, this lowers the requirements on device posture while the action is performed and removes noise. The dimension-reduced data sequence is then matched against the template data sequences, achieving accurate human-action recognition while reducing computational complexity.
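The patent text does not disclose which feature-extraction algorithm yields the "feature extraction parameters." As an illustrative assumption only, the training step described above can be sketched with PCA: the learned mean and projection matrix play the role of the feature-extraction parameters, and projecting a raw sensor sequence onto the principal components performs the dimension reduction. The function names and the choice of PCA are hypothetical, not taken from the patent.

```python
import numpy as np

def fit_feature_extractor(training_data, k=3):
    """Learn feature-extraction parameters (here assumed to be a PCA
    projection) from pre-collected human-motion training samples.
    training_data: (n_samples, n_dims) array of raw sensor readings."""
    mean = training_data.mean(axis=0)
    centered = training_data - mean
    # Eigen-decomposition of the covariance matrix of the training data.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Keep the k components with the largest variance; the remaining
    # low-variance directions are discarded, which also removes noise.
    components = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return mean, components

def reduce_dimension(sequence, mean, components):
    """Project a raw data sequence onto the learned components,
    producing the dimension-reduced test data sequence."""
    return (sequence - mean) @ components
```

Discarding low-variance components is one way to realize both effects the paragraph claims: fewer dimensions to match (lower complexity) and suppression of sensor noise concentrated in the discarded directions.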

[0057] The human-action recognition method of the embodiments of the present invention can be applied in a mobile smart terminal. Figure 1 is a flow...



Abstract

The invention discloses a human body action recognition method and a mobile intelligent terminal. The method comprises: collecting human-action data for training to obtain feature-extraction parameters and template data sequences; during one pass of human-action recognition, acquiring the data on which recognition is to be performed to obtain an original data sequence; performing feature extraction on the original data sequence using the feature-extraction parameters, reducing its data dimension to obtain a dimension-reduced test data sequence; and matching the test data sequence against the template data sequences, confirming, when a successfully matched test data sequence exists, that the human action corresponding to the matched template data sequence has occurred. Dimension reduction of the test data sequences lowers the requirements on human-action posture and removes noise. Matching the dimension-reduced data against the templates then reduces computational complexity, realizes accurate human-action recognition, and enhances the user experience.
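The abstract leaves the matching step unspecified. A common choice for comparing sensor sequences of different lengths is dynamic time warping (DTW); the sketch below is a minimal illustration under that assumption, with a made-up acceptance threshold. Neither DTW nor the threshold value is stated in the patent.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic-time-warping distance between two sequences of samples
    (scalars or feature vectors)."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Local distance between the two aligned samples.
            d = np.linalg.norm(np.atleast_1d(seq_a[i - 1]) -
                               np.atleast_1d(seq_b[j - 1]))
            # Best of the three allowed warping moves.
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]

def match_action(test_seq, templates, threshold=5.0):
    """Return the label of the best-matching template data sequence,
    or None when no template is within the (assumed) threshold."""
    best_label, best_dist = None, np.inf
    for label, tpl in templates.items():
        dist = dtw_distance(test_seq, tpl)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else None
```

Because the test sequence has already been dimension-reduced, each DTW cell compares short feature vectors rather than raw high-dimensional samples, which is where the claimed reduction in computational complexity comes from.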

Description

technical field

[0001] The invention relates to the technical field of action recognition in human-computer interaction, and in particular to a human-action recognition method and a mobile intelligent terminal.

Background technique

[0002] At present, gesture-recognition schemes in human-computer interaction systems fall mainly into two categories: vision-based and sensor-based. Vision-based gesture recognition was studied earlier and its methods are relatively mature, but it has disadvantages such as sensitivity to the environment, complex systems, and a large amount of computation. Sensor-based gesture recognition started later, but it is flexible and reliable, unaffected by environment and lighting, and easy to implement, making it a recognition method with development potential. The essence of gesture recognition is to use gesture-recognition algorithms to classify gestures according to gesture models. The quality of gestu...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/62, G06K9/00
CPC: G06V40/20, G06F18/214
Inventor: 苏鹏程
Owner: GOERTEK INC