
Body action identification method and system based on depth image sensing

A technology relating to depth images and body movements, applied in the field of human-computer interaction. It addresses problems such as the difficulty of eliminating complex nonlinear fluctuations and the limited richness of body movements that can be recognized, thereby improving recognition efficiency and the user experience.

Inactive Publication Date: 2012-12-19
KONKA GROUP

AI Technical Summary

Problems solved by technology

From the perspective of the body movement recognition process, traditional movement recognition requires multiple complex steps such as movement modeling, movement segmentation, and movement analysis. Dynamic body movements are especially difficult: different users perform the same movement with differences in speed, trajectory, and so on, which cause the modeled movement trajectory to fluctuate nonlinearly along the time axis. Eliminating such nonlinear fluctuations is very difficult and complicated, so the recognition rate and recognition efficiency of traditional body movement recognition based on two-dimensional images are generally not high enough.
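To make the difficulty concrete, one standard way of coping with such speed differences is to warp the time axis explicitly, for example with dynamic time warping (DTW). The sketch below is illustrative only and is not part of the patented method; the sequences and function are assumptions for demonstration.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two 1-D movement trajectories.

    DTW aligns sequences that differ in speed, which is one common way of
    handling the nonlinear time-axis fluctuations described above.
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # stretch seq_b
                                 cost[i, j - 1],      # stretch seq_a
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Two versions of the "same" gesture performed at different speeds
print(dtw_distance([0, 1, 2, 3, 2, 1, 0], [0, 0, 1, 1, 2, 3, 3, 2, 1, 0, 0]))
```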
[0006] On the other hand, real user body movements are made in a three-dimensional environment, whereas processing based on two-dimensional images maps the user's 3D movements onto 2D movements. It is therefore difficult to obtain true 3D movement information, which to a large extent limits the richness of the body movements that can be recognized and restricts the wide application of gesture recognition devices.




Embodiment Construction

[0042] The present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0043] There are many ways to obtain image depth information; common ones include binocular vision, time-of-flight, and structured light coding. Without loss of generality, the present invention is described using structured light coding as the means of obtaining image depth information.
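For context, the binocular (stereo) approach mentioned above recovers depth from the disparity between two rectified camera views via the standard pinhole relation Z = f * B / d. The following minimal sketch illustrates that relation; the function name and parameter values are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def disparity_to_depth(disparity, focal_length_px, baseline_m):
    """Convert a disparity map to metric depth for a rectified stereo pair.

    Z = f * B / d, where f is the focal length in pixels, B is the camera
    baseline in meters, and d is the per-pixel disparity in pixels.
    Pixels with zero disparity (no match found) are marked invalid (NaN).
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.full_like(disparity, np.nan)
    valid = disparity > 0
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth

# Example: a 2x2 disparity map from a camera with f = 600 px and B = 0.06 m
if __name__ == "__main__":
    d = np.array([[30.0, 15.0],
                  [0.0, 60.0]])
    print(disparity_to_depth(d, focal_length_px=600.0, baseline_m=0.06))
```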

[0044] Figure 1 is an electrical structural block diagram of an embodiment of the body movement recognition system of the present invention. The body movement recognition system based on depth image sensing mainly consists of:

[0045] Depth image sensing unit: responsible for emitting a coded structured light plane toward the user, and for receiving and sensing the infrared structured light reflected back by the user and the surrounding environment.

[0046] D...



Abstract

The invention relates to a body action identification method and system based on depth image sensing. The method comprises the following steps: acquiring depth image information of a user and of the environment in which the user stands; extracting the user's body outline from the background of the depth image information; scaling each part of a human skeletal framework to fit the user's body outline, thereby obtaining the user's adapted body skeletal framework; tracking and extracting, by means of the adapted skeletal framework, data representing the movement of the user's body; and identifying the user's body action according to that data. By establishing a skeletal model of the user and then tracking and identifying the user's body actions on that model, the invention better solves the problems of existing motion sensing and recognition solutions, improves the efficiency of body action identification, and improves the user experience of human-computer interaction.
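To summarize the pipeline in code form, here is a minimal sketch of the processing steps described in the abstract. All function names, data shapes, and thresholds are assumptions for illustration; the skeleton-fitting and action-classification steps are placeholders rather than the patented implementations.

```python
import numpy as np

def extract_body_outline(depth_frame, background_depth, threshold_m=0.1):
    """Separate the user from the background of a depth frame.

    A pixel is treated as part of the user if it is noticeably closer to the
    camera than the previously captured background depth at that pixel.
    """
    return (background_depth - depth_frame) > threshold_m

def fit_skeleton(template_skeleton, body_mask):
    """Scale each joint of a template skeleton to the user's outline (placeholder).

    The patent adapts the size of each skeletal part to the extracted body
    outline; here we only scale the template uniformly by the outline's
    vertical extent as a stand-in for that step.
    """
    height_px = np.any(body_mask, axis=1).sum() or 1
    scale = height_px / template_skeleton["reference_height_px"]
    return {joint: (x * scale, y * scale)
            for joint, (x, y) in template_skeleton["joints"].items()}

def identify_action(joint_trajectories):
    """Map tracked joint trajectories to an action label (stub classifier)."""
    # A real system would compare the trajectories against learned action models.
    return "unknown_action"
```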

Description

Technical field

[0001] The present invention relates to human-computer interaction technology, and in particular to a body movement recognition method and system based on depth image sensing.

Background technique

[0002] Because traditional human-computer interaction devices such as the mouse and keyboard have certain limitations in the naturalness and friendliness of the user experience, human-computer interaction has become a very active research field in recent years, and various new interaction methods such as touch control, remote control, voice control, gesture control, and motion sensing have emerged. In particular, the motion-sensing interaction represented by Nintendo's Wii and Sony's MOVE uses various sensor devices to recognize body movements, more specifically upper limb movements, in real time and transform them into the commands that ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06T7/20, G06F3/01
Inventor: 陈大炜
Owner: KONKA GROUP