Method for capturing human motion by aid of fused depth images and three-dimensional models

A method for capturing human motion by fusing depth images with three-dimensional models, applied in the field of human motion capture, which can solve problems such as large errors, high distortion of the captured result, and a limited range of motion.

Inactive Publication Date: 2015-01-07
XIAN TECH UNIV

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a human body motion capture method that fuses depth maps and three-dimensional models, which can effectively overcome the technical defects of existing motion capture methods: a limited motion range, high distortion of the captured result, and large errors.

Method used




Embodiment Construction

[0072] The present invention will be described in detail below with reference to specific embodiments.

[0073] The human motion capture method fusing depth maps and three-dimensional models according to the present invention is realized by the following steps:

[0074] Step 1: Use a depth acquisition device to collect depth information of human body movements, remove the background around the moving target, and obtain complete depth information of the human motion.

[0075] Based on the collected depth information of the human body movements, the specific steps for removing the background of the moving target and obtaining complete depth information of the human motion are as follows:

[0076] (1) A collected depth map of a human body movement is represented by F(x, y, d), where x and y are the abscissa and ordinate in the pixel coordinate system, respectively, and d is the depth value; it is assumed that the background is segmented based on the depth information, with thresholds for th...
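The patent text breaks off before stating the segmentation thresholds. As a rough illustration only, the depth-band background removal described in this step might be sketched as below; the threshold values `d_min` and `d_max` are hypothetical placeholders, not taken from the patent:

```python
import numpy as np

def remove_background(depth, d_min=500.0, d_max=2500.0):
    """Segment the moving human subject out of a depth map F(x, y, d).

    depth: H x W array holding the depth value d at each pixel (x, y).
    d_min, d_max: hypothetical thresholds (e.g. millimetres) bounding the
    depth band where the subject is assumed to stand; pixels outside the
    band are treated as background and zeroed out.
    """
    mask = (depth >= d_min) & (depth <= d_max)
    return np.where(mask, depth, 0.0)

# Toy 2x3 depth map: only the in-band pixels (900, 1500) survive.
frame = np.array([[100.0,  900.0, 3000.0],
                  [400.0, 1500.0, 2600.0]])
cleaned = remove_background(frame)
print(cleaned)
```

A real pipeline would likely refine this binary mask with morphological filtering before extracting the point cloud, but the thresholding itself is this simple.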



Abstract

The invention relates to a method for capturing human motion by means of fused depth images and three-dimensional models. The method overcomes drawbacks of optical motion capture, namely the need to attach marker points, inconvenient interaction, and the ease with which marker points are confused or occluded. The method includes: acquiring depth information of human actions; removing the background of the moving target; obtaining complete depth information of the human actions; converting this depth information into three-dimensional point-cloud information of the human body; obtaining three-dimensional human action models; establishing databases whose entries are in one-to-one correspondence with the data of a human action skeleton database; extracting the depth information of a to-be-identified human action to build its three-dimensional model; matching this model against the human actions in the three-dimensional model database by similarity; and outputting human action skeletons ranked by similarity. The output skeletons are the motion capture result. The method requires no sensors or added marker points on the human body and is easy to implement; motion sequences are matched by dynamic time warping, which improves the matching precision between sequence pairs and greatly shortens the matching time, guaranteeing both the speed and the precision of motion capture.
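The sequence matching step can be sketched with dynamic time warping (DTW), the alignment technique the abstract invokes for matching motion sequences. The per-frame feature representation and the Euclidean frame distance below are illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two motion sequences.

    seq_a, seq_b: sequences of per-frame feature vectors (e.g. stacked
    joint coordinates of a skeleton). Returns the minimal cumulative
    frame-to-frame distance over all monotone alignments of the two
    sequences, so sequences that differ only in timing score near zero.
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(np.asarray(seq_a[i - 1]) - np.asarray(seq_b[j - 1]))
            cost[i, j] = d + min(cost[i - 1, j],      # stretch seq_a
                                 cost[i, j - 1],      # stretch seq_b
                                 cost[i - 1, j - 1])  # advance both
    return cost[n, m]

# A time-shifted copy (one repeated frame) still aligns perfectly.
a = [[0.0], [1.0], [2.0], [3.0]]
b = [[0.0], [0.0], [1.0], [2.0], [3.0]]
print(dtw_distance(a, a))  # 0.0
print(dtw_distance(a, b))  # 0.0
```

Ranking database actions by this distance (smallest first) yields the similarity ordering the abstract describes for outputting candidate skeletons.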

Description

Technical field

[0001] The invention belongs to the technical field of multimedia information retrieval, and in particular relates to a human body motion capture method that fuses a depth map and a three-dimensional model.

Background technique

[0002] Human body motion capture technology is a hot topic in the field of multimedia information retrieval, especially in the development of film and television animation, games, and similar applications, and has broad application prospects; many research institutions at home and abroad are working in this direction. In recent years, with the rapid development of motion capture technology and the rise of 3D film and television animation, games, and new-generation human-computer interaction, many complex and lifelike human movements need to be quickly captured and applied, so a fast and effective human motion capture method is needed. The optical motion capture methods proposed so far are mainly based on the principle of computer vision, and...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30; G06K9/00
CPC: G06T7/285
Inventors: 肖秦琨 (Xiao Qinkun), 谢艳梅 (Xie Yanmei)
Owner: XIAN TECH UNIV