
A 3D gesture motion reconstruction method and system

A technology for three-dimensional gesture reconstruction, applied in image data processing, instruments, and computing. It addresses problems such as falling into local extrema and long algorithm running times, and achieves the effects of more comprehensive information, reduced complexity, and convenient use.

Active Publication Date: 2011-11-30
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

However, the following problems remain: 1) the reconstruction requires a relatively accurate initial 3D model; 2) the reconstruction usually relies on a local optimization algorithm that searches within a small range, so it is prone to falling into local extrema; 3) even when a global optimization algorithm is used, its running time is usually long.


Examples


Embodiment 1

[0034] Figure 1 is a flow chart of the three-dimensional gesture motion reconstruction method according to Embodiment 1 of the present invention. Each step is described in detail below with reference to Figure 1.

[0035] Step S110: perform region segmentation on the first frame image of the collected gesture image sequence according to a given reference image.

[0036] In this embodiment, the system pre-stores a reference image of a gesture, as shown in Figure 2a and 2b. The reference image segments the gesture into regions, marking each part of the hand with a distinct color. This embodiment places certain requirements on the first frame of the gesture image sequence collected from the user: the first frame must show an open hand so that all parts of the hand are visible. Preferably, the hand is free of self-occlusion and external occlusion, the fingers point upward, and the palm or the back of the hand faces the c...
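To make the color-coded segmentation step concrete, the following is a minimal sketch, not the patent's implementation. It assumes the reference image paints each hand part in one distinct solid color, the open-hand first frame is roughly aligned with the reference pose, and OpenCV/NumPy are available; all function names and thresholds below are illustrative assumptions.

```python
# Illustrative sketch only: color-coded reference image -> label map, then
# label transfer to the first frame. Thresholds and names are assumptions,
# not values taken from the patent.
import cv2
import numpy as np

def reference_to_label_map(reference_bgr: np.ndarray) -> np.ndarray:
    """Map each distinct solid color in the reference image to an integer label."""
    h, w = reference_bgr.shape[:2]
    _colors, inverse = np.unique(reference_bgr.reshape(-1, 3), axis=0, return_inverse=True)
    return inverse.reshape(h, w).astype(np.uint8)   # assumes fewer than 256 hand parts

def segment_first_frame(frame_bgr: np.ndarray, label_map: np.ndarray) -> np.ndarray:
    """Give each skin pixel of the first (open-hand) frame the label of the
    spatially corresponding reference pixel; background pixels get -1."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 30, 60), (25, 180, 255))   # crude placeholder skin mask
    labels = cv2.resize(label_map, (frame_bgr.shape[1], frame_bgr.shape[0]),
                        interpolation=cv2.INTER_NEAREST)
    return np.where(skin > 0, labels.astype(np.int32), -1)
```

A real implementation would also need to bring the reference image and the captured first frame into correspondence; the sketch above skips that alignment step to stay short.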

Embodiment 2

[0074] Figure 8 is a schematic structural diagram of the three-dimensional gesture motion reconstruction system according to Embodiment 2 of the present invention. The components of the system are described in detail below with reference to Figure 8.

[0075] The system is used to execute the method of Embodiment 1 of the present invention, and includes the following units:

[0076] a region segmentation unit, which performs region segmentation on the first frame image of the collected gesture image sequence according to a given reference image;

[0077] an affine transformation matrix generating unit, which generates an affine transformation matrix of each segmented region of the first frame image;

[0078] an initial model generation unit, which generates three-dimensional to two-dimensional projection coefficients, and obtains a three-dimensional gesture model corresponding to the first frame image according to the projection coefficients and the affine transformation ma...
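As a reading aid, here is a hedged structural sketch in Python of how the units quoted above might be composed. The class and method names are illustrative; only the units listed so far are included because the original list is truncated, and the model-building logic is deliberately left abstract.

```python
# Structural sketch only: mirrors the units quoted above with illustrative
# names; the original unit list is truncated, so further units are omitted.
import cv2
import numpy as np

class RegionSegmentationUnit:
    """Segments the first frame of the gesture sequence against a reference image."""
    def __init__(self, reference_image: np.ndarray):
        self.reference_image = reference_image

    def segment(self, first_frame: np.ndarray) -> np.ndarray:
        raise NotImplementedError  # e.g. the color-guided segmentation sketched under Embodiment 1

class AffineTransformGeneratorUnit:
    """Generates an affine transformation matrix for each segmented region."""
    def estimate(self, ref_points: np.ndarray, frame_points: np.ndarray) -> np.ndarray:
        matrix, _inliers = cv2.estimateAffine2D(ref_points, frame_points)  # 2x3 fit
        return matrix

class InitialModelGenerationUnit:
    """Combines 3D-to-2D projection coefficients with the per-region affine
    matrices to obtain the 3D gesture model for the first frame (logic omitted)."""
    def build(self, projection_coefficients, affine_matrices):
        raise NotImplementedError

class GestureReconstructionSystem:
    """Wires the units together in the order the embodiment lists them."""
    def __init__(self, reference_image: np.ndarray):
        self.segmentation_unit = RegionSegmentationUnit(reference_image)
        self.affine_unit = AffineTransformGeneratorUnit()
        self.initial_model_unit = InitialModelGenerationUnit()
```

The remaining units of the embodiment (skeleton nodes, their degrees of freedom, and the per-frame filtering step described in the abstract) would slot in after the initial model unit.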



Abstract

The invention discloses a method and a system for reconstructing the motion of a three-dimensional gesture. The method comprises the following steps: performing region segmentation on the first frame image of an acquired gesture image sequence; generating an affine transformation matrix for each segmented region; generating three-dimensional to two-dimensional projection coefficients; obtaining a three-dimensional gesture model corresponding to the first frame image according to the projection coefficients and the affine transformation matrices; determining the skeleton nodes of the three-dimensional gesture model corresponding to the first frame image and the degrees of freedom of those nodes; and, for each subsequent frame image, performing simulated annealing particle filtering on the current frame image based on the skeleton nodes and degrees of freedom of the three-dimensional gesture model of the previous frame image, so as to obtain the three-dimensional gesture model of the current frame image and thus realize the reconstruction of the three-dimensional gesture. With the method and the system, accurate modeling of the three-dimensional hand with a laser scanner is not required, and the complexity of the motion reconstruction process is reduced.
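The per-frame tracking step in the abstract is a simulated annealing (annealed) particle filter over the skeleton degrees of freedom carried over from the previous frame. The sketch below shows only that generic technique; the likelihood function, particle count, layer count, and noise schedule are placeholders, not the patent's values.

```python
# Hedged sketch of an annealed (simulated annealing) particle filter over the
# skeleton degrees of freedom. The likelihood callable, particle count, layer
# count and noise schedule are illustrative assumptions, not the patent's values.
import numpy as np

def annealed_particle_filter(prev_pose: np.ndarray,
                             image_likelihood,      # callable: pose -> non-negative float
                             n_particles: int = 200,
                             n_layers: int = 5,
                             init_sigma: float = 0.1,
                             rng: np.random.Generator | None = None) -> np.ndarray:
    """Estimate the current frame's pose vector (one entry per degree of freedom)
    by annealed sample/weight/resample layers seeded from the previous pose."""
    rng = rng or np.random.default_rng()
    dim = prev_pose.shape[0]
    # Seed particles from the previous frame's pose, diffused by process noise.
    particles = prev_pose + rng.normal(0.0, init_sigma, size=(n_particles, dim))
    for layer in range(n_layers):
        beta = (layer + 1) / n_layers                  # annealing: sharpen weights layer by layer
        weights = np.array([image_likelihood(p) for p in particles]) ** beta
        weights = np.maximum(weights, 1e-12)
        weights /= weights.sum()
        idx = rng.choice(n_particles, size=n_particles, p=weights)   # resample
        particles = particles[idx]
        sigma = init_sigma * (1.0 - layer / n_layers)  # shrink diffusion as layers progress
        particles = particles + rng.normal(0.0, sigma, size=(n_particles, dim))
    # Weighted mean of the final layer as the pose estimate for the current frame.
    final_w = np.maximum(np.array([image_likelihood(p) for p in particles]), 1e-12)
    return (particles * (final_w / final_w.sum())[:, None]).sum(axis=0)
```

The early layers explore broadly around the previous pose and the later layers refine near the likelihood peak, which is one way to mitigate the local-extremum problem noted in the "Problems solved by technology" summary above.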

Description

Technical Field

[0001] The invention relates to the fields of computer vision and computer graphics, and in particular to motion reconstruction technology for three-dimensional gestures.

Background Art

[0002] Motion reconstruction of hand gestures has long been a concern of researchers, because gestures usually play an important role in communicating information between people. In virtual reality, the hand is a very important action and perception model within the user model, and human behavioral characteristics are an important research topic in human-computer interaction. In a virtual environment, three-dimensional interaction tasks and techniques such as grasping and releasing objects, flying, roaming, and navigation are realized with the hand. In the past, human-computer interaction was achieved through human touch behavior and computer response. In the process of human-computer interaction, direct interaction with computer systems through ...


Application Information

IPC(8): G06T7/00
Inventor: 刘烨斌, 王雁刚, 戴琼海
Owner: TSINGHUA UNIV