
Gesture interaction method based on joint point transformation

A joint-point and gesture technology applied in the field of three-dimensional gesture interaction. It addresses the problems of expensive equipment and poor realism of the reconstructed human hand model in existing methods, achieving low computational cost, competitive recognition accuracy, and a simple procedure.

Active Publication Date: 2015-11-11
ZHONGBEI UNIV
Cites: 1 · Cited by: 16

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to solve the technical problems of existing three-dimensional gesture interaction methods, namely that the equipment used is expensive and the resulting human hand model lacks realism, and to provide a gesture interaction method based on joint point transformation.


Embodiment Construction

[0030] A gesture interaction method based on joint point transformation in this embodiment includes the following steps:

[0031] 1) Place the Kinect camera on a rotating platform and synchronously collect colour images and depth map information of the three-dimensional human hand from 35 viewing angles: the initial viewing angle is recorded as 0°, and the platform records data every 10° until it reaches the 350° viewing angle. The 3D point cloud data of the hand at each of the 35 viewing angles is then obtained from the depth map information;
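The depth-to-point-cloud conversion in step 1) is a standard pinhole back-projection. The sketch below is illustrative only: the intrinsics `fx`, `fy`, `cx`, `cy` and the depth scale are placeholder values, not the calibration of the authors' Kinect.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Back-project a depth map (H x W, raw depth units) into an N x 3 cloud.

    fx, fy, cx, cy are assumed pinhole intrinsics of the depth camera;
    pixels with zero depth (missing measurements) are dropped.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64) * depth_scale   # convert raw units to metres
    valid = z > 0                                # mask out missing depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

# Toy 2x2 depth map with one missing pixel -> 3 valid points.
depth = np.array([[1000, 0], [1000, 1000]], dtype=np.uint16)
pts = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=0.5, cy=0.5)
print(pts.shape)  # (3, 3)
```

Running this once per captured view yields the per-angle clouds that the later registration step aligns.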

[0032] 2) Register the 3D point clouds from the 35 viewing angles with an improved ICP registration algorithm to obtain the initial 3D point cloud data of the 3D hand;
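The patent calls its registration an "improved ICP" without detailing the improvement here; as a baseline, the sketch below is plain point-to-point ICP (nearest-neighbour correspondences plus a Kabsch/SVD rigid-transform solve), which the improved variant would refine.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(src, dst):
    """One point-to-point ICP iteration: match each src point to its nearest
    dst point, then solve the best rigid transform via SVD (Kabsch)."""
    _, idx = cKDTree(dst).query(src)     # nearest-neighbour correspondences
    matched = dst[idx]
    mu_s, mu_d = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def icp(src, dst, iters=20):
    """Iterate ICP steps, accumulating the total rigid transform."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        R, t = icp_step(cur, dst)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Sanity check: align a cloud against a 10-degree-rotated, shifted copy.
rng = np.random.default_rng(0)
src = rng.standard_normal((200, 3))
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([0.1, -0.2, 0.05])
tree = cKDTree(dst)
nn_before = tree.query(src)[0].mean()
R_est, t_est = icp(src, dst)
nn_after = tree.query(src @ R_est.T + t_est)[0].mean()
print(nn_after < nn_before)  # True: alignment error shrinks
```

Pairwise alignments like this, chained across the 35 views, merge the per-angle clouds into one initial hand cloud.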

[0033] 3) Denoise, repair, and simplify the initial 3D point cloud data of the 3D hand to obtain complete 3D point cloud data of the 3D hand;
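The summary does not specify the denoising procedure of step 3); a common choice, shown here purely as an assumption, is statistical outlier removal: drop points whose mean distance to their k nearest neighbours is anomalously large.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_outliers(points, k=8, std_ratio=2.0):
    """Statistical outlier removal: discard points whose mean k-NN distance
    exceeds (global mean + std_ratio * global std) of that statistic."""
    d, _ = cKDTree(points).query(points, k=k + 1)  # column 0 is the point itself
    mean_d = d[:, 1:].mean(axis=1)
    thresh = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d <= thresh]

# A tight cluster of 100 points plus one far-away noise point.
rng = np.random.default_rng(1)
cloud = rng.standard_normal((100, 3)) * 0.01
noisy = np.vstack([cloud, [[5.0, 5.0, 5.0]]])
print(remove_outliers(noisy).shape)  # (100, 3): the stray point is dropped
```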

[0034] 4) According to the shape characteristics of the human hand, the complete 3D point cloud data of the 3D human hand is div...



Abstract

The invention belongs to the technical field of three-dimensional gesture interaction methods, and specifically relates to a gesture interaction method based on joint point transformation. It mainly addresses the technical problems of expensive equipment and poor realism of the reconstructed human hand models in existing three-dimensional gesture interaction methods. In the proposed scheme, the three-dimensional point clouds are processed together with the colour-image information synchronously captured by a Kinect camera during point cloud acquisition. Reconstruction adopts a basis-function construction method based on geodesic distance; the resulting surface representation function is a high-order differentiable parametric form that preserves the topology and surface smoothness of the human hand. Meanwhile, joint points are effectively extracted by cross-dimensional matching between the three-dimensional point clouds and the two-dimensional images. Compared with the conventional approach of reconstructing three-dimensional joint points after image-to-image matching, the method is more accurate, faster in extraction, and lower in computational load.
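The geodesic-distance-based basis functions mentioned in the abstract presuppose geodesic distances over the hand surface. The exact construction is not given in this summary; a common approximation, sketched here as an assumption, runs Dijkstra on a k-nearest-neighbour graph over the point cloud.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def geodesic_distances(points, source, k=6):
    """Approximate geodesic distance from `source` to every point by running
    Dijkstra on a k-nearest-neighbour graph with Euclidean edge weights."""
    n = len(points)
    d, idx = cKDTree(points).query(points, k=k + 1)  # column 0 is the point itself
    rows = np.repeat(np.arange(n), k)
    cols = idx[:, 1:].ravel()
    graph = csr_matrix((d[:, 1:].ravel(), (rows, cols)), shape=(n, n))
    return dijkstra(graph, directed=False, indices=source)

# Points on a quarter circle: along-surface distance exceeds straight-line.
t = np.linspace(0, np.pi / 2, 50)
arc = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
g = geodesic_distances(arc, source=0)
print(g[-1] > np.linalg.norm(arc[-1] - arc[0]))  # True: ~pi/2 > sqrt(2)
```

On a hand cloud, evaluating such distances from anchor points would give basis functions that follow the surface (e.g. around fingers) rather than cutting through space, which is what preserves topology in the reconstruction.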

Description

technical field

[0001] The invention belongs to the technical field of three-dimensional gesture interaction methods, and in particular relates to a gesture interaction method based on joint point transformation.

Background technique

[0002] In the early development of virtual reality technology, the only tools for human-computer interaction were simple external devices such as keyboards, mice, and joysticks, through which users communicated between the virtual world and the real world. With the rapid development of virtual reality technology, simple interactive tools such as the mouse and keyboard and the traditional interactive interface can no longer meet the requirements of advanced human-computer interaction, and new means of interaction have emerged in response. As a means of communication second only to language, the human hand plays an indispensable role: in environments where speech is impossible, people can still express and understand each other...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/0487; G06F3/01
Inventors: 况立群, 魏元, 韩燮, 于雅慧
Owner: ZHONGBEI UNIV