
A Gesture Interaction Method Based on Joint Transformation

A joint-point and gesture technology, applied in the field of three-dimensional gesture interaction, that addresses the problems of expensive equipment and unrealistic human hand models, offering a simple method with low computational cost and good recognition performance.

Active Publication Date: 2018-01-09
ZHONGBEI UNIV

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to solve the technical problems of existing three-dimensional gesture interaction methods, namely that the equipment used is expensive and the resulting human hand model is unrealistic, and to provide a gesture interaction method based on joint point transformation.



Examples


Embodiment Construction

[0030] A gesture interaction method based on joint point transformation in this embodiment includes the following steps:

[0031] 1) Mount the Kinect camera on a rotating platform and, by rotating the platform, collect color images and depth map information of the three-dimensional human hand from 35 viewing angles: the initial viewing angle is recorded as 0°, and the platform records data every 10° until it reaches the 350° viewing angle. Obtain the 3D point cloud data of the hand at each of the 35 viewing angles from the depth map information;
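The depth-to-point-cloud conversion in step 1) is the standard pinhole back-projection. A minimal sketch follows; the intrinsics (`fx`, `fy`, `cx`, `cy`) and the capture-angle list are illustrative assumptions, since the excerpt does not give the Kinect's calibration parameters:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (metres) into a 3D point cloud using
    the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]        # drop pixels with no depth reading

# One plausible reading of "35 viewing angles at 10° intervals":
angles = range(0, 350, 10)           # 0°, 10°, ..., 340° (35 angles)
```

Each of the 35 depth maps would be back-projected this way before registration.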

[0032] 2) Use an improved ICP registration algorithm to register the point clouds from the 35 viewing angles into initial 3D point cloud data of the hand;
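The excerpt does not detail what the "improved" ICP algorithm changes. For context only, a minimal classical point-to-point ICP (nearest-neighbour matching plus a Kabsch/SVD rigid solve) can be sketched as:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping matched
    src points onto dst points (Kabsch/SVD solution)."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:         # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def icp(src, dst, iters=50, tol=1e-8):
    """Classical point-to-point ICP: iterate (match, solve, apply)."""
    cur = src.copy()
    prev_err = np.inf
    tree = cKDTree(dst)
    for _ in range(iters):
        d, idx = tree.query(cur)         # nearest neighbour in dst
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
        err = d.mean()
        if abs(prev_err - err) < tol:    # stop once the error plateaus
            break
        prev_err = err
    return cur
```

Pairwise alignments of this kind, chained across adjacent viewing angles, merge the 35 partial scans into one cloud; the patent's improvement presumably refines this baseline.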

[0033] 3) Denoise, repair, and streamline the initial 3D point cloud data to obtain complete 3D point cloud data of the three-dimensional hand;
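The excerpt does not specify how the denoising and streamlining of step 3) are done. One common reading is statistical outlier removal followed by voxel-grid downsampling, sketched here purely as an illustration (the `k`, `std_ratio`, and `voxel` values are placeholder choices, not from the patent):

```python
import numpy as np

def remove_outliers(pts, k=8, std_ratio=2.0):
    """Statistical outlier removal: drop points whose mean distance to
    their k nearest neighbours exceeds the cloud-wide mean by more than
    std_ratio standard deviations (brute-force, for clarity)."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # skip the zero self-distance
    thr = mean_knn.mean() + std_ratio * mean_knn.std()
    return pts[mean_knn <= thr]

def voxel_downsample(pts, voxel=0.005):
    """Streamline the cloud: keep one centroid per occupied voxel."""
    keys = np.floor(pts / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()
    out = np.zeros((inv.max() + 1, 3))
    np.add.at(out, inv, pts)                # sum points per voxel
    counts = np.bincount(inv).reshape(-1, 1)
    return out / counts                     # centroid per voxel
```

Hole repair (the "repairing" step) typically needs surface-level reconstruction and is not covered by this sketch.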

[0034] 4) According to the shape characteristics of the human hand, the complete 3D point cloud data of the 3D human hand is div...


PUM

No PUM

Abstract

The invention belongs to the technical field of three-dimensional gesture interaction and in particular relates to a gesture interaction method based on joint point transformation. The invention mainly solves the technical problems of existing three-dimensional gesture interaction methods: the equipment used is expensive and the resulting human hand model is unrealistic. When collecting point cloud data, the present invention uses information from the color images synchronously captured by the Kinect camera to process the three-dimensional point cloud. In this scheme, a basis-function construction method based on geodesic distance is used for reconstruction, and the surface is represented by a higher-order differentiable parametric function, which preserves the topological properties of the human hand and the smoothness of the surface. At the same time, a cross-dimensional matching method between the 3D point cloud and the 2D image effectively extracts the joint points. Compared with traditional methods that reconstruct 3D joint points after image-to-image matching, the method in this scheme is more accurate, extracts joint points faster, and requires less computation.
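The geodesic-distance basis functions mentioned in the abstract require geodesic (on-surface) rather than Euclidean distances over the scanned hand. A standard approximation, not necessarily the patent's exact procedure, runs Dijkstra over a k-nearest-neighbour graph of the point cloud:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra
from scipy.spatial import cKDTree

def geodesic_distances(pts, k=6, source=0):
    """Approximate geodesic distances from pts[source] to every point
    by running Dijkstra over a k-nearest-neighbour graph whose edge
    weights are Euclidean distances between neighbouring points."""
    tree = cKDTree(pts)
    d, idx = tree.query(pts, k=k + 1)   # first neighbour is the point itself
    n = len(pts)
    rows = np.repeat(np.arange(n), k)
    cols = idx[:, 1:].ravel()
    vals = d[:, 1:].ravel()
    graph = csr_matrix((vals, (rows, cols)), shape=(n, n))
    return dijkstra(graph, directed=False, indices=source)
```

On a curved surface the graph distance follows the surface, so diametrically "close" points in space (e.g. fingertips of adjacent fingers) remain geodesically far apart, which is what makes such distances useful as a basis for hand reconstruction.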

Description

Technical field

[0001] The invention belongs to the technical field of three-dimensional gesture interaction methods, and in particular relates to a gesture interaction method based on joint point transformation.

Background technique

[0002] In the initial stage of virtual reality technology, the only tools for human-computer interaction were simple external devices such as keyboards, mice, and joysticks, through which users sought to communicate between the virtual world and the real world. With the rapid development of virtual reality technology, simple tools such as the mouse and keyboard and traditional interactive interfaces can no longer meet the requirements of advanced human-computer interaction, and gesture interaction emerged accordingly. As a means of communication second only to language, the human hand plays an indispensable role: in environments where spoken language cannot be used, people can still express themselves and understand each other...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/0487, G06F3/01
Inventors: 况立群, 魏元, 韩燮, 于雅慧
Owner: ZHONGBEI UNIV