Hand posture estimation method and system based on visual and inertial information fusion

A hand pose estimation technology applied in the fields of deep learning, computer vision, and human-computer interaction. It addresses the problems that a single sensor can hardly support human-computer interaction work and cannot cope with complex real interaction scenarios, thereby overcoming the limitations of spatial distance and natural environment, improving generalization ability, and achieving good real-time performance.

Pending Publication Date: 2021-08-06
TIANJIN UNIV


Problems solved by technology

[0004] Although a traditional single sensor can realize hand pose estimation in a specific scene, it is difficult for it to handle complex real interaction scenarios.

To promote research on multi-modal hand pose estimation methods, a growing number of studies focus on combining visual information with wearable information. Vision-based devices can provide absolute measurements of hand pose, while wearable devices can supplement the data when the camera fails to capture the hand image or the hand is occluded, enabling more complex human-computer interaction work. In the prior art, hand pose estimation with a single sensor achieves good results in limited settings; however, due to the inherent limitations of a single sensor, it is difficult to meet the demands of human-computer interaction in complex application scenarios. Therefore, exploring high-precision, strongly immersive, real-time hand pose estimation based on multi-modal information is of great theoretical research significance and practical application value.
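The complementary roles described above (vision gives absolute measurements; the wearable fills in when the hand is not captured or is occluded) can be illustrated with a minimal, hypothetical weighting rule. The `blend_pose` function, the confidence threshold, and the linear blend are assumptions for illustration, not the patent's actual fusion method:

```python
def blend_pose(vision_pose, imu_pose, vision_conf, conf_threshold=0.5):
    """Blend two pose estimates: trust vision when its detection
    confidence is high, fall back to the wearable estimate otherwise.

    vision_pose / imu_pose: lists of (x, y, z) joint coordinates.
    vision_conf: hand-detection confidence in [0, 1].
    (Illustrative only -- the thresholding and linear blend are assumed.)
    """
    if vision_pose is None or vision_conf < conf_threshold:
        # Camera missed the hand (out of frame / occluded): use IMU only.
        return imu_pose
    # Simple linear blend weighted by vision confidence.
    w = vision_conf
    return [tuple(w * v + (1 - w) * i for v, i in zip(vj, ij))
            for vj, ij in zip(vision_pose, imu_pose)]

# Example: one joint; vision is confident, so it dominates the blend.
print(blend_pose([(1.0, 0.0, 0.0)], [(0.0, 0.0, 0.0)], 0.8))
```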




Embodiment Construction

[0045] The present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0046] As shown in Figure 1, the specific implementation steps of the hand pose estimation method based on visual and inertial information fusion in this embodiment are as follows:

[0047] 1. Construction of Hand Pose Dataset

[0048] (101) Data collection

[0049] Visual color images for the hand pose dataset are collected using the color camera and ToF depth camera mounted on the AR glasses. The user wearing the AR glasses collects color images and depth images of hand movements from a first-person perspective.

[0050] Inertial information is collected using a simple data glove device. The data glove has 6 built-in inertial measurement units (IMUs)…
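The paragraph above is truncated, but a plausible arrangement of such 6-IMU glove data can be sketched as follows. All array shapes, the per-IMU channel layout (quaternion + accelerometer + gyroscope), and the window length are assumptions for illustration, not details taken from the patent:

```python
import numpy as np

# Hypothetical layout: each of the 6 IMUs reports a 4-D orientation
# quaternion, 3-axis acceleration, and 3-axis angular velocity per
# sample -> 10 channels per IMU.
NUM_IMUS = 6
CHANNELS = 4 + 3 + 3          # quaternion + accel + gyro (assumed)
WINDOW = 32                   # samples per time window (assumed)

def make_imu_window(readings):
    """Stack raw per-sample readings into a time window and flatten the
    IMU/channel axes so a 1-D CNN can slide over the time dimension."""
    window = np.asarray(readings, dtype=np.float32)
    assert window.shape == (WINDOW, NUM_IMUS, CHANNELS)
    return window.reshape(WINDOW, NUM_IMUS * CHANNELS)

# Simulated readings for one window (random stand-in for glove output).
raw = np.random.randn(WINDOW, NUM_IMUS, CHANNELS)
x = make_imu_window(raw)
print(x.shape)  # (32, 60)
```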



Abstract

The invention discloses a hand pose estimation method and system based on visual and inertial information fusion. The method comprises the following steps: (1) constructing a hand pose dataset; (2) extracting features: visual features are extracted from a color image acquired by the AR glasses through a ResNet50 residual network to obtain an image feature vector; inertial features are extracted by a constructed convolutional neural network to obtain an inertial feature vector; the image feature vector and the inertial feature vector are concatenated to obtain a fused feature vector; (3) performing 2D hand pose estimation; (4) performing 3D hand pose estimation; (5) carrying out network training and testing; and (6) deploying the trained hand pose estimation network model to the AR glasses and performing real-time hand pose estimation by calling the color camera and the data glove.
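The feature-fusion operation in step (2) can be sketched with plain NumPy. The 2048-dimensional image feature matches ResNet50's global-average-pooled output size; the 256-dimensional inertial feature and the random stand-ins for the two extractors are assumptions for illustration:

```python
import numpy as np

IMG_FEAT_DIM = 2048   # ResNet50 global-average-pool output size
IMU_FEAT_DIM = 256    # assumed output size of the inertial CNN

def fuse_features(img_feat, imu_feat):
    """Concatenate image and inertial feature vectors into one fused
    feature vector, as described in step (2) of the abstract."""
    assert img_feat.shape == (IMG_FEAT_DIM,)
    assert imu_feat.shape == (IMU_FEAT_DIM,)
    return np.concatenate([img_feat, imu_feat])

# Random stand-ins for the outputs of the two feature extractors.
img_feat = np.random.randn(IMG_FEAT_DIM).astype(np.float32)
imu_feat = np.random.randn(IMU_FEAT_DIM).astype(np.float32)

fused = fuse_features(img_feat, imu_feat)
print(fused.shape)  # (2304,)
```

The fused vector would then feed the 2D pose estimation head in step (3).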

Description

Technical Field

[0001] The present invention relates to the fields of human-computer interaction, deep learning, and computer vision, and in particular to a hand pose estimation method and system based on the fusion of visual and inertial information.

Background Technique

[0002] With the rapid development of the intelligent age, "human-centered" human-computer interaction has attracted widespread attention. Compared with traditional interaction methods using a mouse, keyboard, etc., emerging human-computer interaction methods that exploit the natural properties of the human body break the restrictions of operating distance, single modality, and preset rules, and have become a friendlier direction of human-computer interaction research. Gesture is an auxiliary communication method widely used in daily life. The human hand is structurally flexible, its movements are diverse, and it can express rich semantic information; gesture expression is nat...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/11; G06N3/045; G06F18/25
Inventor 金杰陈志华周梦伊白佳乐苏倩
Owner TIANJIN UNIV