Hand positioning method and device in three-dimensional space, and intelligent equipment

A hand-positioning technology for three-dimensional space, applied in image data processing, instruments, and computation, which addresses problems such as a large amount of calculation, poor robustness, and susceptibility to environmental interference.

Publication status: Inactive
Publication date: 2017-06-13
Applicant: BEIJING XINGYUANYONG NETWORK TECH

AI Technical Summary

Problems solved by technology

However, in the process of implementing the present invention, the inventors found that the existing methods for locating the position of the hand in three-dimensional space based on ordinary 2D images involve a large amount of calculation, have poor robustness, and are susceptible to environmental interference.



Examples


Embodiments

[0049] A target area containing the hand is determined in the depth image containing the hand; the target area may be slightly larger than the hand itself. Either of the two implementations described below may be used.
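As a minimal sketch of the "slightly larger than the hand" requirement shared by both implementations, the helper below pads a hand bounding box by a fixed margin and clips it to the image bounds. The box representation, the pad_ratio value, and the function name are illustrative assumptions, not details from the patent:

```python
def pad_target_area(bbox, depth_shape, pad_ratio=0.2):
    """Expand a hand bounding box so the target area is slightly
    larger than the hand itself, clipped to the depth-image bounds.

    bbox: (x_min, y_min, x_max, y_max) in pixels (illustrative format).
    depth_shape: (height, width) of the depth image.
    pad_ratio: assumed margin added on each side; not from the patent.
    """
    x_min, y_min, x_max, y_max = bbox
    pad_x = int((x_max - x_min) * pad_ratio)
    pad_y = int((y_max - y_min) * pad_ratio)
    height, width = depth_shape
    return (max(0, x_min - pad_x), max(0, y_min - pad_y),
            min(width, x_max + pad_x), min(height, y_max + pad_y))
```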

Embodiment 1

[0050] Embodiment 1: Under the condition that the multi-frame depth images collected before the current frame all contain the hand, the target area containing the hand in the current-frame depth image is determined according to the movement track of the target area containing the hand across those previously collected multi-frame depth images.

[0051] More preferably, the multi-frame depth images collected before the current frame are collected consecutively with the current-frame depth image; that is, they form a consecutively collected multi-frame sequence whose last frame is the frame immediately preceding the current frame.

[0052] As a more specific embodiment, the two frames of depth images collected before the current frame are depth image A and depth image B. Under the condition that both depth image A and depth image B contain the hand, the movement of the target area from depth image A to depth image B is used to extrapolate the target area containing the hand in the current-frame depth image.
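A minimal sketch of this trajectory-based variant, reusing pad_target_area from the sketch above. With only depth images A and B available, a constant-velocity (linear) extrapolation is one plausible reading of "movement track"; the patent does not specify the motion model:

```python
def predict_target_area(bbox_a, bbox_b, depth_shape, pad_ratio=0.2):
    """Predict the current frame's target area from the target areas
    in the two preceding depth images A and B (both containing the hand).

    Assumes the box moves with constant velocity between consecutive
    frames; the patent only says the movement track is used, so this
    linear model is an illustrative choice.
    """
    # Per-coordinate displacement of the target area from A to B.
    shift = [b - a for a, b in zip(bbox_a, bbox_b)]
    # Apply the same displacement once more to reach the current frame.
    predicted = tuple(b + d for b, d in zip(bbox_b, shift))
    # Pad and clip as before (pad_target_area from the earlier sketch).
    return pad_target_area(predicted, depth_shape, pad_ratio)
```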

Embodiment 2

[0056] Embodiment 2: Under the condition that the depth image of the frame preceding the current frame does not contain a hand, or that a new hand appears in the current-frame depth image, the target area containing the hand in the current-frame depth image is determined based on a pre-trained hand detection model and the current-frame depth image.

[0057] During specific implementation, even when the previous-frame depth image does not contain a hand, the current-frame depth image may contain one. Therefore, the pre-trained hand detection model is used to determine whether the current-frame depth image contains a hand; under the condition that it does, the target area containing the hand in the current-frame depth image is determined.
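A sketch of this detection path, assuming a generic detector interface. The detector object, its detect() method, and the returned bbox attribute are hypothetical placeholders standing in for the patent's pre-trained hand detection model, not a real API:

```python
def detect_target_area(detector, current_depth_frame, depth_shape,
                       pad_ratio=0.2):
    """Fall back to the pre-trained hand detection model when tracking
    is unavailable: no hand in the previous frame, or a new hand.

    `detector` and its `detect()` method returning an object with a
    `bbox` attribute are hypothetical placeholders.
    """
    detection = detector.detect(current_depth_frame)
    if detection is None:
        # The current-frame depth image contains no hand.
        return None
    return pad_target_area(detection.bbox, depth_shape, pad_ratio)
```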

[0058] Since th...



Abstract

The invention discloses a hand positioning method and device in three-dimensional space, and an intelligent device, used to reduce the amount of calculation required for hand positioning in three-dimensional space, improve the robustness of hand positioning, and reduce environmental interference with hand positioning. The hand positioning method in three-dimensional space comprises the following steps: collecting a depth image containing a hand; cropping from the depth image a target depth image containing only the hand, and determining, on the basis of a pre-trained hand joint-point model and the target depth image, the three-dimensional space coordinates, relative to the camera used for collecting the depth image, of each joint point in the target depth image; and determining, according to predetermined pose data of the camera in space and those camera-relative three-dimensional space coordinates, the three-dimensional space coordinates in space of each joint point of the hand in the target depth image.
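The abstract's last two steps can be illustrated as follows: back-project each joint point (pixel position plus depth) into camera-relative coordinates with a standard pinhole model, then map it into world space using the camera's pose. The intrinsics (fx, fy, cx, cy) and the 4x4 camera-to-world matrix are common conventions assumed here; the patent does not specify how the joint-point model outputs or the pose data are encoded:

```python
import numpy as np

def joint_to_camera_coords(u, v, depth, fx, fy, cx, cy):
    """Back-project one joint point (pixel u, v with depth in meters)
    into 3D coordinates relative to the camera, using a standard
    pinhole model; fx, fy, cx, cy are assumed-known intrinsics."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def camera_to_world(point_cam, camera_pose):
    """Map a camera-relative point into world space, taking the
    camera's pose data as a 4x4 camera-to-world homogeneous matrix
    (one common encoding; the patent does not fix a representation)."""
    p_homogeneous = np.append(point_cam, 1.0)
    return (camera_pose @ p_homogeneous)[:3]
```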

Description

Technical Field

[0001] The present invention relates to the technical field of smart devices, and in particular to a hand positioning method, device, and smart device for three-dimensional space.

Background Technique

[0002] With the development of virtual reality (VR) and augmented reality (AR) technology, VR and AR are gradually becoming known to the public. VR devices and AR devices are expected to become the next generation of human-computer interaction interfaces, but on the input side, that is, how users issue instructions and operate in a virtual environment, many bottlenecks remain, such as positioning the hand in three-dimensional space.

[0003] In the prior art, the mainstream method for locating the position of the hand in three-dimensional space is based on ordinary 2D images: a two-dimensional hand skeleton is extracted from the 2D image (the skeleton is composed of joint points and connecting lines...


Application Information

IPC(8): G06T7/70
CPC: G06T2207/10028
Inventor: 孙铮
Owner: BEIJING XINGYUANYONG NETWORK TECH