A method for selecting three-dimensional objects based on freehand gesture motion expression of object contours

A gesture-action and three-dimensional-object technology, applied in the field of human-computer interaction. It addresses problems such as the difficulty existing techniques have in meeting the requirements for natural and efficient selection of three-dimensional objects, the large computational load of grasping interaction technology, and the challenge of developing the corresponding software algorithms, and it achieves natural, efficient human-computer interaction with three-dimensional objects at a low computational load.

Active Publication Date: 2019-01-25
ZHEJIANG UNIV

Problems solved by technology

The advantage of grasping interaction technology is that it is natural and intuitive; its disadvantages are that it can only select objects close to the operator, its computational load is large, and the development of its software algorithms is very challenging, because the technique involves a large amount of real-time collision detection and must satisfy complex human grasping rules.
Therefore, both techniques have difficulty meeting the requirements for natural and efficient selection of 3D objects.


Embodiment Construction

[0045] The present invention will be further described in detail below with reference to the accompanying drawings and the specific embodiments of the specification.

[0046] In this embodiment, a Leap Motion gesture capture sensor is used to capture gesture signals and record the subjects' gesture data during the experiment. To meet the experimental requirements for the gesture operation space, the sensor is placed about 35 cm below the starting position of the subject's gesture, so that the subject's effective gesture operation space lies roughly 2.5 cm to 60 cm above the sensor; within this effective operation space, the capture accuracy of the sensor is 1.2 mm. An EPSON CB-X04 projector is used to project the picture onto the screen; the resolution of the whole picture is 1024 px × 768 px, and its size on the projection screen is 124 cm × 87 cm. The vertical distance between the center of the projection screen...
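
The setup above amounts to a bounded interaction volume over the sensor and a fixed-resolution projected picture. The following sketch, not taken from the patent, shows one plausible way to gate tracked palm positions to that volume and map them onto the 1024 px × 768 px picture; the coordinate convention, the assumed ±30 cm lateral range, and all function names are illustrative assumptions.

```python
# Hypothetical helpers: gate a tracked palm position to the effective
# operation space (about 2.5 cm to 60 cm above the sensor) and map it to
# the projected 1024 x 768 picture. The +/- 300 mm lateral range and the
# sensor-centred, y-up coordinate frame are assumptions for illustration.

def in_effective_space(palm_mm):
    """palm_mm = (x, y, z) in millimetres, sensor-centred, y pointing up."""
    x, y, z = palm_mm
    return 25.0 <= y <= 600.0 and abs(x) <= 300.0 and abs(z) <= 300.0

def palm_to_screen(palm_mm, width_px=1024, height_px=768, lateral_mm=300.0):
    """Linearly map the palm's x/y position to pixel coordinates."""
    x, y, _ = palm_mm
    u = (x + lateral_mm) / (2.0 * lateral_mm)   # 0..1, left to right
    v = (y - 25.0) / (600.0 - 25.0)             # 0..1, bottom to top
    px = min(max(int(u * width_px), 0), width_px - 1)
    py = min(max(int((1.0 - v) * height_px), 0), height_px - 1)
    return px, py

if __name__ == "__main__":
    palm = (40.0, 180.0, -15.0)                 # millimetres, sensor-centred
    if in_effective_space(palm):
        print(palm_to_screen(palm))             # prints the mapped pixel coordinates
```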


Abstract

The invention discloses a selection method for three-dimensional objects based on freehand gesture expression of the object contour, which comprises the following steps: (1) preprocessing: the contour of each candidate object in the operator's visual field scene is calculated and generated in real time; (2) freehand gesture input: the operator traces the contour of the desired three-dimensional object with a freehand gesture, and the real-time gesture trajectory is calculated and drawn; (3) contour matching: real-time matching is computed between the gesture trajectory and the candidate object contours generated in the preprocessing stage; (4) confirmation: the operator selects and confirms the matched candidate object by gesture. The invention can effectively solve the problem of low identification accuracy caused by small object volume and severe occlusion, and realizes efficient, natural three-dimensional object human-computer interaction that better matches users' interaction habits and operating characteristics.
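
The abstract above describes a four-step pipeline (contour generation, gesture input, contour matching, confirmation). As a minimal sketch of how such a pipeline could be wired together, assuming the candidate contours and the gesture trajectory are 2D polylines and using a simple mean point-to-point distance after resampling and normalisation as the matching score (the patent does not specify its matching criterion here, and all helper names are illustrative):

```python
import numpy as np

def resample(points, n=64):
    """Resample a polyline of shape (k, 2) to n points evenly spaced by arc length."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)])
    target = np.linspace(0.0, t[-1], n)
    return np.column_stack([np.interp(target, t, pts[:, i]) for i in range(2)])

def normalise(points):
    """Translate the polyline to its centroid and scale it to unit size."""
    pts = points - points.mean(axis=0)
    scale = np.linalg.norm(pts, axis=1).max()
    return pts / scale if scale > 0 else pts

def match_score(contour, trajectory, n=64):
    """Lower is better: mean distance between corresponding resampled points."""
    a = normalise(resample(contour, n))
    b = normalise(resample(trajectory, n))
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

def best_candidate(contours, trajectory):
    """Step (3): index of the candidate contour that best matches the gesture trajectory."""
    scores = [match_score(c, trajectory) for c in contours]
    return int(np.argmin(scores)), scores
```

In a real system, step (1) would regenerate the candidate contours each frame from the rendered scene, and step (4) would only trigger once the best score falls below a confirmation threshold chosen by the designer.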

Description

Technical field
[0001] The invention belongs to the technical field of human-computer interaction, and in particular relates to a method for selecting three-dimensional objects based on freehand gesture expression of the object contour.
Background technique
[0002] Object selection is one of the core technologies in human-computer interaction: before an object can be operated on, it must first be selected. With the rapid development of virtual reality and augmented reality technology, the selection of three-dimensional objects in virtual environments has become a research hotspot in three-dimensional human-computer interaction because of its wide application value.
[0003] At present, existing 3D object selection technologies fall roughly into two categories: pointing interaction and grasping interaction. Pointing interaction is realized through the "ray-casting" process, in which the ...
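
For context only, the "ray-casting" process mentioned above is commonly implemented by casting a ray from the pointing hand into the scene and selecting the nearest intersected object. The sketch below uses bounding spheres as a stand-in object representation; it illustrates the background technique, not the method of this invention, and the object list format and ray source are assumptions.

```python
import numpy as np

def ray_hits_sphere(origin, direction, centre, radius):
    """Return the ray parameter t of the nearest hit, or None if the ray misses."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    oc = np.asarray(centre, float) - np.asarray(origin, float)
    t_closest = float(np.dot(oc, d))
    dist2 = float(np.dot(oc, oc)) - t_closest ** 2
    if t_closest < 0 or dist2 > radius ** 2:
        return None
    return t_closest - np.sqrt(radius ** 2 - dist2)

def pick_by_ray(origin, direction, objects):
    """objects: iterable of (name, centre, radius); return the name of the closest hit."""
    hits = [(t, name) for name, centre, radius in objects
            if (t := ray_hits_sphere(origin, direction, centre, radius)) is not None]
    return min(hits)[1] if hits else None
```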


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06K9/00
CPC: G06F3/017, G06V40/113, G06V40/28
Inventors: 万华根, 韩晓霞, 李沫陶, 李嘉栋
Owner: ZHEJIANG UNIV