
Toolkit for three-dimensional interactive scenes

A three-dimensional interaction and three-dimensional scene technology, applied to input/output for user/computer interaction, processing of 3D images, and input/output processes of data processing; it can solve problems such as inconvenient operation and limited gesture functions.

Legal status: Inactive · Publication date: 2019-01-01
冯仕昌

AI Technical Summary

Problems solved by technology

[0002] In interactive application scenarios such as virtual reality (VR), virtual assembly, and 3D manufacturing, the typical interaction methods currently in use are: (1) a laser pointer, where the user holds the pointer to interact with the environment in the 3D scene; its drawback is that operation is inconvenient; (2) a 3D virtual hand, where the user interacts with the scene through a 3D virtual hand placed in it; the main drawback of this method is that the gesture functions are limited and some special functions cannot be completed.

Method used



Examples


Embodiment Construction

[0050] In order to clearly illustrate the technical features of the solution of the present invention, the solution is further elaborated below through specific embodiments.

[0051] As shown in Figure 1, a toolbox for 3D interactive scenes includes the following steps:

[0052] a. Produce a tool set for 3D scene interaction; the tool set is generated using software such as OpenGL;

[0053] b. Gesture recognition. The gesture-recognition method described in this embodiment is a CNN-SVM hybrid-model method based on an error-correction strategy. The method first preprocesses the collected gesture data, then automatically extracts features and performs prediction and classification to obtain a classification result, and finally corrects the classification result using the error-correction strategy (a hedged sketch of this pipeline follows below). The CNN-SVM hybrid-model gesture-recognition method based on the error-correction strategy comprises the following steps: ...
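As a rough illustration of the pipeline in [0053], the Python sketch below wires together preprocessing, feature extraction, SVM classification, and an error-correction step. It is a sketch under stated assumptions, not the patented implementation: the trained CNN feature extractor is replaced by a simple block-average stand-in, gesture frames are assumed to be fixed-size images, and the undisclosed error-correction strategy is assumed here to be a majority vote over recent frames.

# Hedged sketch of the CNN-SVM gesture pipeline of [0053].
# Assumptions: an 8x8 block-average pooling stands in for the trained CNN,
# frames are fixed-size images, and the error-correction strategy is a
# majority vote over the last few predicted labels.
import numpy as np
from collections import deque
from sklearn.svm import SVC

def preprocess(frame):
    """Normalize a raw gesture image to [0, 1]; collapse color channels."""
    frame = frame.astype(np.float32) / 255.0
    if frame.ndim == 3:
        frame = frame.mean(axis=2)
    return frame

def cnn_features(frame):
    """Stand-in for the trained CNN: 8x8 block-average pooling, flattened."""
    h, w = frame.shape
    blocks = frame[: h // 8 * 8, : w // 8 * 8].reshape(h // 8, 8, w // 8, 8)
    return blocks.mean(axis=(1, 3)).ravel()

class GestureClassifier:
    def __init__(self, history=5):
        self.svm = SVC(kernel="rbf")          # prediction / classification stage
        self.recent = deque(maxlen=history)   # label history for error correction

    def fit(self, frames, labels):
        X = np.stack([cnn_features(preprocess(f)) for f in frames])
        self.svm.fit(X, labels)

    def predict(self, frame):
        raw = int(self.svm.predict(cnn_features(preprocess(frame))[None, :])[0])
        # Error-correction strategy (assumed): a majority vote over recent
        # frames smooths isolated misclassifications in the gesture stream.
        self.recent.append(raw)
        values, counts = np.unique(list(self.recent), return_counts=True)
        return int(values[np.argmax(counts)])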



Abstract

The invention relates to a toolbox for three-dimensional interactive scenes, characterized in that it comprises the following steps: a. producing a tool set for three-dimensional scene interaction, the tool set being generated using software such as OpenGL; b. performing gesture recognition; c. reconstructing the 3D gesture structure in real time, using a target-tracking algorithm such as a particle filter (PF) to reconstruct the 3D gesture model sequence from the input gesture image sequence, i.e. reconstructing the 3D gesture structure and obtaining all the basic information of the gesture motion in the 3D gesture model, namely the position coordinates of each finger joint and the position coordinates of the center of gravity of the palm; d. transforming the 3D gesture model into the selected virtual tool. The invention uses a camera as the interactive input device, and the user's gesture is embodied as a virtual tool, so that the interaction tool and the user experience are basically consistent with real operating experience.
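For step c the abstract names only particle filtering (PF) as the tracking algorithm. The following is a minimal, self-contained particle-filter sketch that tracks a palm-center position through a sequence of noisy per-frame detections; the random-walk motion model, Gaussian observation model, and noise levels are assumptions made for illustration, not values taken from the patent.

# Minimal particle-filter sketch for step (c): tracking the palm-center
# position across an input gesture image sequence.  Motion model, noise
# levels, and the observation function are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500, motion_std=2.0, obs_std=5.0):
    """Track a 3D palm-center trajectory from noisy per-frame detections."""
    dim = observations.shape[1]
    particles = rng.normal(observations[0], motion_std, size=(n_particles, dim))
    weights = np.full(n_particles, 1.0 / n_particles)
    track = []
    for z in observations:
        # Predict: random-walk motion model.
        particles += rng.normal(0.0, motion_std, size=particles.shape)
        # Update: weight particles by the likelihood of the detection z.
        d2 = np.sum((particles - z) ** 2, axis=1)
        weights = np.exp(-0.5 * d2 / obs_std**2)
        weights /= weights.sum()
        # Estimate: the weighted mean is the reconstructed palm center.
        track.append(weights @ particles)
        # Resample to avoid particle degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)
    return np.array(track)

# Example: a synthetic palm-center path corrupted by detection noise.
true_path = np.cumsum(rng.normal(0, 1, size=(100, 3)), axis=0)
noisy = true_path + rng.normal(0, 5, size=true_path.shape)
estimate = particle_filter(noisy)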

Description

Technical field

[0001] The invention relates to the field of computer technology, and in particular to a toolbox for three-dimensional interactive scenes.

Background technique

[0002] In interactive application scenarios such as virtual reality (VR), virtual assembly, and 3D manufacturing, the typical interaction methods currently in use are: (1) a laser pointer, where the user holds the pointer to interact with the environment in the 3D scene; its drawback is that operation is inconvenient; (2) a 3D virtual hand, where the user interacts with the scene through a three-dimensional virtual hand placed in it; the main drawback of this method is that the gesture functions are limited and some special functions cannot be completed.

Contents of the invention

[0003] Aiming at the deficiencies of the prior art, the present invention proposes a toolbox for three-dimensional interactive scenes, with which the functions of a tool can be realized simply and conveniently using gestures...
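As a reading aid for steps a and d (building the tool set and turning the recognized gesture into a selected virtual tool), the sketch below shows one plausible way the toolbox could be organized: a registry that maps gesture class ids to virtual tools and applies the chosen tool at the reconstructed palm position. The class names, callback signature, and example tools are hypothetical; the patent text does not specify how the tool set is structured.

# Hypothetical toolbox structure binding recognized gestures (step b) and the
# reconstructed palm position (step c) to virtual tools (steps a and d).
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class VirtualTool:
    name: str
    apply: Callable[[Tuple[float, float, float]], None]  # acts at a 3D position

class InteractionToolbox:
    """Registry mapping gesture class ids to virtual tools."""
    def __init__(self) -> None:
        self._tools: Dict[int, VirtualTool] = {}

    def register(self, gesture_id: int, tool: VirtualTool) -> None:
        self._tools[gesture_id] = tool

    def handle(self, gesture_id: int, palm_position: Tuple[float, float, float]) -> None:
        """Called once per frame with the recognized gesture class and the
        reconstructed palm-center position from the tracking step."""
        tool = self._tools.get(gesture_id)
        if tool is not None:
            tool.apply(palm_position)

# Example wiring with two hypothetical tools.
toolbox = InteractionToolbox()
toolbox.register(0, VirtualTool("grab", lambda p: print("grab at", p)))
toolbox.register(1, VirtualTool("paint", lambda p: print("paint at", p)))
toolbox.handle(1, (0.2, 0.5, 1.0))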

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F3/01G06T15/00
CPCG06F3/017G06T15/005
Inventor: 冯仕昌
Owner: 冯仕昌