System and method for controlling 3D model based on gesture

A gesture control and 3D technology, applied to the input/output process of data processing, instruments, electrical digital data processing, etc., which solves the problem that translation, rotation, and scaling cannot be supported at the same time, and achieves the effects of fast query speed, simple control, and low cost.

Publication Date: 2017-07-18 (Inactive)
太炫科技(南京)有限公司

AI Technical Summary

Problems solved by technology

[0009] Aiming at the problem in the prior art that existing touch-screen engines, in which the change in distance between two fingers controls the scaling of the 3D model, cannot support the three commonly used operations of translation, rotation, and scaling at the same time, the present invention provides a system and method for controlling a 3D model based on gestures.



Examples


Embodiment 1

[0046] The present invention constructs a new gesture control system based on a gesture in which one finger is stationary while the other finger slides; the relative coordinates of the sliding finger are used to control the translation or rotation of the 3D model. As shown in Figure 2, the system for controlling a 3D model based on gestures includes a screen event catcher, which captures finger events on the touch screen in real time, obtains in real time the number of fingers on the screen and the screen coordinates of the corresponding fingers, and passes the captured finger information to the gesture processor. A finger event is an operation performed by a finger on the touch screen, including but not limited to a finger touching the screen, a finger sliding on the screen, and a finger leaving the screen.
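The patent excerpt gives no code for the screen event catcher; the following is a minimal TypeScript sketch assuming a browser TouchEvent environment. The names ScreenEventCatcher and FingerSnapshot and the "down"/"move"/"up" event labels are illustrative, not from the patent.

```typescript
type FingerEventType = "down" | "move" | "up";

interface FingerSnapshot {
  eventType: FingerEventType;
  fingerCount: number;
  points: { id: number; x: number; y: number }[];
}

class ScreenEventCatcher {
  constructor(
    element: HTMLElement,
    private onFingers: (snapshot: FingerSnapshot) => void, // handed to the gesture processor
  ) {
    element.addEventListener("touchstart", (e) => this.report("down", e), { passive: false });
    element.addEventListener("touchmove", (e) => this.report("move", e), { passive: false });
    element.addEventListener("touchend", (e) => this.report("up", e), { passive: false });
  }

  private report(eventType: FingerEventType, e: TouchEvent): void {
    e.preventDefault(); // keep the browser from scrolling or zooming the page while gesturing
    const points = Array.from(e.touches).map((t) => ({
      id: t.identifier,
      x: t.clientX,
      y: t.clientY,
    }));
    // report the current finger count and per-finger screen coordinates in real time
    this.onFingers({ eventType, fingerCount: points.length, points });
  }
}
```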

[0047] The gesture processor calculates the specific change vector of the 3D model according to the number of fingers...
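The excerpt is truncated before it spells out how the change vector is computed; as a hedged illustration, the sketch below takes the displacement of the finger that moved the most since the previous snapshot (the sliding finger) as the change vector. GestureProcessor, Point, and ChangeVector are assumed names.

```typescript
interface Point { id: number; x: number; y: number }
interface ChangeVector { dx: number; dy: number }

class GestureProcessor {
  // last known screen position of each finger, keyed by finger id
  private last = new Map<number, { x: number; y: number }>();

  // Returns the displacement of the finger that moved the most since the previous
  // call (treated as the sliding finger), or null if no finger has a history yet.
  update(points: Point[]): ChangeVector | null {
    let change: ChangeVector | null = null;
    for (const p of points) {
      const prev = this.last.get(p.id);
      if (prev) {
        const dx = p.x - prev.x;
        const dy = p.y - prev.y;
        if (change === null || Math.hypot(dx, dy) > Math.hypot(change.dx, change.dy)) {
          change = { dx, dy };
        }
      }
      this.last.set(p.id, { x: p.x, y: p.y });
    }
    return change;
  }
}
```

The renderer would then interpret this (dx, dy) either as a screen-space translation or as rotation angles about the model's axes, depending on the command mapped to the recognized gesture.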

Embodiment 2

[0053] As shown in Figures 1 and 3, a method for controlling a 3D model based on gestures proceeds as follows:

[0054] Step 1. The screen event catcher acquires event elements from the touch screen; the event elements include the number of fingers, the corresponding finger events, and the screen coordinates of the corresponding fingers.

[0055] Step 2. The gesture processor detects the number of fingers and judges whether the number of fingers touching the screen equals 2; if not, the process ends; if so, processing continues.

[0056] Step 3. The gesture processor detects the gesture, judging the state of the two fingers from the finger events and the screen coordinates of the corresponding fingers. If both fingers are moving, the process ends; if both fingers are stationary, return to Step 1; if one finger is stationary and the other is moving, processing continues (as sketched below); the judgment of whether a finger is moving is determined by the shak...
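Step 3's movement test is cut off in the excerpt (it appears to rely on a shake/jitter tolerance); the sketch below assumes a small pixel threshold, which is an assumption rather than the patent's stated criterion.

```typescript
// Assumed jitter tolerance in pixels: below this a finger counts as stationary.
const MOVE_THRESHOLD_PX = 3;

type TwoFingerState = "both-moving" | "both-stationary" | "one-stationary-one-moving";

// displacements: per-finger displacement magnitudes since the previous snapshot.
function classifyTwoFingers(displacements: [number, number]): TwoFingerState {
  const moving = displacements.map((d) => d > MOVE_THRESHOLD_PX);
  if (moving[0] && moving[1]) return "both-moving";        // end the process
  if (!moving[0] && !moving[1]) return "both-stationary";  // return to Step 1
  return "one-stationary-one-moving";                      // continue processing
}
```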

Embodiment 3

[0062] The 3D model control system also includes a gesture of one finger stationary and one finger sliding to control the translation of the 3D model, a gesture of a single finger sliding to control the rotation of the 3D model, and a gesture of two fingers sliding simultaneously to control the scaling of the 3D model. The specific implementation is as follows:

[0063] Control command mapper: records the correspondence between gestures and 3D model control commands, that is, the gesture of "one finger stationary and the other finger sliding" corresponds to the 3D model translation command, the gesture of "only one finger on the screen, and this finger sliding" corresponds to the 3D model rotation command, and the gesture of "two fingers sliding at the same time" corresponds to the 3D model scaling command;
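A minimal sketch of the gesture-to-command correspondence recorded by the control command mapper; the string labels and the mapGestureToCommand helper are illustrative names, not from the patent.

```typescript
// Gestures recognized by the system and the 3D model commands they map to.
type Gesture = "one-still-one-sliding" | "single-finger-sliding" | "two-fingers-sliding";
type ModelCommand = "translate" | "rotate" | "scale";

const controlCommandMap: Record<Gesture, ModelCommand> = {
  "one-still-one-sliding": "translate", // one finger stationary, the other sliding
  "single-finger-sliding": "rotate",    // only one finger on the screen, sliding
  "two-fingers-sliding": "scale",       // two fingers sliding at the same time
};

function mapGestureToCommand(gesture: Gesture): ModelCommand {
  return controlCommandMap[gesture];
}
```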

[0064] Step 1. The screen event catcher acquires event elements from the touch screen, and...



Abstract

The invention discloses a system and a method for controlling a 3D model based on gestures, and belongs to the field of 3D model control. The system comprises a screen event capturer that captures finger events on the touch screen in real time, a gesture processor that calculates a change vector from the coordinates obtained by the screen event capturer, a 3D model renderer that controls the display of the 3D model on the screen according to the change vector, and a control command mapper that records the 3D model control command corresponding to each gesture. The screen event capturer passes the captured finger information to the gesture processor; the gesture processor passes the calculated change vector to the 3D model renderer; the 3D model renderer controls the display on the screen; and the corresponding operations are performed accordingly. This solves the problem in the prior art that 3D model operation engines for touch screens recognize at most two-finger gestures, support only two operations, and cannot support the three common operations of translation, rotation, and zooming at the same time.

Description

technical field

[0001] The present invention relates to the field of 3D model control, and more specifically, to a system and method for controlling a 3D model based on gestures.

Background technique

[0002] A 3D model on a PC is generally controlled with the mouse: the left button, the right button, and the middle scroll wheel respectively control the translation, rotation, and scaling of the 3D model. With the rapid development of 3D models on touch screen devices, people hope that 3D models can be operated just as conveniently on the touch screen. Current 3D model operation engines for touch screens support only two gesture types: moving one finger on the screen, and moving two fingers on the screen; the three common operations of translation, rotation, and scaling cannot all be covered by just these two gestures. Therefore, there are generally two combinations for operating the 3D model on current touch screens: one is to move a single finger on the screen to control the trans...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/0484; G06F3/0488
CPC: G06F3/04845; G06F3/04883
Inventor: 王征
Owner: 太炫科技(南京)有限公司