
3D visual model control method based on gesture recognition

A model control and gesture recognition technology, applied in the field of gesture recognition, which solves the problems of unnatural human-computer interaction and a poor overall experience for visitors, achieving natural human-computer interaction and a good experience.

Pending Publication Date: 2022-04-05
CHINA APPLIED TECH CO LTD
Cites: 0 | Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

The traditional method is for the manager to operate the 3D visualization model with a mouse, for example zooming, moving, or rotating it. When presenting the model, the presenter explains verbally while gesturing at the same time; having to also operate the mouse adds further inconvenience. This way of human-computer interaction is not natural enough, and the overall experience for visitors is poor.


Image

  • 3D visual model control method based on gesture recognition

Examples


Embodiment Construction

[0031] A preferred embodiment of the present invention will be described in detail below with reference to the accompanying drawings.

[0032] The control method in the present invention is designed and developed in the Python language. The development environment is PyCharm 2021 with Python Interpreter 3.7; the libraries and development kits used are opencv-python (imported as cv2), mediapipe, and autopy.

[0033] The hand includes a thumb 11, an index finger 12, a middle finger 13, a ring finger 14 and a little finger 15.

[0034] Making a fist means pressing the fingertips against the palm of the hand or against the other fingers.
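The fist definition above can be sketched as a simple geometric test. This is a hypothetical heuristic, not the patent's own algorithm: it assumes 21 MediaPipe-style hand landmarks given as normalized (x, y) pairs, with indices following MediaPipe's convention (0 = wrist, 8/12/16/20 = fingertips, 5/9/13/17 = finger bases), and treats a hand as a fist when each fingertip lies no farther from the wrist than its own finger's base, i.e. the fingers are curled onto the palm. The `ratio` threshold and the omission of the thumb are simplifications.

```python
# Hypothetical fist test, assuming 21 MediaPipe-style landmarks as (x, y) pairs.
# Indices: 0 = wrist, 8/12/16/20 = fingertips, 5/9/13/17 = finger bases (MCP).
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def is_fist(landmarks, ratio=1.0):
    """True when every fingertip is at most `ratio` times as far from the
    wrist as its finger's base, i.e. the fingers are curled onto the palm."""
    wrist = landmarks[0]
    tips, bases = (8, 12, 16, 20), (5, 9, 13, 17)
    return all(_dist(landmarks[t], wrist) <= ratio * _dist(landmarks[b], wrist)
               for t, b in zip(tips, bases))
```

In practice the threshold would need tuning per camera setup; a curled hand gives tip-to-wrist distances smaller than base-to-wrist distances, an open hand the opposite.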

[0035] The control method is as follows:

[0036] 1. Configure a camera to obtain the image information of the controller's hand.

[0037] 2. Import the hand key point database and the hand key point recognition network model into the server; frame the effective detection area S1 of the camera. Gestures within S1 can be effectively recognized, and gestures o...
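Steps 1 and 2 can be sketched with the opencv-python and mediapipe packages named in [0032]. This is a minimal illustration, not the patent's implementation: the S1 rectangle coordinates are assumed placeholders (the text does not give them), and the `hand_in_s1` check simply requires every detected key point to fall inside S1.

```python
# Sketch of steps 1-2: capture camera frames, detect hand key points, and keep
# only hands whose key points all lie inside the effective detection area S1.
S1 = (0.1, 0.1, 0.9, 0.9)  # assumed (x_min, y_min, x_max, y_max), normalized

def hand_in_s1(points, s1=S1):
    """True when every key point (x, y) lies inside the detection area S1."""
    x0, y0, x1, y1 = s1
    return all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in points)

def main():
    import cv2                # opencv-python, listed in [0032]
    import mediapipe as mp    # provides the hand key point recognition model
    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)                 # step 1: configure the camera
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            pts = [(lm.x, lm.y)
                   for lm in result.multi_hand_landmarks[0].landmark]
            if hand_in_s1(pts):               # step 2: only gestures inside S1
                pass                          # ...recognize the gesture here
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) & 0xFF == 27:       # Esc quits
            break
    cap.release()

if __name__ == "__main__":
    main()
```

The camera loop is guarded by `__main__` so the S1 check can be exercised on its own without a camera attached.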



Abstract

The invention relates to the field of gesture recognition and discloses a 3D visual model control method based on gesture recognition. A hand is placed in front of an image acquisition device, and a hand image is acquired through the device; the effective detection area of the device is S1. When the hand is located within S1, the subsequent steps are carried out; otherwise they are not. After the hand in the image is recognized by the hand key point recognition network model, hand key points T are established, and a hand skeleton graph M is formed from the key points T. Each key point T corresponds to a matrix in the hand recognition network model; each matrix is a probability map of that key point, and the position of the key point T is found by searching for the maximum value of the probability map, after which the gesture and action of the hand are recognized. Translation, selection, release, rotation and zooming of the 3D visual model are then completed with a single finger or two fingers, so that people can operate with gestures while giving a presentation.
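The key-point lookup described in the abstract (each key point T has a probability map, and its position is where that map is maximal) can be sketched in a few lines with NumPy. The function name and the example heatmap are illustrative, not from the patent:

```python
# Sketch of the abstract's key-point lookup: a key point's position is the
# (row, col) of the maximum value of its probability map. np.unravel_index
# converts the flat argmax index back into 2D coordinates.
import numpy as np

def keypoint_from_heatmap(prob_map):
    """Return the (row, col) of the most probable key-point location."""
    r, c = np.unravel_index(np.argmax(prob_map), prob_map.shape)
    return int(r), int(c)

heatmap = np.zeros((4, 5))
heatmap[2, 3] = 0.9   # the network is most confident here
print(keypoint_from_heatmap(heatmap))  # → (2, 3)
```

Running this over the 21 probability maps (one per key point T) yields the coordinates from which the hand skeleton graph M is assembled.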

Description

technical field

[0001] The invention relates to the field of gesture recognition, in particular to a 3D visualization model control method based on gesture recognition.

Background technique

[0002] With the advancement of science and technology, human-computer interaction has become an important part of people's daily life. The ultimate goal of human-computer interaction is to realize natural communication between man and machine.

[0003] Now more and more management systems are put on a big screen for display; for example, some management systems include 3D visualization models. The traditional method is for the manager to operate the 3D visualization model with a mouse, for example zooming, moving, or rotating it. When presenting the model, the presenter explains verbally while gesturing at the same time; having to also operate the mouse adds further inconvenience. Th...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Applications (China)
IPC(8): G06F3/01; G06F3/04883; G06V40/10
Inventor: 江大白; 胡增; 杨坤龙
Owner: CHINA APPLIED TECH CO LTD