
Dynamic gesture interaction method based on monocular camera

A dynamic gesture interaction technology based on a monocular camera, applied in the field of human-computer interaction. It addresses the low precision and poor performance of prior dynamic gesture recognition methods, ensuring recognition accuracy while realizing interactive recognition of dynamic gestures.

Pending Publication Date: 2019-08-16
珠海华园信息技术有限公司 (Zhuhai Huayuan Information Technology Co., Ltd.)

AI Technical Summary

Problems solved by technology

However, palm detection relies on traditional hand-crafted features, whose performance is difficult to guarantee, and the subsequent long short-term memory (LSTM) recurrent network has a complex structure and is difficult to train, which hinders practical application.
[0006] Therefore, the dynamic gesture interaction technology in the prior art suffers from low precision and poor performance.
[0007] No effective solution to the above problems has yet been proposed.




Embodiment Construction

[0031] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0032] Referring to Figures 1-4, the present invention provides a technical solution: a dynamic gesture interaction method based on a monocular camera, the steps of which include:

[0033] Step S102, obtaining multiple video sequences containing different gestures, performing normalization and preprocessing, and adding labels to form a gesture interaction data set;
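The patent does not detail how normalization and preprocessing are performed; the following is a minimal numpy sketch of one plausible reading of Step S102, assuming uniform temporal sampling to a fixed frame count, a simple index-based resize, and pixel scaling to [0, 1]. The function names, frame count, and target size are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def preprocess_sequence(frames, num_frames=16, size=(112, 112)):
    """Normalize a raw gesture clip into a fixed-length, scaled sequence.

    frames: list of HxWx3 uint8 arrays (raw video frames).
    Returns an array of shape (num_frames, size[0], size[1], 3) in [0, 1].
    """
    # Uniformly sample a fixed number of frames from the clip.
    idx = np.linspace(0, len(frames) - 1, num_frames).astype(int)
    sampled = [frames[i] for i in idx]
    # Nearest-neighbour resize via index sampling (stand-in for a real resizer).
    resized = []
    for f in sampled:
        h, w = f.shape[:2]
        rows = np.linspace(0, h - 1, size[0]).astype(int)
        cols = np.linspace(0, w - 1, size[1]).astype(int)
        resized.append(f[np.ix_(rows, cols)])
    # Scale pixel intensities to [0, 1].
    return np.stack(resized).astype(np.float32) / 255.0

def build_dataset(clips_with_labels, num_frames=16):
    """Pair each preprocessed sequence with its gesture label (Step S102)."""
    return [(preprocess_sequence(c, num_frames), y) for c, y in clips_with_labels]
```

Fixing the sequence length up front is what later lets one identical spatial sub-branch be assigned per sampled frame.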

[0034] Step S104, obtaining multiple deep neural networks with the same structure to form multiple spatial feature extraction sub-branches;



Abstract

The invention discloses a dynamic gesture interaction method based on a monocular camera. The method comprises the steps of: obtaining a plurality of video sequences containing different gesture actions, carrying out normalization and preprocessing, and adding labels, so as to form a gesture interaction data set; obtaining a plurality of deep neural networks with the same structure to form a plurality of spatial feature extraction sub-branches; training the plurality of spatial feature extraction sub-branches on the gesture interaction data set until convergence; obtaining a single deep neural network to form a temporal feature extraction total branch, and cascading the temporal feature extraction total branch with the plurality of spatial feature extraction sub-branches to obtain a gesture recognition total model; training the gesture recognition total model on the gesture interaction data set until convergence; and inputting a gesture video sequence to be recognized into the converged gesture recognition total model to obtain a judgment result. The invention reduces the difficulty of video sequence processing while ensuring the level of precision, effectively guarantees the performance accuracy of the spatial feature extraction sub-branches, and realizes interactive recognition of dynamic gestures.
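The abstract's cascade structure (identical spatial sub-branches feeding a single temporal total branch) can be sketched minimally in numpy. The linear layers, ReLU, and mean-over-time pooling below are stand-ins chosen for illustration; the patent describes deep neural networks but does not specify their internals in the visible text.

```python
import numpy as np

rng = np.random.default_rng(0)

class SpatialBranch:
    """One spatial feature-extraction sub-branch; all sub-branches share
    the same structure. A linear layer + ReLU stands in for a deep CNN."""
    def __init__(self, in_dim, feat_dim):
        self.W = rng.standard_normal((in_dim, feat_dim)) * 0.01

    def __call__(self, frame):
        # frame: flattened pixel vector -> per-frame spatial feature
        return np.maximum(frame @ self.W, 0.0)

class TemporalBranch:
    """The single temporal feature-extraction total branch, cascaded after
    the spatial sub-branches: mean pooling over time, then a classifier."""
    def __init__(self, feat_dim, num_classes):
        self.W = rng.standard_normal((feat_dim, num_classes)) * 0.01

    def __call__(self, features):
        pooled = features.mean(axis=0)  # fuse per-frame features over time
        return pooled @ self.W          # class scores

def gesture_model(seq, branches, temporal):
    """One sub-branch per sampled frame; their outputs cascade into the
    temporal total branch, which yields the judgment result."""
    feats = np.stack([b(f.ravel()) for b, f in zip(branches, seq)])
    return int(np.argmax(temporal(feats)))
```

Because every sub-branch sees only one frame, each can be trained on static frames first (as the abstract's separate convergence step suggests), before the cascaded total model is trained end to end.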

Description

Technical Field

[0001] The invention relates to the technical field of human-computer interaction, and in particular to a dynamic gesture interaction method based on a monocular camera.

Background

[0002] With the development of science and technology, human-computer interaction has become a very important means of communication. Common human-computer interaction modes include touch interaction, keyboard interaction, mouse interaction, and voice interaction. Advances in computer vision are making gesture recognition a new mode of interaction. Compared with other methods, gesture recognition is more natural, simple, and convenient, and can be used in applications such as smart home control and somatosensory games. Among them, gesture recognition technology based on a monocular camera can be applied to ordinary color cameras at low cost; compared with expensive depth cameras, it is more suitable for large-scale and extensive deployment.

[0003] ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06K9/00, G06N3/04
CPC: G06F3/017, G06V40/28, G06N3/045
Inventors: 刘若泉 (Liu Ruoquan), 马佳丽 (Ma Jiali)
Owner: 珠海华园信息技术有限公司 (Zhuhai Huayuan Information Technology Co., Ltd.)