Gesture observation likelihood modeling method for three-dimensional tracking

A modeling method for three-dimensional tracking, applied in the field of gesture observation likelihood modeling, intended to improve tracking efficiency and accuracy.

Pending Publication Date: 2020-06-05
紫光云技术有限公司

AI Technical Summary

Problems solved by technology

[0002] With the development and popularization of artificial intelligence technology, gesture modeling and recognition are increasingly applied in human emotion recognition and intelligent traffic control. To obtain better understanding and recognition results, an efficient gesture model must be established; in particular, gesture state tracking is the most widely used technique in gesture understanding and recognition. However, current gesture modeling methods still follow the principle of gradient descent in the later recognition stage, and therefore cannot avoid falling into local minima. To solve this problem, the present invention proposes a gesture observation likelihood modeling method for three-dimensional tracking; the three-dimensional-tracking-oriented gesture observation likelihood model of the present invention arose against this background.




Embodiment Construction

[0026] It should be noted that, in the case of no conflict, the embodiments of the present invention and the features in the embodiments can be combined with each other.

[0027] In describing the present invention, it is to be understood that the orientations or positional relationships indicated by the terms "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positional relationships shown in the drawings, are only for the convenience of describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the invention. In addition, the terms "first", "second", etc. are used for descriptive purposes only and shall not be construed as indicating or implying relative importance.



Abstract

The invention provides a gesture observation likelihood modeling method for three-dimensional tracking. First, a gesture state model and the information of the scene where the gesture is located are defined as a similarity-measurement likelihood model: the gesture state model adopts a classic three-dimensional gesture modeling method, and the scene information mainly concerns depth-information similarity measurement extracted from the three-dimensional scene. Then, a gesture observation likelihood model is created from the foreground information of the classic three-dimensional gesture modeling method and the high-dimensional depth information matched with the Chamfer distance, and the gesture observation likelihood model for three-dimensional tracking is obtained from the gesture contour and the classic three-dimensional gesture modeling result.
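The abstract describes matching the projected gesture contour against observed image evidence with the Chamfer distance and turning the match score into an observation likelihood. The sketch below illustrates that general idea; the brute-force nearest-neighbour search, the Gaussian-style kernel, and the `sigma` parameter are illustrative assumptions, not the patent's actual formulation.

```python
import numpy as np

def chamfer_likelihood(edge_points, model_contour, sigma=2.0):
    """Turn a Chamfer-style contour match into an observation likelihood.

    edge_points:   (M, 2) array of (row, col) edge pixels observed in the image.
    model_contour: (N, 2) array of (row, col) points on the projected
                   gesture-model silhouette.
    Returns a value in (0, 1]; a perfect contour match yields 1.0.
    """
    # Directed Chamfer distance: for each model point, the distance to the
    # nearest observed edge pixel, averaged over the whole contour.
    diffs = model_contour[:, None, :].astype(float) - edge_points[None, :, :]
    nearest = np.sqrt((diffs ** 2).sum(axis=-1)).min(axis=1)
    d_chamfer = nearest.mean()
    # A Gaussian-style kernel maps the distance to a likelihood score
    # (assumed form; the patent does not disclose the exact kernel).
    return float(np.exp(-d_chamfer / (2.0 * sigma ** 2)))

# Toy usage: a horizontal edge segment matched exactly, then shifted 3 px.
edges = np.array([[5, c] for c in range(5, 15)])
exact = chamfer_likelihood(edges, edges.copy())      # perfect match -> 1.0
shifted = chamfer_likelihood(edges, edges + [3, 0])  # worse match -> lower score
```

In a particle-filter tracker, such a score would typically weight each candidate gesture state; on a real image the inner brute-force search would be replaced by a precomputed distance transform of the edge map.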

Description

Technical field

[0001] The invention belongs to the field of gesture estimation, and in particular relates to a gesture observation likelihood modeling method for three-dimensional tracking.

Background technique

[0002] With the development and popularization of artificial intelligence technology, gesture modeling and recognition are increasingly applied in human emotion recognition and intelligent traffic control. To obtain better understanding and recognition results, an efficient gesture model must be established; in particular, gesture state tracking is the most widely used technique in gesture understanding and recognition. However, current gesture modeling methods still follow the principle of gradient descent in the later recognition stage, and therefore cannot avoid falling into local minima. To solve this problem, the present invention proposes a gesture observation likelihood modeling method for three-dimensional tracking.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06T7/13, G06T7/136, G06T17/00
CPC: G06T17/00, G06T7/13, G06T7/136, G06V40/107
Inventor: 周智
Owner: 紫光云技术有限公司