
A hybrid motion capture system based on deep learning

A motion capture and deep learning technology, applied in the computer field, which can solve problems such as low capture accuracy and susceptibility to motion occlusion, to achieve the effect of ensuring accurate data

Pending Publication Date: 2019-01-08
Applicant: 苏州炫感信息科技有限公司
Cites: 5 | Cited by: 9
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Inertial motion capture uses inertial sensors to collect motion information from each joint of a moving object; its capture accuracy is not high, but it offers good real-time performance and can capture occluded motion. Optical motion capture uses one or more cameras to observe the same receiving point on a moving object and computes its spatial position and rotation from the parallax between views; its capture accuracy is high, but it is easily affected by motion occlusion.
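To make the parallax idea concrete, below is a minimal sketch (not taken from the patent) of linear two-view triangulation: given the projection matrices of two cameras and the pixel coordinates at which both observe the same receiving point, the point's spatial position is recovered with a least-squares solve. The camera intrinsics, the one-metre baseline, and the test point are illustrative assumptions.

```python
# Minimal two-view triangulation sketch (illustrative; not the patent's method).
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Return the 3D point seen at pixel x1 in camera P1 and pixel x2 in camera P2."""
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Assumed setup: two identical cameras one metre apart along x, both looking down +z.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

point = np.array([0.2, 0.1, 3.0, 1.0])        # ground-truth receiving point
x1 = (P1 @ point)[:2] / (P1 @ point)[2]       # its pixel in camera 1
x2 = (P2 @ point)[:2] / (P2 @ point)[2]       # its pixel in camera 2
print(triangulate(P1, P2, x1, x2))            # ~ [0.2, 0.1, 3.0]
```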




Embodiment Construction

[0028] The present invention will be further described in detail below in conjunction with the embodiments.

[0029] The embodiment of the present invention provides a deep learning-based hybrid motion capture system 100. As shown in Figure 1, the system 100 includes:

[0030] The inertial motion capture module 110 is used to obtain the inertial attitude information of the object A.

[0031] Here, as shown in Figure 2, the inertial motion capture module 110 includes an inertial measurement unit 210, which is fixed on the object A and is used to obtain angular velocity information, acceleration information, and magnetic field information of the object A. The inertial measurement unit 210 includes a three-axis gyroscope 211, a three-axis accelerometer 212, and a three-axis magnetometer 213.

[0032] Specifically, the inertial measurement unit 210 includes an attitude sensor, which is a 9-axis sensor comprising a three-axis gyroscope 211, a three-axis accelerometer 212, and a three-axis magnetometer 213.
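As an illustration of how such a 9-axis unit can be read and turned into an attitude estimate, below is a minimal sketch; it is not the processing disclosed by the patent. The field names, the 0.98 blend factor, and the complementary-filter approach are assumptions for illustration, and the magnetometer (needed for heading) is left unused for brevity.

```python
# Illustrative 9-axis IMU sample and a simple complementary attitude filter.
import math
from dataclasses import dataclass

@dataclass
class ImuSample:
    gyro: tuple    # angular velocity (rad/s) from the three-axis gyroscope
    accel: tuple   # acceleration (m/s^2) from the three-axis accelerometer
    mag: tuple     # magnetic field (uT) from the three-axis magnetometer

def update_attitude(roll, pitch, s: ImuSample, dt=0.01, alpha=0.98):
    """Blend integrated gyro rates with the gravity direction from the accelerometer."""
    # Gyro prediction: integrate body rates (small-angle approximation).
    roll_g = roll + s.gyro[0] * dt
    pitch_g = pitch + s.gyro[1] * dt
    # Accelerometer correction: gravity provides an absolute tilt reference.
    ax, ay, az = s.accel
    roll_a = math.atan2(ay, az)
    pitch_a = math.atan2(-ax, math.hypot(ay, az))
    return (alpha * roll_g + (1 - alpha) * roll_a,
            alpha * pitch_g + (1 - alpha) * pitch_a)

# One stationary sample: no rotation rate, gravity straight down, field along x.
sample = ImuSample(gyro=(0.0, 0.0, 0.0), accel=(0.0, 0.0, 9.81), mag=(30.0, 0.0, 0.0))
print(update_attitude(0.0, 0.0, sample))   # stays near (0.0, 0.0)
```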



Abstract

The embodiment of the invention discloses a hybrid motion capture system based on deep learning. The system comprises: an inertial motion capture module, used for acquiring inertial attitude information of an object; an optical motion capture module, configured to acquire optical attitude information of the object; a communication module, configured to send the inertial attitude information and the optical attitude information to an analysis module; and the analysis module, used for fusing the inertial attitude information and the optical attitude information according to a preset hybrid model to obtain an optimal motion estimate of the object. In this way, by exploiting the fact that inertial motion capture is unobstructed while optical motion capture data is accurate, and by training with deep learning, a hybrid model that integrates the characteristics of inertial and optical motion capture is obtained, ensuring the accuracy of the final data.
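The excerpt does not detail the structure of the preset hybrid model, so the following is a minimal sketch of one plausible form it could take, assuming a small fully connected network that takes the inertial pose estimate, the optical pose estimate, and a per-joint visibility flag and regresses the fused pose. The 20-joint skeleton, layer sizes, and use of PyTorch are all assumptions for illustration.

```python
# Illustrative fusion network (an assumption, not the patent's disclosed model).
import torch
import torch.nn as nn

NUM_JOINTS = 20              # assumed skeleton size
POSE_DIM = NUM_JOINTS * 6    # per joint: 3D position + 3 rotation angles

class HybridFusionModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Inputs: inertial pose, optical pose, and one visibility flag per joint.
        self.net = nn.Sequential(
            nn.Linear(POSE_DIM * 2 + NUM_JOINTS, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, POSE_DIM),   # fused (optimal) pose estimate
        )

    def forward(self, inertial_pose, optical_pose, visibility):
        x = torch.cat([inertial_pose, optical_pose, visibility], dim=-1)
        return self.net(x)

# Training would regress the fused pose toward ground truth, e.g. with an MSE loss.
model = HybridFusionModel()
inertial = torch.randn(8, POSE_DIM)
optical = torch.randn(8, POSE_DIM)
visible = torch.ones(8, NUM_JOINTS)
fused = model(inertial, optical, visible)
loss = nn.functional.mse_loss(fused, torch.randn(8, POSE_DIM))  # placeholder target
loss.backward()
```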

Description

Technical field

[0001] The present invention relates to the field of computer technology, and in particular to a hybrid motion capture system based on deep learning.

Background technique

[0002] Traditional computer-vision 3D reconstruction has the characteristics of high accuracy, high cost, and high delay, while the motion capture technology used in virtual reality has the characteristics of low cost, low delay, and high accuracy. At present, the main domestic motion capture technologies can be divided into inertial motion capture, optical motion capture, and other motion capture methods. Inertial motion capture uses inertial sensors to collect the motion information of each joint of the moving object; its capture accuracy is not high, but its real-time performance is good and it can capture occluded motion. Optical motion capture uses one or more cameras to observe the same receiving point on the moving object and computes its spatial position and rotation from the parallax between views; its capture accuracy is high, but it is easily affected by motion occlusion.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T19/00
CPC: G06T19/006
Inventor: 路晗
Owner: 苏州炫感信息科技有限公司