
Motion capture system and method based on laser large-space positioning and optical inertia complementation

A motion capture and large-space positioning technology, applied in the fields of optical devices, user/computer interaction input/output, instruments, etc. It addresses problems such as occlusion of optical positioning, error accumulation in inertial motion capture, and the resulting motion loss, and achieves effects such as resolving foot drift, reducing the number of cameras required, and reducing the likelihood of these adverse effects.

Active Publication Date: 2021-01-22
THE 28TH RES INST OF CHINA ELECTRONICS TECH GROUP CORP
Cites: 4 | Cited by: 0

AI Technical Summary

Problems solved by technology

By combining the complementary strengths of optical and inertial technologies, the invention addresses motion loss caused by occlusion in optical positioning and by error accumulation in inertial motion capture, achieving a low-cost, occlusion-resistant, interference-resistant, high-precision large-space motion capture solution.
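The page does not disclose the fusion algorithm itself, so the following is only a minimal Python sketch of the general idea of optical-inertial complementation: an absolute optical fix corrects inertial drift, and the inertial estimate bridges occlusions. The function name `fuse_position`, its parameters, and the simple weighted blend are hypothetical illustrations, not the patented method.

```python
import numpy as np

def fuse_position(optical_pos, inertial_pos, optical_valid, alpha=0.98):
    """Blend an optical position fix with an inertially integrated position.

    Hypothetical complementary-filter sketch (not the patented algorithm):
    the absolute optical fix bounds the drift of the inertial integration,
    and the inertial estimate carries the pose whenever the optical path
    is occluded.

    optical_pos   : (3,) ndarray, or None when the tracker is occluded
    inertial_pos  : (3,) ndarray, position integrated from IMU data
    optical_valid : False when optical tracking is currently occluded
    alpha         : weight given to the inertial estimate between fixes
    """
    if not optical_valid or optical_pos is None:
        return inertial_pos                                    # bridge occlusion inertially
    return alpha * inertial_pos + (1.0 - alpha) * optical_pos  # correct inertial drift
```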

Examples

Embodiment Construction

[0053] In order to make the above objects, features and advantages of the present invention more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0054] The embodiment of the present invention discloses a motion capture method based on laser large-space positioning and optical-inertial complementarity. The method serves users in military and civilian fields such as the armed forces, armed police, and public security, and provides an immersive virtual simulation training solution for application scenarios such as daily training and emergency drills. It allows trainees to enter a highly realistic virtual training environment in which the motion capture system synchronously maps real-person actions onto a virtual avatar, enabling cooperation and confrontation between real persons and between a real person and a virtual...

PUM

Property: Installation height

Abstract

The invention provides a motion capture system and method based on laser large-space positioning and optical-inertial complementation. The system comprises positioning base stations, positioning pieces, and a processing unit. The method comprises the following steps: a local area network is constructed using Bluetooth, WiFi, and a server; the positioning base stations are fixed at the four corners of the positioning space to scan it, working in cooperation with the positioning pieces to collect optical positioning data of a captured object; each positioning piece is arranged on the body of the captured object, collects the object's optical positioning data and inertial motion capture data, transmits the optical positioning data over Bluetooth, and transmits the inertial motion capture data over WiFi; and the processing unit receives, in real time over Bluetooth and WiFi, the optical positioning data and inertial motion capture data collected and transmitted by the positioning pieces, and fuses them to solve the pose information of the captured object. The system realizes large-space multi-person positioning and motion capture based on laser large-space positioning, optical-inertial complementary motion capture, IK whole-body attitude calculation, and related technologies.
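The abstract describes a split data path (optical samples over Bluetooth, inertial samples over WiFi) converging on a processing unit that fuses both streams into a pose. As an illustration only, the sketch below models that data flow with hypothetical types (`OpticalSample`, `InertialSample`, `PoseSolver`); the actual solver and the IK whole-body attitude calculation are not disclosed in this excerpt.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np

@dataclass
class OpticalSample:
    """One optical fix from the laser positioning path (received over Bluetooth)."""
    timestamp: float
    position: np.ndarray      # absolute 3D position in the tracking volume

@dataclass
class InertialSample:
    """One inertial motion capture sample (received over WiFi)."""
    timestamp: float
    orientation: np.ndarray   # quaternion (w, x, y, z) from the IMU
    acceleration: np.ndarray  # linear acceleration in the sensor frame

class PoseSolver:
    """Hypothetical fusion front end for a single positioning piece (tracker)."""

    def __init__(self) -> None:
        self.last_optical: Optional[OpticalSample] = None
        self.last_inertial: Optional[InertialSample] = None

    def on_optical(self, sample: OpticalSample) -> None:
        self.last_optical = sample

    def on_inertial(self, sample: InertialSample) -> None:
        self.last_inertial = sample

    def solve(self) -> Optional[dict]:
        """Combine the newest sample from each link into one pose estimate."""
        if self.last_optical is None or self.last_inertial is None:
            return None
        return {
            "timestamp": max(self.last_optical.timestamp,
                             self.last_inertial.timestamp),
            "position": self.last_optical.position,          # absolute but occlusion-prone
            "orientation": self.last_inertial.orientation,   # low-latency but drift-prone
        }
```

In this sketch the absolute position is taken from the laser/optical path and the orientation from the inertial path, mirroring the complementary split described in the abstract; the patent's actual fusion scheme and the per-joint IK solve are not described in this excerpt.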

Description

Technical field

[0001] The present invention relates to the field of spatial positioning and motion capture, and in particular to a motion capture system and method based on laser large-space positioning and optical-inertial complementarity.

Background technique

[0002] With the rapid development of virtual reality, human-computer interaction, network communication, and other technologies, and in order to meet the needs of the military and civilian fields, the combination of virtual and real environments is being actively promoted and the exploration of safe, efficient, and realistic training solutions is accelerating, striving to create immersive virtual simulation training systems with realistic environments, a strong sense of immersion, and easy, intuitive operation. The key to such a system is precise multi-person positioning and motion capture in a large space.

[0003] The existing mainstream spatial positioning and motion capture technologies include two cate...


Application Information

IPC(8): G06F3/01; G06T7/246; G01B11/00
CPC: G06F3/011; G06T7/246; G01B11/00; G01B11/002; G06F2203/012; G06F2203/011
Inventor: 黄婧, 雷斌, 周传龙, 徐伟, 杨光, 陈伟伟
Owner: THE 28TH RES INST OF CHINA ELECTRONICS TECH GROUP CORP