
Virtual reality driving method and virtual reality system based on arm motion capture

A motion capture and virtual reality technology, applied in the field of virtual reality systems, which addresses the problems of heavy interference from the external environment, inability to track accurately over long periods, and poor arm motion capture results, thereby improving accuracy and keeping spatial positions consistent.

Active Publication Date: 2018-11-06
SHENZHEN UNIV
View PDF · 10 Cites · 22 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, these approaches capture arm movements poorly. For example, methods based on computer vision are easily disturbed by the external environment, such as lighting conditions, backgrounds, and occluders; methods based on inertial sensors are affected by measurement noise and drift errors and therefore cannot track accurately over long periods.

Method used



Examples


Embodiment Construction

[0040] The present invention provides a virtual reality driving method and virtual reality system based on arm motion capture. To make the purpose, technical solution, and effects of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0041] Those skilled in the art will understand that, unless otherwise stated, the singular forms "a", "an", "said" and "the" used herein may also include the plural forms. It should be further understood that the word "comprising" used in the description of the present invention refers to the presence of the stated features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, e...


PUM

No PUM

Abstract

The present invention discloses a virtual reality driving method and a virtual reality system based on arm motion capture. The method comprises: when a human body wears a motion capture system, initializing a preset posture to obtain initial posture data; capturing real-time posture data of the human body and, from the real-time posture data and the initial posture data, determining a first arm posture using transformation matrices between the links of the arm; and, according to a preset built-in model, converting the first arm posture into a second arm posture of a preset virtual character and driving the virtual character according to the second arm posture. With the technical scheme of the present application, the arm posture data can be determined from the obtained initial and real-time posture data using the arm's kinematic link structure, expressed as transformation matrices between links, which improves the accuracy of arm motion recognition; based on the built-in model, the arm posture data is then converted to drive the motion of a 3D virtual character, so that the arm's spatial position on the virtual character stays consistent with that of the real person.
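The abstract describes computing an arm posture by chaining transformation matrices between the links of the arm (e.g. shoulder to elbow to wrist). The patent text does not give the exact matrices, so the following is only a minimal sketch of the general technique it names, forward kinematics with 4×4 homogeneous transforms, using assumed link lengths and single-axis joint angles for illustration:

```python
import numpy as np

def rot_z(theta):
    """4x4 homogeneous rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def translate(x, y, z):
    """4x4 homogeneous translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def arm_end_pose(shoulder_angle, elbow_angle, upper_len, fore_len):
    """Chain the link transforms shoulder -> elbow -> wrist.

    Hypothetical simplification: each joint rotates about a single axis
    (z) and each link extends along its local x-axis; a real capture
    system would use full 3-DOF joint rotations from sensor data.
    """
    T_shoulder = rot_z(shoulder_angle)                      # shoulder joint
    T_elbow = translate(upper_len, 0, 0) @ rot_z(elbow_angle)  # upper arm + elbow
    T_wrist = translate(fore_len, 0, 0)                     # forearm to wrist
    return T_shoulder @ T_elbow @ T_wrist

# A straight arm along x places the wrist at (upper_len + fore_len, 0, 0).
T = arm_end_pose(0.0, 0.0, 0.3, 0.25)
print(T[:3, 3])
```

The same chain of per-link transforms, evaluated once from the initial posture and again from each real-time posture, is what lets relative sensor readings be turned into an absolute arm pose; retargeting that pose onto a virtual character's skeleton would apply an analogous chain with the character's own link lengths.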

Description

technical field

[0001] The invention relates to the technical field of intelligent terminals, and in particular to a virtual reality driving method and a virtual reality system based on arm motion capture.

Background technique

[0002] Virtual reality (VR) is a technology that "seamlessly" integrates real-world information and virtual-world information. Using cutting-edge technologies such as computing, it simulates imaginary characters or objects that could not be experienced in the real world and superimposes them onto the real world, where they are perceived by the human visual senses, achieving an experience beyond reality. In this way, the real environment and virtual objects can be superimposed into the same space in real time. Existing virtual reality generally relies on a motion capture system to recognize human body movements and controls virtual reality characters according to those movements, especi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/011
Inventor: 蔡树彬, 温锦纯, 明仲
Owner: SHENZHEN UNIV