
Real object interaction method and system for real-time fusion of virtual and real objects

An interaction method and technology for virtual objects, which are applied in the field of physical interaction methods and systems for real-time fusion of virtual and real objects, and can solve problems such as complex construction processes.

Active Publication Date: 2020-11-06
SHANDONG UNIV
Cites: 5 · Cited by: 3

AI Technical Summary

Problems solved by technology

Template-based methods: the complexity of the template extraction process limits the types of objects that can be recognized. Learning-based methods: a data set must be built for each object to be recognized, containing image data, point cloud models, and the like, and this construction process is complicated.




Detailed Description of Embodiments

[0055] The present disclosure will be further described below in conjunction with the accompanying drawings and embodiments.

[0056] It should be noted that the following detailed descriptions are exemplary and intended to provide further explanation of the present disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.

[0057] It should be noted that the terminology used herein is for describing specific embodiments only, and is not intended to limit the exemplary embodiments according to the present disclosure. As used herein, unless the context clearly indicates otherwise, singular forms are intended to include the plural forms as well. It should further be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of the stated features, steps, operations, devices, components, and/or combinations thereof.

...



Abstract

The invention provides a real object interaction method and system for real-time fusion of virtual and real objects. The method comprises the steps of: calculating the mapping relation between the virtual reality device coordinate systems of a server and a client, including a rotation matrix and a displacement vector, so as to unify the coordinate representation of object positions; acquiring a template of a target object, modeling the template features, performing position tracking on a desktop object, extracting the object contour, obtaining the main direction of the object, acquiring two fixed points along the main direction, calculating the rotation matrix of the object relative to its initial state through singular value decomposition, and calculating the rotation angle from that rotation matrix; and obtaining, from the server side, the displacement of the real object relative to its initial position and its rotation relative to its initial orientation, adjusting the position accordingly, calculating the current position and orientation of the virtual object, and displaying the virtual object on the client's virtual reality device according to the obtained position and orientation. The invention has the advantages of natural interaction, convenient material acquisition, real-time performance, and the like.
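The coordinate-system alignment (rotation matrix plus displacement vector) and the per-object rotation estimation described in the abstract both reduce to estimating a rigid transform between two sets of corresponding points. The patent does not publish its implementation; the sketch below is an illustrative assumption using the standard SVD-based Kabsch method, with NumPy and the function names `rigid_transform` and `yaw_angle` chosen here for exposition only.

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate the rotation R and translation t with dst_i ~= R @ src_i + t,
    via the Kabsch algorithm (SVD of the cross-covariance of centered points).
    src, dst: (N, 3) arrays of corresponding 3D points (N >= 3, non-collinear)."""
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def yaw_angle(R):
    """Rotation angle about the vertical (z) axis recovered from R,
    e.g. for a desktop object rotating in the table plane."""
    return np.arctan2(R[1, 0], R[0, 0])
```

In the setting the abstract describes, `src` could be landmark positions in the server's device coordinate system (or the object's template points in their initial pose) and `dst` the same landmarks seen by the client (or the tracked points in the current frame); the recovered `R` and `t` then give the mapping relation, and `yaw_angle(R)` the in-plane rotation angle.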

Description

Technical Field

[0001] The disclosure belongs to the field of mixed reality technology, and relates to a physical object interaction method and system for real-time fusion of virtual and real objects.

Background Technique

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] Mixed reality (MR) merges the real world with the virtual world to produce new environments and visualizations in which physical and digital objects co-exist in real time. As mixed reality technology matures, traditional interaction methods are no longer well suited to scenarios where virtual and real content are integrated. Object interaction, an emerging 3D interaction method, is considered a natural and easy-to-learn interaction paradigm. Object interaction is an interactive way of interacting with digital information through physica...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06T19/00
CPC: G06F3/011; G06F3/017; G06T19/006; G06F2203/012
Inventors: 杨承磊, 宋英洁, 盖伟, 刘娟, 卞玉龙
Owner: SHANDONG UNIV