Method for maintaining spatial consistency for multi-person augmented reality interaction

An augmented reality spatial-consistency technology in the field of computer vision. It addresses problems such as limited tracking range and distance, rapidly increasing computational load, and complicated circuitry, and achieves the effect of maintaining spatial consistency with a low computational load.

Active Publication Date: 2018-12-18
NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI

AI Technical Summary

Problems solved by technology

[0005] The two approaches have complementary advantages and disadvantages. The "outside-in" solution has a simpler structure and only requires infrared point light sources to be placed on the surface of the head-mounted display; however, in practice its tracking range and distance are limited, and the amount of computation grows dramatically as the number of headsets to be tracked increases. The "inside-out" solution is essentially a distributed system in which each sensor computes its own position, so there is no upper limit on the number of tracked devices; its disadvantages are a more complicated circuit and the need to calibrate the laser-emitting base stations in advance.

Embodiment Construction

[0058] Specific embodiments of the present invention are described in further detail below in conjunction with the drawings and examples.

[0059] Through patch-based approximate processing of the scene structure to be photographed, the method of the present invention proposes a perspective-image splicing method based on spatial triangular patch fitting. First, using the Lighthouse base stations' ability to determine the orientation of each light sensor, together with the imaging of the scanning laser plane emitted by each base station in the front camera of the head-mounted display, the azimuth angles of all light sensors on the head-mounted display relative to each base station are obtained, as well as the azimuth angle of each base station relative to the head-mounted display. Then, using the known positional relationship between the multiple light sensors on the head-mounted display and the front-facing camera, the relative positional relationship between each base station and the front-facing camera is calculated...
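
The step of recovering each base station's position in the front-camera frame can be illustrated with a generic least-squares ray-intersection routine. The sketch below is not taken from the patent: the angle-to-direction convention, the function names, and the assumption that the bearing vectors have already been expressed in the front-camera frame are illustrative assumptions only.

```python
import numpy as np

def sweep_angles_to_bearing(azimuth, elevation):
    """Convert a pair of sweep angles (radians) into a unit bearing vector.
    The axis convention used here is an assumption, not taken from the patent."""
    v = np.array([np.tan(azimuth), np.tan(elevation), 1.0])
    return v / np.linalg.norm(v)

def intersect_rays(origins, bearings):
    """Least-squares intersection point of several 3D rays.

    origins  : (N, 3) known light-sensor positions in the front-camera frame
    bearings : (N, 3) direction vectors from each sensor toward one base station
    Returns the estimated base-station position in the front-camera frame.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, bearings):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)
```

With bearings to the same base station measured from three or more non-collinear sensors, intersect_rays recovers that station's coordinates in the camera frame; repeating this for each base station provides the inputs needed to define the model coordinate system in the next step.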

Abstract

The invention provides a method for maintaining spatial consistency for multi-person augmented reality interaction. Based on the principles of orientation and localization in computer vision, the azimuth angles of multiple light sensors distributed on the head-mounted display relative to each Lighthouse base station are obtained using the base stations, and the azimuth angle of each base station relative to the head-mounted display is obtained using the display's front camera. The coordinates of each base station in the coordinate system of the front camera are then calculated by combining these measurements with the known positional relationship between the light sensors and the front camera. Further, if three base stations are used to determine the model coordinate system of the virtual scene, the transformation between the head-mounted display and the model coordinate system can be established, thereby maintaining the spatial consistency of multi-person augmented reality interaction, allowing multiple head-mounted displays to be tracked with a low computational load, and eliminating the need to calibrate multiple base stations against one another.
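
As a rough illustration of the final step, the sketch below builds an orthonormal model coordinate frame from three base-station positions already expressed in the front-camera frame and returns the camera-to-model transform. The axis convention (origin at the first station, x-axis toward the second, z-axis normal to the plane of the three stations) is assumed for illustration; the patent does not prescribe it.

```python
import numpy as np

def camera_to_model(p0, p1, p2):
    """Derive the transform from the front-camera frame to a model frame
    defined by three base stations (positions given in camera coordinates).
    Assumed convention: origin at p0, x-axis toward p1, z-axis normal to
    the plane spanned by the three stations."""
    x = p1 - p0
    x = x / np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    R = np.vstack([x, y, z])   # rows: model axes expressed in camera coordinates
    t = -R @ p0
    return R, t                # model_point = R @ camera_point + t
```

Because every head-mounted display derives its own (R, t) against the same three base stations, all users place virtual content in one shared model frame without calibrating the base stations against one another, which is how the method keeps the scene spatially consistent.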

Description

Technical field
[0001] The invention belongs to the field of computer vision and relates to a method for maintaining spatial consistency, in particular a method for maintaining spatial consistency oriented toward multi-person augmented reality interaction.
Background technology
[0002] In recent years, the rapid improvement of computer processing power has provided favorable conditions for the rapid development of virtual reality (VR) and augmented reality (AR), and VR/AR display devices such as the HTC Vive, Oculus Rift, and Microsoft HoloLens have emerged. Multi-person augmented reality interaction is one of the main uses of these new AR devices. During the interaction, the augmented reality scene observed by all users should be consistent; that is, spatial consistency must be maintained. Specifically, the position and attitude (pose) of each head-mounted display device must be acquired accurately and in real time...
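
To make the notion of pose and spatial consistency concrete, the following minimal sketch (not from the patent; all names are illustrative) represents an HMD pose as a 4x4 homogeneous transform and shows how a virtual anchor expressed in a shared frame is mapped into each user's local frame.

```python
import numpy as np

def pose(R, t):
    """4x4 homogeneous pose of an HMD in the shared (model/world) frame:
    R is a 3x3 rotation, t a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def anchor_in_hmd(T_shared_hmd, anchor_shared):
    """Express a virtual anchor given in the shared frame in one HMD's local frame.
    If every HMD holds an accurate pose relative to the same shared frame,
    all users see the anchor at the same physical location."""
    p = np.append(anchor_shared, 1.0)
    return (np.linalg.inv(T_shared_hmd) @ p)[:3]
```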

Application Information

IPC(8): G06F3/01; G06T7/60; G06T7/73
CPC: G06F3/011; G06F2203/012; G06T7/60; G06T7/73
Inventors: 邓宝松, 李靖, 印二威, 鹿迎, 项德良, 闫野
Owner: NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI