
Method for showing actual object in shared enhanced actual scene in multi-azimuth way

Augmented reality technology applied to image data processing, 3D modeling, instruments, and related fields. The method addresses problems such as expensive instruments, little consideration of error effects, and neglect of information-sharing needs in shared augmented reality scenes, with the effect of reducing computational overhead.

Active Publication Date: 2012-06-20
BEIHANG UNIV
Cites 3 · Cited by 22

AI Technical Summary

Problems solved by technology

To track real objects in real time, that system uses a colour calibration object to obtain the real object's position, from which it calculates the position of the corresponding virtual model in virtual coordinates and draws it. The system lets users experience events within the virtual scene, but its instruments are expensive to use and it is limited to very few users.
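
The paragraph above describes prior-art tracking rather than the invention itself, but the coordinate mapping it mentions is easy to make concrete. Below is a minimal sketch, not the patent's implementation: all names and numbers are illustrative, and the tracker-to-virtual transform would in practice come from calibration.

```python
# Illustrative sketch only: map a tracked real-object position from the
# tracker's coordinate frame into the virtual scene's frame with a fixed
# 4x4 homogeneous transform. Values are placeholders, not from the patent.
import numpy as np

def to_homogeneous(p):
    """Append w=1 to a 3D point."""
    return np.append(p, 1.0)

# Hypothetical calibration result: rigid transform (rotation R, translation t)
# from tracker frame to virtual-scene frame.
R = np.eye(3)                      # placeholder rotation
t = np.array([0.5, 0.0, -2.0])     # placeholder translation (metres)
T_tracker_to_virtual = np.eye(4)
T_tracker_to_virtual[:3, :3] = R
T_tracker_to_virtual[:3, 3] = t

# Position of the colour calibration object as reported by the tracker.
p_tracker = np.array([0.1, 0.2, 1.5])

# Position at which the corresponding virtual model should be drawn.
p_virtual = (T_tracker_to_virtual @ to_homogeneous(p_tracker))[:3]
print(p_virtual)   # -> [ 0.6  0.2 -0.5]
```
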
[0008] Analysis of the state of research at home and abroad shows that, although many organizations and institutions are studying the multi-directional representation of real objects in collaborative augmented reality systems, three problems remain. First, most current work on shared augmented reality scenes seldom considers how to use fewer cameras to obtain a larger observation range; it assumes the given video sequences already capture all required real-environment information and does not describe how multiple video sequences should be set up. Second, once multiple video sequences are set up, existing work seldom studies how their registration information can complement one another; even where this is considered, the impact of errors on the complementary results is rarely examined, and error analysis is not fed back into the complementary calculation. Finally, when representing real objects for collaborative tasks, many methods model the entire environment, which is computationally expensive and ignores the information-sharing needs of shared augmented reality scenarios.




Detailed Description of the Embodiments

[0034] The present invention will be further described below in conjunction with the accompanying drawings, so that those of ordinary skill in the art can implement it after referring to this specification.

[0035] As shown in Figure 1, the multi-directional real-object representation method for a shared augmented reality scene of the present invention comprises the following steps:

[0036] Step 1. The multi-video-sequence set-point calculation module abstracts the shared augmented reality scene area into a planar polygon P and uses a scan-line algorithm to divide it into triangles with no overlapping areas between them: the polygon is first decomposed into several monotone polygons, and each monotone polygon is then divided into triangles. Decomposing the polygon into triangles generates line segments that connect different vertices of the polygon and lie entirely inside it; such segments are called diagonals. …
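
The scan-line/monotone decomposition named in Step 1 is fairly involved; as a lighter-weight stand-in, the sketch below triangulates a simple polygon by ear clipping, which likewise yields non-overlapping triangles whose new edges (the diagonals mentioned above) lie entirely inside the polygon. This is an illustrative substitute, not the module described in the patent.

```python
# Ear-clipping triangulation of a simple polygon (vertices in CCW order).
# A swapped-in O(n^2) alternative to the patent's scan-line / monotone
# decomposition; both produce triangles joined by interior diagonals.
def cross(o, a, b):
    """2D cross product OA x OB; > 0 means a left turn (convex corner at o)."""
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def point_in_triangle(p, a, b, c):
    """True if p lies inside (or on the boundary of) CCW triangle abc."""
    return cross(a, b, p) >= 0 and cross(b, c, p) >= 0 and cross(c, a, p) >= 0

def triangulate(poly):
    """Return the triangulation of a simple CCW polygon as index triples."""
    idx = list(range(len(poly)))
    triangles = []
    while len(idx) > 3:
        for k in range(len(idx)):
            i, j, l = idx[k-1], idx[k], idx[(k+1) % len(idx)]
            a, b, c = poly[i], poly[j], poly[l]
            if cross(a, b, c) <= 0:           # reflex corner: not an ear
                continue
            if any(point_in_triangle(poly[m], a, b, c)
                   for m in idx if m not in (i, j, l)):
                continue                       # another vertex inside: not an ear
            triangles.append((i, j, l))        # clip the ear; (i, l) is a diagonal
            del idx[k]
            break
    triangles.append(tuple(idx))               # the last remaining triangle
    return triangles

# Example: an L-shaped scene region abstracted as polygon P.
region = [(0, 0), (4, 0), (4, 2), (2, 2), (2, 4), (0, 4)]
print(triangulate(region))                     # 4 triangles (n - 2 for n = 6)
```
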



Abstract

The invention discloses a method for representing a real object from multiple directions in a shared augmented reality scene. The method comprises the following steps: optimizing the placement of multiple video sequences according to the vertex distribution of the scene area, the coverage the video sequences provide, and the number of devices required; determining the observation area of each video sequence; matching and determining the shared areas between video sequences, exploiting the property that a standard marker keeps its dimensions under different viewing directions; computing the spatial position relationships among all video sequences; completing the virtual-real registration of each video sequence using the standard marker; integrating the video sequences' local descriptions of the real object and estimating the object's three-dimensional convex hull according to computer vision principles; providing a rapid registration method for a newly entering collaborative user based on the spatial position relationships of all video sequences; and drawing the two-dimensional mapping effect of the real object in the video sequence of each direction. The method is inexpensive to apply and responds rapidly to newly entering collaborative users.
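
Of the steps listed in the abstract, the convex hull estimation is the most self-contained to illustrate. The sketch below assumes 3D sample points on the real object have already been recovered from the video sequences (synthetic points stand in here), and the use of scipy is an assumption; the patent does not name a library or a specific hull algorithm.

```python
# Illustrative sketch of the hull-estimation step: approximate the real
# object's shape by the convex hull of 3D points recovered from multiple
# views (triangulation itself omitted; random points stand in).
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
points = rng.random((50, 3))       # stand-in for triangulated surface points

hull = ConvexHull(points)
print(f"{len(hull.vertices)} hull vertices, "
      f"{len(hull.simplices)} triangular faces, volume={hull.volume:.3f}")
```
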

Description

Technical Field
[0001] The invention relates to the fields of computational geometry, image processing, and augmented reality, and in particular to a method for representing real objects in the multi-directional video sequences of a shared augmented reality scene.
Background Technology
[0002] In collaborative augmented reality, different collaborative users entering the augmented reality scene often have different viewpoints and different interactions within their respective perception and interaction areas. To enable these users to jointly complete predetermined tasks, a shared augmented reality scene must be established. This requires a collaborative augmented reality system capable of depicting a multi-directional, constantly changing real environment, with a 3D space model of the scene built from multiple video sequences. Among these requirements, the multi-directional representation of real objects in the scene is an urgent problem to be solved, inc…


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00; G06T7/00
Inventors: 陈小武, 赵沁平, 金鑫, 郭侃侃, 郭宇
Owner: BEIHANG UNIV