AR implementation method and system based on positioning of visual angle of observer

An AR implementation method and system that addresses the problems of limited depth-recognition range, limited application scope, and limited rendering capability, achieving a real scene unconstrained in size and complexity, a wide application range, and high rendering capability.

Inactive Publication Date: 2017-05-24
SHENZHEN DLP DIGITAL TECH

AI Technical Summary

Problems solved by technology

[0006] The above two methods have the following disadvantages. The first method's depth-recognition range is limited: the effect is best within 4-5 meters, and recognition is essentially impossible beyond 10 meters. In addition, because the microprocessor is integrated into the AR glasses worn by the observer, rendering capability is limited, so the scene lacks fineness and complexity and cannot support various dynamic elements. The second method has no 3D position information of the real scene; it performs augmented reality only on three-dimensional objects, not on the 3D real scene, so its scope of application is very limited.




Embodiment Construction

[0043] In order to fully understand the technical content of the present invention, the technical solutions of the present invention are further described below in conjunction with specific embodiments, but the invention is not limited thereto.

[0044] As shown in Figures 1-6, the AR implementation method based on observer perspective positioning provided by this embodiment can be used in any scene, imposing no limit on the size or complexity of the real scene, and offering high rendering capability, strong applicability, and a wide range of applications.

[0045] As shown in Figure 1, the AR implementation method based on observer perspective positioning includes:

[0046] S1. Obtain the position and posture of the observer in the real scene, and assign them to a virtual camera;

[0047] S2. Send the position and posture to the rendering center for rendering, and obtain a three-dimensional virtual scene;

[0048] S3. Send the three-d...



Abstract

The invention relates to an AR implementation method and system based on the positioning of an observer's visual angle. The method comprises the steps of: obtaining the position and posture of an observer located in a real scene, and assigning them to a virtual camera; transmitting the position and posture to a rendering center for rendering, and obtaining a three-dimensional virtual scene; transmitting the three-dimensional virtual scene to display equipment worn by the observer, superimposing the three-dimensional virtual scene on the real scene, and displaying the superimposed scene; and matching a real object in the real scene with a virtual object in the three-dimensional virtual scene. The method achieves fusion and display of the real world and a virtual world. Because rendering is performed by a rendering center integrated in a server, there is no constraint on the size or complexity of the real scene, and the method offers strong rendering capability, high applicability, and a wide application range.
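The step sequence in the abstract (assign pose to virtual camera, render on a server-side rendering center, superimpose and display on the observer-worn device) can be sketched as follows. This is an illustrative sketch only, not the patented implementation; all class and function names (`Pose`, `VirtualCamera`, `RenderingCenter`, `HeadMountedDisplay`, `ar_pipeline`) are assumptions introduced for illustration.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Observer position (x, y, z) and posture (yaw, pitch, roll) in the real scene."""
    position: tuple
    posture: tuple


class VirtualCamera:
    """Virtual camera whose pose mirrors the observer's pose (step S1)."""
    def __init__(self):
        self.pose = None

    def assign(self, pose: Pose) -> None:
        self.pose = pose


class RenderingCenter:
    """Server-side renderer (step S2): renders the 3D virtual scene for a camera pose."""
    def render(self, pose: Pose) -> dict:
        # A real implementation would rasterize the virtual scene here; this
        # placeholder returns a frame tagged with the pose it was rendered for.
        return {"frame": "virtual_scene", "pose": pose}


class HeadMountedDisplay:
    """Observer-worn display (step S3): superimposes the virtual scene on the real view."""
    def show(self, virtual_frame: dict) -> str:
        return f"real_scene + {virtual_frame['frame']}"


def ar_pipeline(observer_pose: Pose) -> str:
    camera = VirtualCamera()
    camera.assign(observer_pose)                    # S1: give observer pose to virtual camera
    scene = RenderingCenter().render(camera.pose)   # S2: render on the server
    return HeadMountedDisplay().show(scene)         # S3: superimpose and display


print(ar_pipeline(Pose((1.0, 0.0, 2.0), (0.0, 15.0, 0.0))))  # prints "real_scene + virtual_scene"
```

In this sketch the server-side `RenderingCenter` carries the rendering load, which reflects the document's stated advantage: because rendering is not done on the glasses' microprocessor, scene size and complexity are not constrained by the wearable hardware.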

Description

Technical Field

[0001] The present invention relates to the technical field of AR, and more specifically to an AR implementation method and system based on observer perspective positioning.

Background Technique

[0002] VR (Virtual Reality) and AR (Augmented Reality), as new display methods of the future, have become popular worldwide in recent years, and many companies are developing terminal displays or providing content services around VR and AR technologies. Among them, AR technology must not only perfectly superimpose the images of the virtual scene and the real scene, but also ensure that, as the observer's position and perspective move, the virtual scene keeps moving synchronously with the real scene, so it presents high technical difficulty.

[0003] At present, AR technology mainly has the following two implementation methods:

[0004] The first way is to conduct pre-3D scanning and position extraction of real scenes ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/00; G06F3/14
CPC: G06F3/14; G06T19/006
Inventors: 刘林运, 温晓晴, 丁淑华, 田媛
Owner: SHENZHEN DLP DIGITAL TECH