
Spatial positioning method based on mobile terminal application augmented virtual reality technology

A spatial positioning technology for mobile terminals, applied to the input/output of user/computer interaction and the input/output processes of data processing instruments. It solves the problem of models not lying on the same plane, improves efficiency and spatial range, and offers smooth operation and a good user experience.

Active Publication Date: 2020-07-31
李斌

AI Technical Summary

Problems solved by technology

[0010] The invention provides a spatial positioning method based on mobile terminal application of augmented virtual reality technology, to solve the problem, caused by multiple positioning reference planes in earlier spatial positioning augmented reality techniques, that placed models do not lie on the same horizontal plane.


Figures 1–3: Spatial positioning method based on mobile terminal application augmented virtual reality technology

Examples


Embodiment 1

[0054] As shown in Figure 1, a spatial positioning method based on mobile terminal application of augmented virtual reality technology includes the following steps:

[0055] Step S1. Call the camera through the AR application on the mobile terminal so that the camera focuses on the reference plane;

[0056] Step S2. Calculate the position of the reference plane;

[0057] Step S3. Place the first model at the reference point on the reference plane;

[0058] Step S4. Move the camera to the position where the second model is to be placed;

[0059] Step S5. Extend the reference plane to generate a virtual plane, and generate a virtual ray with the camera as the origin;

[0060] Step S6. Generate a new reference point at the intersection of the virtual plane and the virtual ray;

[0061] Step S7. Place the second model at the new reference point, completing the spatial positioning;

[0062] Step S8. To place the Nth model, repeat steps S4 to S7.

[0063] Wherein, step S1 includes: ...
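The core geometry of steps S5 and S6, intersecting a camera ray with the extended reference plane, can be sketched as follows. This is a minimal illustration, not the patent's implementation: the ground plane y = 0, the camera pose, and all function names are assumptions chosen for the example.

```python
import numpy as np

def intersect_ray_plane(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the point where a ray meets an (infinite) plane,
    or None if the ray is parallel to or points away from it."""
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    if t < 0:
        return None  # intersection lies behind the camera
    return ray_origin + t * ray_dir

# Illustrative setup: camera 1.5 m above the ground, looking 45 degrees
# downward; the virtual plane is the ground plane y = 0 extended without bound.
camera = np.array([0.0, 1.5, 0.0])
direction = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)
ground_point = np.array([0.0, 0.0, 0.0])
ground_normal = np.array([0.0, 1.0, 0.0])

# The intersection becomes the new reference point for the next model (step S7).
new_ref = intersect_ray_plane(camera, direction, ground_point, ground_normal)
```

Because the plane is extended indefinitely (step S5), a valid intersection exists wherever the camera looks below the horizon, so every model lands on the same plane as the first one.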

Embodiment 2

[0086] As shown in Figure 2, the steps are as follows:

[0087] 1. Call the camera through the AR application and focus it on the reference plane;

[0088] 2. Calculate the position of the reference plane;

[0089] 3. Place the first model at the reference point on the reference plane;

[0090] 4. Move the camera to the approximate position where the second (or Nth) model is to be placed;

[0091] 5.1. Generate a virtual plane by extending the original reference plane;

[0092] 5.2. Generate a virtual ray with the camera as the origin;

[0093] 6. Generate a new reference point at the intersection of the virtual plane and the virtual ray;

[0094] 7. Place the second (or Nth) model at the new reference point.

[0095] After the second model has been placed, each subsequent model requires only a repeat of steps 4 to 7.
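Step 5.2 generates the virtual ray from the camera's pose. One common way to do this, sketched here under assumed conventions (a 4x4 camera-to-world matrix with the camera looking along its local -Z axis, as in OpenGL/ARKit-style frames; the function name is illustrative, not from the patent):

```python
import numpy as np

def camera_ray(pose):
    """Given a 4x4 camera-to-world pose matrix, return the ray origin
    (the camera position) and its unit direction (the viewing axis)."""
    origin = pose[:3, 3]           # translation column = camera position
    direction = -pose[:3, 2]       # camera looks along local -Z (assumed convention)
    return origin, direction / np.linalg.norm(direction)

# Identity pose: camera at the world origin looking down -Z.
pose = np.eye(4)
origin, direction = camera_ray(pose)
```

Intersecting this ray with the extended virtual plane (step 6) then yields the new reference point without re-detecting a plane, which is what keeps every model on the original plane.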

[0096] As shown in Figure 3, when the observer points the mobile terminal vertically downward to identify the ground, the correct distance is easily obtained; in this case the error is less than ...



Abstract

The invention provides a spatial positioning method based on mobile terminal application of augmented virtual reality technology, comprising the following steps: S1, calling a camera through an AR application on a mobile terminal and focusing the camera on a reference plane; S2, calculating the position of the reference plane; S3, placing a first model at a reference point on the reference plane; S4, moving the camera to the position where a second model is to be placed; S5, extending the reference plane to generate a virtual plane, and generating a virtual ray with the camera as the origin; S6, intersecting the virtual plane with the virtual ray to generate a new reference point; S7, placing the second model at the new reference point, completing the spatial positioning. The method solves the problem, caused by multiple positioning reference planes in earlier spatial positioning augmented reality techniques, that placed models do not lie on the same horizontal plane.

Description

Technical field

[0001] The present invention relates to the fields of image pattern recognition, augmented virtual reality, and augmented virtual reality spatial positioning, and more particularly to a spatial positioning method based on mobile terminal application of augmented virtual reality technology.

Background technique

[0002] AR augmented virtual technology captures the real physical space through the camera attached to a mobile terminal, places virtual objects or animals (hereinafter referred to as models) in that physical space, and presents the model and the real space together through the display device.

[0003] There are two ways to place the model:

[0004] The first: screen positioning.

[0005] Without active spatial tracking and recognition, the placed model is anchored to a certain position on the screen captured by the mobile terminal and can also move along the two-dimensional coordinate system of the screen; this is generally called...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/011; G06F2203/012; Y02D30/70
Inventor: 李斌
Owner: 李斌