
Outdoor augmented reality application method based on cross-source image matching

An outdoor augmented reality application method technology, applied in the field of outdoor augmented reality, can solve the problems that pre-placed markers are impractical in outdoor scenes and that multi-sensor fusion accuracy is not robust to lighting changes and occlusion, which degrade the augmented reality effect, and achieves the effect of improving augmented reality.

Active Publication Date: 2020-06-09
XIAMEN UNIV

AI Technical Summary

Problems solved by technology

[0002] In the related art, augmented reality applications are concentrated mainly in indoor scenes and rely on pre-placed markers to assist virtual-real registration. In outdoor scenes, however, the increased scale and complexity make pre-placed markers impractical, so most outdoor augmented reality applications are based on sensor positioning and vision methods and are mainly used in static scenes; the fusion accuracy of multiple sensors is not robust to lighting changes and occlusion, which degrades the augmented reality effect.



Examples


Embodiment Construction

[0031] Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions. The embodiments described below with reference to the accompanying drawings are exemplary; they are intended to explain the present invention and should not be construed as limiting it.

[0032] In order to better understand the above technical solutions, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention can be implemented in various forms and should not be limited by the embodiments set forth herein. On the contrary, these embodiments are provided to enable a more thorough ...



Abstract

The invention provides an outdoor augmented reality application method based on cross-source image matching. The method comprises the steps of: obtaining a camera image and a rendered image that matches it, and processing the two images to obtain local camera image blocks and local rendered image blocks that are matched in pairs; constructing a deep learning model from an auto-encoder and a twin (siamese) network, and training the deep learning model; extracting feature descriptors of the local camera image blocks and local rendered image blocks to be matched with the trained deep learning model, and performing cross-source image matching on them according to the extracted feature descriptors to obtain a cross-source image matching result; obtaining the correspondence between the cross-source images from the matching result, and calculating the virtual-real registration transformation from that correspondence; and realizing the outdoor augmented reality application according to the virtual-real registration transformation, thereby improving the augmented reality effect.
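
To make the pipeline in the abstract concrete, the sketch below shows one way the descriptor network and the downstream registration could look. It is a minimal illustration assuming PyTorch, OpenCV and NumPy; the patch size, layer widths, contrastive margin, the brute-force matcher and the PnP-RANSAC pose solve are illustrative assumptions, not parameters or steps taken from the patent text.

# Minimal sketch of the abstract's pipeline: a shared-weight ("twin") convolutional
# auto-encoder maps 64x64 grayscale camera / rendered patches to L2-normalised
# descriptors; descriptors are matched across sources; the resulting 2D-3D
# correspondences feed a PnP solve that yields the virtual-real registration
# transformation. All concrete sizes and library calls are assumptions.
import cv2
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


class PatchAutoEncoder(nn.Module):
    """Auto-encoder whose encoder doubles as the cross-source descriptor network."""

    def __init__(self, desc_dim: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(                                # 1x64x64 in
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),     # -> 32x32x32
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),    # -> 64x16x16
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),   # -> 128x8x8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, desc_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(desc_dim, 128 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, patch):
        desc = F.normalize(self.encoder(patch), dim=1)  # unit-length descriptor
        return desc, self.decoder(desc)


def twin_loss(model, cam_patch, rend_patch, label, margin=1.0):
    """Siamese training objective: reconstruct both branches with the shared
    auto-encoder and pull matching cross-source descriptors together with a
    contrastive term (label = 1 for matching pairs, 0 for non-matching)."""
    d_cam, r_cam = model(cam_patch)
    d_rend, r_rend = model(rend_patch)
    recon = F.mse_loss(r_cam, cam_patch) + F.mse_loss(r_rend, rend_patch)
    dist = torch.norm(d_cam - d_rend, dim=1)
    contrastive = torch.mean(label * dist.pow(2)
                             + (1 - label) * F.relu(margin - dist).pow(2))
    return recon + contrastive


def register(cam_desc, cam_pts2d, rend_desc, rend_pts3d, K):
    """Cross-source matching followed by virtual-real registration: nearest-
    neighbour matching of descriptors, then PnP-RANSAC to recover the camera
    pose relative to the 3D model behind the rendered view."""
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(cam_desc.astype(np.float32),
                            rend_desc.astype(np.float32))
    obj = np.float32([rend_pts3d[m.trainIdx] for m in matches])   # 3D points
    img = np.float32([cam_pts2d[m.queryIdx] for m in matches])    # 2D points
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj, img, K, None)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec   # rotation and translation used to overlay virtual content

In use, one would train PatchAutoEncoder with twin_loss on paired camera/rendered patch blocks, then call register with the learned descriptors, the 2D keypoint locations of the camera patches, the 3D model points behind the rendered patches, and the camera intrinsics K. These helper names and the training details are hypothetical; the patent only specifies the auto-encoder/twin-network descriptor learning and the registration from cross-source correspondences.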

Description

Technical field

[0001] The present invention relates to the technical field of outdoor augmented reality, and in particular to an outdoor augmented reality application method based on cross-source image matching, a computer-readable storage medium and a computer device.

Background technique

[0002] In the related art, augmented reality applications are concentrated mainly in indoor scenes and rely on pre-placed markers to assist virtual-real registration. In outdoor scenes, however, the increased scale and complexity make pre-placed markers impractical, so most outdoor augmented reality applications are based on sensor positioning and vision methods and are mainly used in static scenes; the fusion accuracy of multiple sensors is not robust to lighting changes and occlusion, which degrades the augmented reality effect.

Summary of the invention

[0003] The present invention aims to solve one of the technica...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T19/00, G06T15/00, G06T17/20
CPC: G06T19/006, G06T15/005, G06T17/20
Inventors: 王程, 刘伟权, 卞学胜, 沈雪仑, 赖柏锜, 李渊, 李永川, 贾宏
Owner: XIAMEN UNIV