
Indoor navigation method based on augmented reality

An augmented-reality indoor navigation technology, applied in the field of indoor navigation based on augmented reality, which addresses problems such as the poor accuracy of signal-strength-based indoor positioning and the aesthetic impact of posted markers, achieving high accuracy, strong fault tolerance, and high speed.

Pending Publication Date: 2021-09-10
SHENYANG INST OF ENG

AI Technical Summary

Problems solved by technology

[0002] Unlike outdoor navigation, which can use GPS for accurate positioning, indoor navigation cannot rely on GPS. Indoor positioning therefore currently relies mostly on QR codes or Bluetooth devices. QR-code positioning requires posting large numbers of QR codes indoors, which spoils the appearance, and the codes are easily tampered with. Bluetooth positioning requires deploying many Bluetooth devices and locating the user by measuring the communication signal strength between them; this approach is not only costly, but the signal strength is easily affected by external interference, resulting in poor accuracy.




Embodiment Construction

[0019] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0020] As shown in figure 1, the present invention uses the cloud's map-editing function to define and edit the shopping-mall map, and provides indoor positioning and navigation to users through the cloud's shop-front recognition function. The present invention has the following characteristics:

[0021] (1) The cloud map-editing function is the basis of the positioning and navigation functions and is responsible for defining and editing the shopping-mall map. Fully enclosed areas on the map are no-passing areas; shops, stairs, toilets, and the like are all fully enclosed areas, and each can be given its own identifier for positioning and navigation. The fully connected area is the area passable to users, and positioning and navigation can only be performed in the fully connected area outside the ...
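The enclosed/passable map model described in (1) can be pictured as a grid: fully enclosed cells (shops, stairs, toilets) block movement, and route planning runs only over the connected walkway cells. The following is a minimal sketch under assumed representations; the grid layout, cell symbols, and the BFS planner are illustrative, not the patent's actual data structure.

```python
from collections import deque

# Assumed mall map: 'S'/'T' mark fully enclosed areas (shop, toilet),
# '.' marks the connected walkway where users can be positioned.
GRID = [
    "S.T",
    "...",
    "..S",
]

def passable(x, y):
    """A cell is passable only if it lies inside the map and is walkway."""
    return 0 <= y < len(GRID) and 0 <= x < len(GRID[0]) and GRID[y][x] == "."

def shortest_path(start, goal):
    """Breadth-first search over walkway cells; enclosed cells are never
    entered, mirroring the no-passing rule for fully enclosed areas."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        x, y = path[-1]
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if passable(nx, ny) and (nx, ny) not in seen:
                seen.add((nx, ny))
                queue.append(path + [(nx, ny)])
    return None  # destination unreachable from the walkway

print(shortest_path((1, 0), (0, 2)))
```

A real system would derive such a grid from the cloud-edited map polygons; BFS suffices here because all walkway cells have equal cost.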



Abstract

The invention relates to the fields of augmented reality and indoor navigation, and in particular to an augmented-reality-based indoor navigation method applicable to shopping malls. The shop front is used as the basis for positioning and navigation, and the system consists of user-side software and cloud-side software. The user-side software acquires shop-front images from the camera, transmits them to the cloud, downloads positioning data from the cloud, and draws AR navigation information. The cloud-side software hosts a trained deep-learning network that recognizes shop fronts; it is also responsible for editing and storing the shopping-mall map data and for transmitting positioning information. The method comprises the following steps: the user-side software photographs a shop front and transmits the image to the cloud deep-learning network, which returns a recognition result; according to the returned result, the user-side software retrieves the corresponding shop in the map data and positions the user in the walkway in front of the shop door; the user then enters a destination, and AR navigation information is generated from the map data using the current position and the destination position.
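The pipeline in the abstract (photograph a shop front, let the cloud identify it, fix the user's position in front of that shop, then build AR guidance toward the destination) can be sketched end to end. All names, coordinates, and the arrow encoding below are illustrative assumptions, not the patent's actual implementation.

```python
# Assumed cloud-side map data: shop id -> door position along one walkway.
SHOP_DOORS = {"cafe": (0, 0), "bookstore": (4, 0)}

def cloud_recognize(photo: bytes) -> str:
    """Stand-in for the cloud-side deep-learning network; a real system
    would classify the uploaded shop-front image."""
    return "cafe"

def navigate(photo: bytes, destination: str) -> list[str]:
    """User-side flow: upload the photo, position the user in front of
    the recognized shop's door, then emit AR direction arrows."""
    x, _ = SHOP_DOORS[cloud_recognize(photo)]   # steps 1-2: locate user
    dx = SHOP_DOORS[destination][0] - x         # step 3: route along walkway
    return ["right" if dx > 0 else "left"] * abs(dx)

print(navigate(b"<jpeg bytes>", "bookstore"))  # -> ['right', 'right', 'right', 'right']
```

In the patented system the route would come from the edited map data rather than a one-dimensional offset, and the arrows would be rendered as AR overlays on the camera feed.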

Description

Technical field

[0001] The invention relates to the fields of augmented reality and indoor navigation, and in particular to an augmented-reality-based indoor navigation method applicable in shopping malls.

Background technique

[0002] Unlike outdoor navigation, which can use GPS for accurate positioning, indoor navigation cannot rely on GPS. Indoor positioning therefore currently relies mostly on QR codes or Bluetooth devices. QR-code positioning requires posting large numbers of QR codes indoors, which spoils the appearance, and the codes are easily tampered with. Bluetooth positioning requires deploying many Bluetooth devices and locating the user by measuring the communication signal strength between them; this approach is not only costly, but the signal strength is easily affected by external interference, resulting in poor accuracy.

Contents of the inven...

Claims


Application Information

IPC (8): G01C21/20; G06K9/00; H04L29/08
CPC: G01C21/206; H04L67/025; H04L67/06; H04L67/12; H04L67/52
Inventors: 李波 (Li Bo), 王祥凤 (Wang Xiangfeng), 高强 (Gao Qiang)
Owner: SHENYANG INST OF ENG