Relative navigation method for eye movement interaction augmented reality

An augmented-reality relative-navigation technology, applied in navigation, user/computer interaction input/output, surveying and mapping, and related fields. It addresses problems such as safety risks, the lack of a visual interaction process, and insufficient consideration of users' cognitive abilities and their ability to convert between absolute and relative space. It achieves the effects of improving the guidance success rate, reducing distraction, and improving the navigation experience.

Active Publication Date: 2019-09-27
WUHAN UNIV


Problems solved by technology

[0006] The technical problem to be solved by the present invention addresses the following defects in the prior art: current navigation systems are based on absolute coordinates and do not fully consider people's cognition of, and ability to convert from, absolute space to relative space; during navigation, displaying the path on the navigation-system screen requires the user to look away from the road, creating considerable safety risks; and sensory interaction relies mainly on manual and voice interaction, lacking a more intuitive visual interaction process. The invention therefore provides a relative navigation method for eye-movement-interaction augmented reality.




Embodiment Construction

[0052] To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.

[0053] The present invention mainly realizes dynamic projection of virtual labels for prominent landmarks and visual dynamic relative guidance, based on augmented reality and a relative-relationship model, and realizes intelligent human-computer visual interaction between a wearable eye tracker and the navigation system. Specifically, it includes generating virtual labels for prominent landmarks and calculating their real-time relative-relationship attributes; based on augmented reality technology, the virtual labels of prominent landmarks (including relative-relationship attribute information) and ...
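The eye-tracker interaction described above can be illustrated with a minimal sketch of dwell-based gaze selection, a common way for a wearable eye tracker to "click" a virtual label. All names here (`GazeSample`, `LabelRegion`, `dwell_selection`) and the dwell threshold are illustrative assumptions, not details from the patent:

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: select a virtual landmark label once the user's gaze
# dwells on it continuously for a minimum duration.

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # gaze x in screen pixels
    y: float  # gaze y in screen pixels

@dataclass
class LabelRegion:
    name: str
    cx: float      # label center x in pixels
    cy: float      # label center y in pixels
    radius: float  # circular hit radius in pixels

def dwell_selection(samples, labels, dwell_s=0.6):
    """Return the first label fixated continuously for at least dwell_s seconds."""
    current, start = None, None
    for s in samples:
        # Find which label region (if any) the gaze point falls inside.
        hit = next((l for l in labels
                    if math.hypot(s.x - l.cx, s.y - l.cy) <= l.radius), None)
        if hit is not current:
            current, start = hit, s.t  # gaze moved to a different target; restart timer
        if current is not None and s.t - start >= dwell_s:
            return current
    return None
```

In practice a real system would also smooth gaze noise and give visual feedback (e.g. a filling ring) while the dwell timer runs; this sketch keeps only the core selection logic.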



Abstract

The invention discloses a relative navigation method for eye-movement-interaction augmented reality. The method comprises the following steps: 1) extract prominent-landmark and POI coordinate information along the navigation path; 2) label the prominent landmarks and generate virtual labels according to the defined landmark classification and label model; 3) calculate each landmark's instantaneous relative-azimuth attribute information with a relative-azimuth calculation model, and attach this information to its virtual label; 4) map each prominent landmark's virtual label into the real scene at its azimuth relative to the user; and 5) conduct visual interaction during navigation between a wearable eye tracker and the navigation system. The method reduces safety risks during navigation, fully accounts for the limits of users' reasoning about and cognition of two-dimensional absolute space, and effectively helps users complete navigation tasks in a first-person real-scene environment.
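Step 3 above hinges on converting an absolute map bearing into a bearing relative to the user's heading, so a label can read "ahead-left" instead of "northwest". A minimal sketch follows; the flat-earth approximation, the bearing convention (degrees clockwise from north), and the descriptive buckets are assumptions for illustration, not the patent's actual relative-azimuth model:

```python
import math

def absolute_bearing(user, landmark):
    """Bearing from user to landmark, in degrees clockwise from north.
    user/landmark are (east, north) coordinates; flat-earth approximation."""
    dx = landmark[0] - user[0]  # east offset
    dy = landmark[1] - user[1]  # north offset
    return math.degrees(math.atan2(dx, dy)) % 360.0

def relative_bearing(user, heading_deg, landmark):
    """Signed bearing relative to the user's heading:
    negative = to the left, positive = to the right, in (-180, 180]."""
    return (absolute_bearing(user, landmark) - heading_deg + 180.0) % 360.0 - 180.0

def describe(rel):
    """Map a signed relative bearing to a coarse verbal direction (illustrative buckets)."""
    if abs(rel) < 30:
        return "ahead"
    if abs(rel) > 150:
        return "behind"
    side = "left" if rel < 0 else "right"
    return f"ahead-{side}" if abs(rel) <= 90 else f"behind-{side}"
```

For example, a landmark due west of a north-facing user gets a relative bearing of -90 degrees and is described as "ahead-left"; recomputing this each frame is what keeps the label's relative-azimuth attribute "instantaneous".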

Description

Technical field

[0001] The invention relates to the fields of navigation technology, intelligent interaction technology, and augmented reality technology, and in particular to a relative navigation method for eye-movement-interaction augmented reality.

Background technique

[0002] Navigation is an important application field of GIS theory research. Existing navigation technology includes trajectory navigation and fuzzy navigation. In trajectory navigation, after a GPS signal is received, the traveled route is recorded and used as a reference to guide navigation. In fuzzy navigation, navigation is simulated according to the planned route; no GPS signal is required, and simulated navigation can be performed once the start point and destination are set. 3D real-scene navigation displays the buildings along the road in three-dimensional form, giving a more realistic effect. Real-time video superimp...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C 21/32; G01C 21/36; G06F 3/01; G06F 16/29
CPC: G01C 21/32; G01C 21/3647; G01C 21/3664; G01C 21/3679; G06F 3/013; G06F 16/29; G06F 2203/012
Inventors: 方志祥, 管昉立
Owner: WUHAN UNIV