
System and method for augmented reality visualization based on sensor data

An augmented reality and sensor technology, applied to input/output processes in data processing, electrical digital data processing, instruments, and the like, that addresses problems such as limited access to information.

Pending Publication Date: 2022-05-17
THE BOEING CO
Cites: 0 · Cited by: 0

AI Technical Summary

Problems solved by technology

Size and weight constraints may limit the number of fixed displays, so some crew members may not be able to access information as and when needed.




Embodiment Construction

[0023] Embodiments described herein are directed to augmented reality visualizations. In one particular example, a user in a vehicle wears an augmented reality (AR) headset. A device receives headset sensor data from one or more headset sensors coupled to the AR headset, and vehicle sensor data from one or more vehicle sensors coupled to the vehicle. A user movement estimator at the device determines the portion of the AR headset's movement caused by movement of the user's head rather than by movement of the vehicle. In one particular example, the user turns 5 degrees to look toward the dashboard while the vehicle turns left. In this example, the vehicle sensor data indicates that the vehicle has turned 90 degrees, and the headset sensor data indicates that the AR headset has turned 95 degrees (90 degrees due to the vehicle's movement plus 5 degrees due to the user's movement). The user movement estimator determines the user portion of the movement, 5 degrees, by subtracting the vehicle's rotation from the headset's rotation.
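The worked example above can be sketched in a few lines. This is a hypothetical simplification, not the patent's implementation: it assumes both rotations are yaw deltas over the same interval, measured in degrees in a common world frame, and the function name is invented for illustration.

```python
def estimate_user_head_rotation(headset_delta_deg: float,
                                vehicle_delta_deg: float) -> float:
    """Return the portion of headset rotation attributable to the user's head.

    Hypothetical sketch: assumes headset and vehicle yaw changes are
    measured over the same interval in the same world frame, so the
    user's contribution is simply the difference.
    """
    return headset_delta_deg - vehicle_delta_deg


# Numbers from the text: vehicle turns 90 degrees, headset reads 95 degrees total.
user_portion = estimate_user_head_rotation(95.0, 90.0)
print(user_portion)  # the 5-degree head turn toward the dashboard
```

A real estimator would fuse noisy IMU streams (and likely work with full 3-D orientations rather than a single yaw angle), but the core idea claimed here is this subtraction of vehicle-induced motion.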



Abstract

The invention relates to a system and method for augmented reality visualization based on sensor data. A device is configured to estimate a gaze target of a user of an augmented reality headset in a first vehicle based on vehicle sensor data and headset sensor data, and to generate visualization data based on the gaze target. In response to determining that the gaze target is inside the first vehicle, the visualization data includes a first visual depiction of a first point of interest outside the first vehicle. The first point of interest includes at least a portion of a particular route of a particular vehicle, where the particular vehicle is the first vehicle or a second vehicle. In response to determining that the gaze target is outside the first vehicle, the visualization data includes a second visual depiction of a second point of interest inside the first vehicle.
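The inside/outside selection logic described in the abstract can be illustrated with a minimal sketch. The function and return strings are invented for illustration; the actual system would render graphical depictions rather than labels.

```python
def select_depiction(gaze_target_inside_vehicle: bool) -> str:
    """Hypothetical sketch of the claimed selection logic.

    When the user's gaze target is inside the first vehicle, the
    visualization depicts a point of interest OUTSIDE it (e.g. part of
    a vehicle's route); when the gaze target is outside, it depicts a
    point of interest INSIDE the vehicle.
    """
    if gaze_target_inside_vehicle:
        return "exterior POI: portion of a vehicle's route"
    return "interior POI: object inside the first vehicle"


print(select_depiction(True))   # gazing at the dashboard -> exterior depiction
print(select_depiction(False))  # gazing out the window -> interior depiction
```

The design intent suggested by the abstract is complementarity: whichever context the user is looking at, the headset overlays information from the other context, reducing the gaze and mental shifts described in the background section.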

Description

Technical Field

[0001] The present disclosure relates generally to augmented reality visualizations.

Background

[0002] Augmented reality (AR) is rapidly changing the way people interact with computer systems and their environments. The technology is expected to broadly impact aerospace and defense. Crews (such as those of commercial aircraft, military aircraft, ships, and ground vehicles) typically maintain situational awareness across two different contexts: information presented primarily on fixed two-dimensional computer displays, and the three-dimensional external environment. Crew members transition between these two contexts by redirecting their gaze or by physically moving between a console and a window. A mental shift can also occur, for example, when a crew member tries to map between two-dimensional graphics and three-dimensional terrain. Size and weight constraints may limit the number of fixed displays, so some crew members may not be able to access information as and when needed.


Application Information

IPC(8): G06F3/01
CPC: G06F3/013; G06F2203/012; B60K35/00; G02B2027/014; B60K35/10; B60K2360/149; B60K35/23; B60K35/28; B60K2360/166; B60K2360/177; B60K35/29; B60K2360/1868; B60K2360/178; B60K2360/191; B60K35/285
Inventors: J·J·邦南, W·J·伍德, T·J·斯特普
Owner THE BOEING CO