4,965 results for "Augmented reality" patented technology

Augmented reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that fulfills three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment). This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one. Augmented reality is related to two largely synonymous terms: mixed reality and computer-mediated reality.
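The "accurate 3D registration of virtual and real objects" mentioned above amounts to mapping virtual geometry into the camera's view so the overlay lines up with the real scene. The sketch below is a minimal illustration of that step using a pinhole camera model; the intrinsics (K) and world-to-camera pose (R, t) are assumed to come from a separate tracking or calibration step, and all numerical values are hypothetical.

```python
# Minimal sketch: project a virtual 3D point into pixel coordinates so a
# rendered overlay registers with the real camera image. K, R, and t are
# assumed inputs from calibration/tracking, not computed here.
import numpy as np

def project_virtual_point(p_world, K, R, t):
    """Map a 3D point in world coordinates to 2D pixel coordinates."""
    p_cam = R @ p_world + t          # world frame -> camera frame
    u, v, w = K @ p_cam              # camera frame -> homogeneous pixel coords
    return np.array([u / w, v / w])  # perspective divide

# Hypothetical 640x480 camera with 500 px focal length, identity rotation,
# and the camera 2 m from the world origin along the optical axis.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])

print(project_virtual_point(np.array([0.1, -0.05, 0.0]), K, R, t))
```

Re-running this projection every frame as the tracked pose changes is what keeps the virtual content anchored to the real environment in real time.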

Systems and methods for collaborative interactive visualization of 3D data sets over a network ("DextroNet")

Exemplary systems and methods are provided by which multiple persons in remote physical locations can collaboratively and interactively visualize a 3D data set substantially simultaneously. In exemplary embodiments of the present invention, there can be, for example, a main workstation and one or more remote workstations connected via a data network. A given main workstation can be, for example, an augmented reality surgical navigation system or a 3D visualization system, and each workstation can have the same 3D data set loaded. Additionally, a given workstation can combine real-time imaging with previously obtained 3D data, such as, for example, real-time or pre-recorded video, or information such as that provided by a managed 3D ultrasound visualization system.

A user at a remote workstation can perform a given diagnostic or therapeutic procedure, such as, for example, surgical navigation or fluoroscopy, or can receive instruction from another user at a main workstation, where the commonly stored 3D data set is used to illustrate the lecture. A user at a main workstation can, for example, see the virtual tools used by each remote user as well as their motions, and each remote user can, for example, see the virtual tool of the main user and its respective effects on the data set at the remote workstation. For example, the remote workstation can display the main workstation's virtual tool operating on the 3D data set at the remote workstation, via a virtual control panel of the local machine, in the same manner as if that virtual tool were a probe associated with the remote workstation. In exemplary embodiments of the present invention, each user's virtual tools can be represented by their IP address, a distinct color, and/or another differentiating designation.

In exemplary embodiments of the present invention, the data network can be either low or high bandwidth. In low-bandwidth embodiments, a 3D data set can be pre-loaded onto each user's workstation, and only the motions of a main user's virtual tool and manipulations of the data set are sent over the network. In high-bandwidth embodiments, real-time images, such as, for example, video, ultrasound, or fluoroscopic images, can also be sent over the network.
Owner: BRACCO IMAGING SPA
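The low-bandwidth mode described in the abstract relies on every workstation holding its own pre-loaded copy of the 3D data set, so only compact virtual-tool updates need to cross the network. The sketch below illustrates one way such an update could be encoded and mirrored on a remote workstation; the message fields, class names, and values are illustrative assumptions, not the patent's actual wire format.

```python
# Minimal sketch of the low-bandwidth collaboration mode: the 3D data set is
# assumed pre-loaded everywhere, and only small tool-update messages (pose plus
# an identifying tag such as an IP address or color) are exchanged.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ToolUpdate:
    user_tag: str        # e.g. the sender's IP address or assigned color
    position: tuple      # tool-tip position in the shared data set's frame
    orientation: tuple   # orientation as a quaternion (w, x, y, z)
    action: str          # e.g. "move", "crop", "measure" (hypothetical actions)
    timestamp: float

def encode(update: ToolUpdate) -> bytes:
    """Serialize a tool update into a small JSON payload for the network."""
    return json.dumps(asdict(update)).encode("utf-8")

def apply_update(local_scene: dict, payload: bytes) -> None:
    """Decode a remote user's update and mirror their virtual tool locally."""
    update = ToolUpdate(**json.loads(payload))
    local_scene.setdefault("remote_tools", {})[update.user_tag] = {
        "position": update.position,
        "orientation": update.orientation,
        "last_action": update.action,
    }

# Example: the main workstation broadcasts a move, and a remote workstation
# applies it to its own pre-loaded copy of the 3D data set.
scene_on_remote = {"volume": "pre-loaded 3D data set"}
msg = encode(ToolUpdate("192.168.0.10", (12.5, -3.0, 40.2),
                        (1.0, 0.0, 0.0, 0.0), "move", time.time()))
apply_update(scene_on_remote, msg)
print(scene_on_remote["remote_tools"])
```

In a high-bandwidth embodiment, the same channel could additionally carry real-time image streams (video, ultrasound, fluoroscopy) alongside these tool updates.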