Artificial intelligence enhanced system for adaptive control driven ar/vr visual aids

An adaptive control and visual aid technology, applied in the fields of instruments, user interface execution, and computing models, that addresses problems such as loss of the ability to read, loss of income, and loss of mobility.

Status: Inactive | Publication Date: 2019-01-10
EYEDAPTIC INC
Cites: 2 | Cited by: 18
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

[0006]For people with retinal diseases, adapting to loss of vision becomes a way of life. This impact can affect their life in many ways, including loss of the ability to read, loss of income, loss of mobility and an overall degraded quality of life. However, with prevalent retinal diseases such as AMD (Age-related Macular Degeneration) not all of the vision is lost; the peripheral vision remains intact, as only the central vision is impacted by the degradation of the macula. Given that the peripheral vision remains intact, it is possible to take advantage of eccentric viewing and, through patient adaptation, to increase functionality such as reading. Research has shown that training in eccentric viewing increases reading ability (both accuracy and speed). Eye movement control training and PRL (Preferred Retinal Locus) training were important to achieving these results [1]. Another factor in increasing reading ability for those with reduced vision is the ability to view words in context as opposed to in isolation. Magnification is often used as a simple visual aid with some success. However, with increased magnification comes decreased FOV (Field of View) and therefore an inability to see other words or objects around the word or object of interest. Although it has been shown that isolated word reading can improve with extensive training, eye control was important to this as well [2]. The capability to guide eccentric viewing training and eye movement and fixation training is important to achieving improvement in functionality such as reading. The approaches outlined below describe novel ways to use augmented reality techniques to both automate and improve this training.
[0008]To aid the user in targeting and fixation, certain guide lines can be overlaid on reality or on the incoming image to help guide the user's eye movements along the optimal path. These guidelines can be a plurality of constructs such as, but not limited to, cross-hair targets, bullseye targets, or linear guidelines such as single or parallel dotted lines a fixed or variable distance apart, or a dotted or solid box of varying colors. This enables the user to increase their training and adaptation for eye movement control by following the tracking lines or targets as their eyes move across a scene, in the case of a landscape, picture or video monitor, or across a page in the case of reading text.
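As a rough illustration of how such guides might be rendered, the sketch below overlays a cross-hair target and parallel dotted guide lines on a video frame using OpenCV. The function name, dash pitch, and spacing parameter are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of overlaying reading-guide lines on an incoming camera frame.
# Assumes an OpenCV (cv2) BGR image; names and parameters are illustrative only.
import cv2

def draw_reading_guides(frame, center, spacing_px=60, color=(0, 255, 0)):
    """Overlay a cross-hair target plus parallel dotted guide lines.

    frame      -- BGR image (numpy array) from the camera or video source
    center     -- (x, y) pixel location the user should track
    spacing_px -- vertical distance between the parallel guide lines
    """
    out = frame.copy()
    h, w = out.shape[:2]
    cx, cy = center

    # Cross-hair target at the tracking point.
    cv2.drawMarker(out, (cx, cy), color, markerType=cv2.MARKER_CROSS,
                   markerSize=30, thickness=2)

    # Parallel dotted guide lines above and below the reading path.
    for y in (cy - spacing_px // 2, cy + spacing_px // 2):
        for x in range(0, w, 20):            # 20 px dash pitch gives a dotted look
            cv2.line(out, (x, y), (x + 10, y), color, 2)
    return out
```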
[0012]Also, pupil tracking algorithms can be employed that not only provide eye tracking capability but also utilize a user-customized offset for improved eccentric viewing, whereby the eccentric viewing targets are offset to guide the user to focus on their optimal area for eccentric viewing.
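The sketch below shows one way a user-calibrated eccentric-viewing offset could be applied to a tracked gaze point. The PRLOffset structure, example offset values, and clamping behavior are assumptions for illustration, not the patent's pupil-tracking algorithm.

```python
# Sketch of applying a user-calibrated PRL (Preferred Retinal Locus) offset to a
# tracked gaze point so the guide target lands on the user's best eccentric-viewing
# area. The interface and values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PRLOffset:
    dx_px: int   # horizontal offset from the fovea, in display pixels
    dy_px: int   # vertical offset from the fovea, in display pixels

def eccentric_target(gaze_xy, offset: PRLOffset, display_size):
    """Shift the raw gaze estimate by the user's calibrated PRL offset,
    clamped to the display bounds."""
    x, y = gaze_xy
    w, h = display_size
    tx = min(max(x + offset.dx_px, 0), w - 1)
    ty = min(max(y + offset.dy_px, 0), h - 1)
    return tx, ty

# Example: a user whose PRL sits roughly 120 px left and 40 px above the damaged
# central vision, on a 1280x720 display.
target = eccentric_target((640, 360), PRLOffset(dx_px=-120, dy_px=-40), (1280, 720))
```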
[0013]Further improvements in visual adaptation can be achieved through use of hybrid distortion algorithms. With the layered distortion approach, objects or words on the outskirts of the image can receive a different distortion and provide a look-ahead preview to piece together words for increased reading speed. While the user is focused on the area of interest being manipulated, the words moving into the focus area help provide context in order to interpolate and better understand what is coming, for faster comprehension and contextual understanding.
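A minimal sketch of one such layered distortion is shown below: the focus region is magnified while the surrounding look-ahead area is left untouched, so upcoming words remain visible for context. The zoom factor and region size are illustrative assumptions, not the patent's hybrid distortion algorithm.

```python
# Sketch of a layered ("hybrid") distortion: the focus region is magnified while
# the periphery passes through unchanged, preserving look-ahead context.
import cv2

def layered_magnify(frame, center, roi_half=100, focus_zoom=2.0):
    out = frame.copy()
    h, w = frame.shape[:2]
    cx, cy = center

    # Clip the focus region of interest to the frame bounds.
    x0, x1 = max(cx - roi_half, 0), min(cx + roi_half, w)
    y0, y1 = max(cy - roi_half, 0), min(cy + roi_half, h)
    roi = frame[y0:y1, x0:x1]

    # Magnify the focus region, then center-crop it back to the original ROI size
    # so it can be pasted over the untouched periphery.
    zoomed = cv2.resize(roi, None, fx=focus_zoom, fy=focus_zoom,
                        interpolation=cv2.INTER_LINEAR)
    zh, zw = zoomed.shape[:2]
    oy, ox = (zh - (y1 - y0)) // 2, (zw - (x1 - x0)) // 2
    out[y0:y1, x0:x1] = zoomed[oy:oy + (y1 - y0), ox:ox + (x1 - x0)]
    return out
```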

Problems solved by technology

This impact can affect their life in many ways including loss of the ability to read, loss of income, loss of mobility and an overall degraded quality of life.
However, with increased magnification comes decreased FOV (Field of View) and therefore an inability to see other words or objects around the word or object of interest.


Image

  • Artificial intelligence enhanced system for adaptive control driven ar/vr visual aids


Embodiment Construction

[0018]The present inventors have discovered that low-vision users can, with a user-tuned software set, improve the needed aspects of vision so that functional vision is restored.

[0019]Expressly incorporated by reference as if fully set forth herein are the following: U.S. Provisional Patent Application No. 62/530,286, filed Jul. 9, 2017; U.S. Provisional Patent Application No. 62/530,792, filed Jul. 9, 2017; U.S. Provisional Patent Application No. 62/579,657, filed Oct. 13, 2017; U.S. Provisional Patent Application No. 62/579,798, filed Oct. 13, 2017; Patent Cooperation Treaty Patent Application No. PCT/US17/62421, filed Nov. 17, 2017; U.S. Nonprovisional patent application Ser. No. 15/817,117, filed Nov. 17, 2017; U.S. Provisional Patent Application No. 62/639,347, filed Mar. 6, 2018; U.S. Nonprovisional patent application Ser. No. 15/918,884, filed Mar. 12, 2018; and U.S. Provisional Patent Application No. 62/677,463, filed May 29, 2018.

[0020]It is contemplated that the proces...



Abstract

Interactive systems using adaptive control software and hardware, from known and later-developed eye-pieces and head-wear to lenses, including implantable, temporarily insertable, contact, and related film-based types of lenses, including thin-film transparent elements for housing camera lenses, projectors, and functionally equivalent processing tools. Simple controls, real-time updates, and instant feedback allow implicit optimization of a universal model while managing complexity.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
[0001]This application claims the benefit of and priority to U.S. Provisional Patent Applications Ser. Nos. 62/530,286 and 62/530,792, filed July 2017, the content of each of which is incorporated herein by reference in its entirety, along with full reservation of all Paris Convention rights.
BACKGROUND OF THE DISCLOSURES
[0002]The Interactive Augmented Reality (AR) Visual Aid invention described below is intended for users with visual impairments that impact field of vision (FOV). These may take the form of age-related macular degeneration, retinitis pigmentosa, diabetic retinopathy, Stargardt's disease, and other diseases where damage to part of the retina impairs vision. The invention described is novel because it not only supplies algorithms to enhance vision, but also provides simple but powerful controls and a structured process that allows the user to adjust those algorithms.
[0003]The basic hardware is constructed from a non-invasive, ...
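As a hedged sketch of what a "structured process that allows the user to adjust those algorithms" could look like in software, the example below models a small user-tunable profile whose enhancement parameters are stepped up or down from simple controls. The parameter names, ranges, and step sizes are assumptions, not the patent's actual control scheme.

```python
# Sketch of a user-tunable settings profile behind simple controls: a handful of
# enhancement parameters the wearer can nudge in real time. Values are illustrative.
from dataclasses import dataclass

@dataclass
class VisionAidProfile:
    magnification: float = 1.5   # zoom applied to the focus region
    contrast_gain: float = 1.2   # contrast enhancement multiplier
    guide_spacing: int = 60      # spacing of reading guide lines, in pixels

    def adjust(self, param: str, step: int):
        """Step one parameter up (+1) or down (-1), with simple clamping."""
        if param == "magnification":
            self.magnification = min(max(self.magnification + 0.25 * step, 1.0), 8.0)
        elif param == "contrast_gain":
            self.contrast_gain = min(max(self.contrast_gain + 0.1 * step, 0.5), 3.0)
        elif param == "guide_spacing":
            self.guide_spacing = min(max(self.guide_spacing + 10 * step, 20), 200)

# Example: one press of a "zoom in" control updates the profile in real time.
profile = VisionAidProfile()
profile.adjust("magnification", +1)
```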

Claims


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): G06T19/00, G06F9/451, G06F3/16, G06F3/01
CPC: G06T19/006, G06F3/012, G06F3/167, G06F9/453, G06F3/013, G06N3/006, G06N20/00, G06N5/04
Inventor: KIM, BRIAN; WATOLA, DAVID A.; CORMIER, JAY E.
Owner: EYEDAPTIC INC