
Unified framework for precise vision-aided navigation

A vision-aided navigation technology based on a unified framework, applied in the field of visual odometry. It addresses the problems that most available navigation systems fail to function efficiently, that GPS cannot work reliably when satellite signals are blocked or unavailable, and that location alone offers insufficient assistance to users; the effect is to reduce or eliminate the accumulation of navigation drift and to improve visual odometry performance.

Active Publication Date: 2012-05-08
Assignee: SRI INTERNATIONAL
Cites: 5 · Cited by: 64

AI Technical Summary

Benefits of technology

The present invention provides a system and method for real-time capture of visual sensor data and the ability to accurately locate an object in a GPS-denied or GPS-challenged environment. The system uses video cameras and other sensors to track features in the environment and estimate the position and orientation of the sensor system in 3D. The system can also integrate information from secondary sensors like IMUs and GPS units to improve localization accuracy. The invention also includes a landmark recognition system that can identify salient landmarks in the environment and match them against a database of previously identified landmarks. The system can record detailed information and imagery about the environment and assist in planning and implementing future navigation tasks.
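
To make the visual-odometry step concrete, here is a minimal sketch of one standard way to estimate frame-to-frame camera motion from tracked features, using OpenCV. It is a single-camera illustration rather than the patent's multi-camera method, and the intrinsic matrix `K`, the detector parameters, and the RANSAC thresholds are all assumptions.

```python
# Minimal monocular visual-odometry sketch (illustrative only; not the
# patent's multi-camera implementation). Assumes OpenCV and a calibrated
# camera whose 3x3 intrinsic matrix K is known.
import cv2
import numpy as np

def relative_pose(prev_gray, curr_gray, K):
    """Estimate the rotation R and translation direction t between frames."""
    # Detect corner features in the previous frame.
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=8)
    # Track them into the current frame with pyramidal Lucas-Kanade flow.
    pts_curr, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      pts_prev, None)
    good = status.ravel() == 1
    p0, p1 = pts_prev[good], pts_curr[good]
    # Robustly fit the essential matrix, then decompose it into R and t.
    # For a single camera, t is recoverable only up to an unknown scale.
    E, inlier_mask = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC,
                                          prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p0, p1, K, mask=inlier_mask)
    return R, t
```

A multi-camera configuration can resolve this scale ambiguity and reject more outliers by triangulating features seen by more than one camera, which is one motivation for the multi-camera framework described here.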

Problems solved by technology

However, most available navigation systems do not function efficiently and fail frequently under certain circumstances.
GPS in particular cannot work reliably once satellite signals are blocked or unavailable, as in "GPS-denied" environments such as indoors, forests, and urban areas.
Even when operating properly, GPS provides only the user's location, which by itself is not sufficient to assist the user during navigation.
In addition, conventional systems that include multiple cameras or other visual sensing devices offer only a limited ability to perform comprehensive visual odometry.



Embodiment Construction

[0034] The present invention relates to vision-based navigation systems and methods for determining location and navigational information related to a user and/or other object of interest. An overall system and functional flow diagram according to an embodiment of the present invention is shown in FIG. 4. The systems and methods of the present invention provide for the real-time capture of visual data using a multi-camera framework and multi-camera visual odometry (described below in detail with reference to FIGS. 3, 5, and 6); integration of visual odometry with secondary measurement sensors (e.g., an inertial measurement unit (IMU) and/or a GPS unit) (described in detail with reference to FIG. 3); and global landmark recognition including landmark extraction, landmark matching, and landmark database management and searching (described in detail with reference to FIGS. 3, 7, 8, and 9).
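
As a rough illustration of integrating video-based pose estimates with a secondary sensor, the sketch below fuses visual-odometry translation increments with absolute GPS fixes in a linear Kalman filter. It is a deliberate simplification (position-only state, identity measurement model) and not the filter described in the patent; all noise values are assumptions.

```python
# Simplified fusion of visual-odometry increments with GPS fixes via a
# linear Kalman filter. A full system would track 6-DoF pose plus IMU
# biases in an EKF; all sigma values here are illustrative assumptions.
import numpy as np

class PoseFuser:
    def __init__(self, p0, sigma0=1.0):
        self.p = np.asarray(p0, dtype=float)   # position estimate (metres)
        self.P = np.eye(3) * sigma0**2         # position covariance

    def predict(self, vo_delta, vo_sigma=0.05):
        """Propagate the state with a visual-odometry translation increment."""
        self.p = self.p + np.asarray(vo_delta, dtype=float)
        self.P = self.P + np.eye(3) * vo_sigma**2  # uncertainty grows each step

    def update_gps(self, gps_pos, gps_sigma=3.0):
        """Correct with an absolute GPS position when a fix is available."""
        R = np.eye(3) * gps_sigma**2
        K = self.P @ np.linalg.inv(self.P + R)     # Kalman gain (H = I)
        self.p = self.p + K @ (np.asarray(gps_pos, dtype=float) - self.p)
        self.P = (np.eye(3) - K) @ self.P
```

Between fixes the covariance grows with every visual-odometry step, mirroring the drift accumulation that an absolute measurement, or a recognized landmark, can bound.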

[0035] As shown in FIG. 4, the vision-based navigation system 1 (herein referred to as the "Navigation S...


Abstract

A system and method for efficiently locating an object of interest in 3D within a target scene using video information captured by a plurality of cameras. The system and method provide for multi-camera visual odometry wherein pose estimates for each camera are generated using all of the cameras in the multi-camera configuration. Furthermore, the system and method can locate and identify salient landmarks in the target scene using any of the cameras in the multi-camera configuration and compare each identified landmark against a database of previously identified landmarks. In addition, the system and method provide for the integration of video-based pose estimates with position measurement data captured by one or more secondary measurement sensors, such as, for example, Inertial Measurement Units (IMUs) and Global Positioning System (GPS) units.
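
As an illustration of the landmark extraction-and-matching step, the following sketch compares a query image's feature descriptors against a small landmark database. ORB features and brute-force Hamming matching are stand-ins (the patent's landmark representation is not specified here), and both thresholds are illustrative assumptions.

```python
# Hedged sketch of landmark recognition: extract descriptors from a query
# image and vote for the database landmark with the most good matches.
# Not the patent's method; thresholds and feature choice are assumptions.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def extract_landmark(gray_image):
    """Return (keypoints, binary descriptors) for one grayscale image."""
    return orb.detectAndCompute(gray_image, None)

def best_landmark(query_desc, database):
    """database: list of (landmark_id, descriptors). Returns the id with
    the most close descriptor matches, or None if nothing matches well."""
    best_id, best_score = None, 0
    for landmark_id, desc in database:
        matches = matcher.match(query_desc, desc)
        # Count only matches below an assumed Hamming-distance threshold.
        score = sum(1 for m in matches if m.distance < 40)
        if score > best_score:
            best_id, best_score = landmark_id, score
    return best_id if best_score >= 20 else None  # minimum-match gate, illustrative
```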

Description

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of U.S. Provisional Application No. 60/868,227, filed on Dec. 1, 2006. U.S. Provisional Application No. 60/868,227 is incorporated by reference herein.

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

[0002] The U.S. Government has a paid-up license in this invention and the right in limited circumstances to require the patent owner to license others on reasonable terms as provided for by the terms of Contract No. M67854-05-C-8006 awarded by the Office of Naval Research.

FIELD OF THE INVENTION

[0003] The present invention relates generally to visual odometry, particularly to a system and method for enhanced route visualization in mission planning applications.

BACKGROUND OF THE INVENTION

[0004] Precise and efficient navigation systems are very important for many applications involving location identification, route planning, autonomous robot navigation, unknown environment map building, etc. Howeve...


Application Information

Patent Type & Authority: Patent (United States)
IPC (8): H04N7/00, H04N7/18, G01C22/00, G01C21/30, H04N13/243
CPC: G01C21/005, G06T7/0044, G06K9/3216, G01C21/165, G06T7/74, H04N13/239, H04N13/243, G06V10/245, G01C21/1656, G06V20/56, G06F18/22, H04N2013/0081
Inventors: SAMARASEKERA, SUPUN; KUMAR, RAKESH; OSKIPER, TARAGAY; ZHU, ZHIWEI; NARODITSKY, OLEG; SAWHNEY, HARPREET
Owner: SRI INTERNATIONAL