
Adaptive switching between a vision aided inertial camera pose estimation and a vision based only camera pose estimation

A pose-determination and vision technology, applied in the field of pose determination, that addresses problems such as the poor performance of VINS relative to vision-only methods and unpredictable degradation under external conditions

Active Publication Date: 2015-02-18
QUALCOMM INC

AI Technical Summary

Problems solved by technology

Nevertheless, in some cases VINS methods perform poorly compared to vision-only methods. Furthermore, the performance of a VINS method may degrade based on conditions external to the mobile device, so the degradation may be unpredictable.



Detailed Description of Embodiments

[0019] Figure 1 illustrates a mobile device 100 capable of detecting and tracking a target 101 by adaptively switching between a VINS approach when the target 101 is stationary and a vision-only approach when the target is moving. In some implementations, when movement of the target 101 is detected, the mobile device 100 may substantially reduce the contribution of the inertial sensor measurements rather than eliminate it entirely.
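The adaptive weighting described above can be sketched as a simple blend of the two pose estimates. This is a hypothetical illustration, not the patent's actual implementation: poses are represented as 6-vectors (translation plus small-angle rotation), and `alpha_moving`/`alpha_static` are assumed weighting parameters.

```python
import numpy as np

def fuse_pose(vision_pose, ins_pose, target_moving,
              alpha_moving=0.0, alpha_static=0.5):
    """Blend a vision-only pose with an inertial (INS) pose.

    Hypothetical sketch: when the target is moving, the inertial
    contribution is reduced (eliminated entirely when alpha_moving == 0),
    because inertial sensors measure only the device's own motion and
    cannot account for the target moving independently.
    """
    alpha = alpha_moving if target_moving else alpha_static
    vision = np.asarray(vision_pose, dtype=float)
    ins = np.asarray(ins_pose, dtype=float)
    return (1.0 - alpha) * vision + alpha * ins
```

With `alpha_moving=0.0`, a moving target yields the vision-only pose unchanged, matching the "completely eliminate the contribution" case; a small nonzero `alpha_moving` corresponds to the "substantially reduce" variant.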

[0020] As used herein, a mobile device refers to any portable electronic device, such as a cellular or other wireless communication device, personal communication system (PCS) device, personal navigation device (PND), personal information manager (PIM), personal digital assistant (PDA), or other suitable mobile device, including wireless communication devices, computers, laptops, tablets, etc. that are capable of capturing images of the environment and can be used in vision-based tracking or VINS. A mobi...


Abstract

A mobile device tracks a relative pose between a camera and a target using a Vision-aided Inertial Navigation System (VINS), which includes a contribution from inertial sensor measurements and a contribution from vision-based measurements. When the mobile device detects movement of the target, the contribution from the inertial sensor measurements to tracking the relative pose between the camera and the target is reduced or eliminated. Movement of the target may be detected by comparing vision-only measurements from captured images with inertia-based measurements to determine whether a discrepancy exists indicating that the target has moved. Additionally or alternatively, movement of the target may be detected using projections of feature vectors extracted from captured images.
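The discrepancy test described in the abstract can be sketched as follows. This is an illustrative stand-in, not the claimed method: poses are assumed 6-vectors (translation in meters, small-angle rotation in radians), and the thresholds are hypothetical.

```python
import numpy as np

def target_moved(pose_vision, pose_inertial,
                 trans_thresh=0.05, rot_thresh=0.1):
    """Flag target motion when vision-only and inertia-based relative
    poses disagree beyond thresholds (hypothetical units and values).

    For a static target, the two estimates should agree. When the target
    moves, the vision-based relative pose follows the target, while the
    inertial measurements, sensing only device motion, do not, so a
    discrepancy appears.
    """
    pv = np.asarray(pose_vision, dtype=float)
    pi = np.asarray(pose_inertial, dtype=float)
    trans_err = np.linalg.norm(pv[:3] - pi[:3])
    rot_err = np.linalg.norm(pv[3:] - pi[3:])
    return bool(trans_err > trans_thresh or rot_err > rot_thresh)
```

In practice such a detector would typically smooth the discrepancy over several frames to avoid spurious switches caused by measurement noise.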

Description

[0001] Cross Reference to Related Applications [0002] This application claims priority to U.S. Application Serial No. 13/523,634, filed June 14, 2012 and entitled "Adaptive Switching Between Vision Aided INS and Vision Only Pose," which is assigned to the present assignee and incorporated herein by reference. Technical Field [0003] Embodiments of the subject matter described herein relate to pose determination, and more particularly, to using vision-based techniques for pose determination. Background [0004] In augmented reality (AR) applications, the pose (translation and orientation) of the camera relative to the imaged environment is determined and tracked. In a vision-only pose approach, captured images (e.g., video frames) are used to determine and track the pose of the camera relative to feature-rich targets in the environment. For example, the vision-only pose is estimated at each frame, and a statistical model is used to predict the pose at the next ...
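The "statistical model" used to predict the next frame's pose, as mentioned in the background, can be illustrated with a minimal constant-velocity extrapolator. This is a hypothetical sketch; real vision-based trackers more commonly use a Kalman filter, and poses are assumed here to be 6-vectors (translation plus small-angle rotation).

```python
import numpy as np

def predict_next_pose(prev_pose, curr_pose):
    """Constant-velocity prediction of the next-frame camera pose.

    Assumes the pose changed between the previous and current frames at
    a constant rate, and extrapolates one frame ahead. The prediction
    seeds the feature search in the next frame, narrowing the region
    where the tracker looks for the target.
    """
    prev = np.asarray(prev_pose, dtype=float)
    curr = np.asarray(curr_pose, dtype=float)
    return curr + (curr - prev)  # extrapolate one frame ahead
```

The predicted pose is then refined against the actual measurements of the next frame; a Kalman filter generalizes this by also tracking the uncertainty of the prediction.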

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; H04N5/232
CPC: G06T2207/10016; G01S3/786; G06T7/2053; G06T7/254; H04N23/60
Inventors: A. Ramanandan, Christopher Brunner, M. Ramachandran, A. Tyagi, D. Knoblauch, Murali Ramaswamy Chari
Owner QUALCOMM INC