
Robust real-time on-line camera tracking method

A robust, real-time camera tracking technology, applied in image communication, computer components, colour-television components, etc. It addresses problems such as camera shake, image blur, and insufficient robustness, with the effects of avoiding unreliable estimation, reducing errors, and improving time efficiency.

Inactive Publication Date: 2011-05-25
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

However, simultaneous localization and mapping techniques rely on little prior knowledge, which also introduces the robustness problem of camera tracking.
[0004] The robustness problem of traditional camera tracking methods lies mainly in their lack of resistance to three common issues in practical applications: (1) fast camera motion; (2) image blur caused by camera motion; (3) camera jitter.
Problems 1 and 3 essentially stem from the same cause: camera tracking techniques assume continuity between consecutive frames.
In problems 1 and 3, because the camera's motion does not strictly obey the assumed motion model, camera tracking is likely to fail.
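
To make the continuity assumption concrete, below is a minimal sketch (not taken from the patent) of the constant-velocity motion model that conventional trackers use to predict the next pose; under fast motion or jitter the true pose deviates from this prediction, which is the failure mode described above. All names are illustrative.

```python
# A minimal sketch of the constant-velocity motion model behind the
# inter-frame continuity assumption. Poses are 4x4 world-to-camera matrices.
import numpy as np

def predict_pose(pose_prev: np.ndarray, pose_prev2: np.ndarray) -> np.ndarray:
    """Predict the pose of frame k by replaying the motion from frame k-2 to k-1.

    Under fast motion or jitter the true pose deviates from this prediction,
    so the feature search seeded by it fails and tracking is lost.
    """
    last_motion = pose_prev @ np.linalg.inv(pose_prev2)  # relative motion between the last two frames
    return last_motion @ pose_prev                       # extrapolate one more frame
```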




Embodiment Construction

[0023] As shown in Figure 1, the implementation process of the present invention comprises two parts: an initialization phase and a runtime phase.

[0024] The first phase, initialization, comprises five steps: selecting the input images, extracting feature points from the images, matching features between the images, computing the three-dimensional positions of the feature points, and establishing the initial feature-point set.

[0025] Step 1: Select the input images.

[0026] The user selects two frames with similar content as the initial input images, choosing the starting position of the system according to their actual needs, such as the overlay position of a virtual object in an augmented reality application or the starting position of navigation in an autonomous navigation application. The system takes this starting position as the origin of the world coordinate system and establishes the world coordinate system based on the common p...
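
As an illustration only, the five initialization steps can be sketched with off-the-shelf building blocks; the ORB detector, brute-force matcher, and OpenCV functions below are stand-ins chosen for the sketch, not the specific techniques prescribed by the patent.

```python
import cv2
import numpy as np

def initialize_map(img1, img2, K):
    """Sketch of the initialization phase: two input frames -> initial 3D feature-point set.

    K is the 3x3 camera intrinsic matrix; the first frame defines the world
    coordinate system, matching step 1 above.
    """
    # Step 2: extract feature points in both images.
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Step 3: match features between the two images.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Recover the relative pose of the second frame; the first camera is the world origin.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # Step 4: compute the three-dimensional positions of the matched feature points.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    points3d = (pts4d[:3] / pts4d[3]).T

    # Step 5: the initial feature-point set pairs each 3D point with its descriptor
    # so later frames can be matched against it.
    descriptors = np.array([des1[m.queryIdx] for m in matches])
    return points3d, descriptors, (R, t)
```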



Abstract

The invention discloses a robust real-time on-line camera tracking method comprising the following steps: estimating the pose of the camera, and computing and adding key frames. Fast, general feature matching is adopted for camera pose estimation, so the pose of the current frame can be estimated robustly while camera tracking and scene mapping are carried out simultaneously. With this method, a more stable matching result is produced when the scene is large or the camera moves quickly, overcoming the traditional camera tracking methods' dependence on local matching. Moreover, the method processes key frames faster, so more key frames can be included in the scene, which strengthens the tracking algorithm's ability to cope with the tracking failures that easily occur when the camera enters an unknown scene.
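
For illustration, the runtime idea summarized above (robust pose estimation from feature matches, plus key-frame addition) might look like the following sketch; solvePnPRansac, ORB, and the 20% visibility threshold are assumptions made for this sketch, not details taken from the patent.

```python
import cv2
import numpy as np

def estimate_pose(frame, points3d, descriptors, K):
    """Estimate the current frame's pose from 2D-3D matches (illustrative sketch)."""
    orb = cv2.ORB_create(2000)
    kps, des = orb.detectAndCompute(frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, des)  # map descriptors -> current-frame descriptors

    obj_pts = np.float32([points3d[m.queryIdx] for m in matches])
    img_pts = np.float32([kps[m.trainIdx].pt for m in matches])

    # Robust pose estimation: RANSAC discards unreliable matches before solving PnP.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, None)
    n_inliers = 0 if inliers is None else len(inliers)
    return rvec, tvec, n_inliers

def should_add_keyframe(n_inliers, n_map_points, min_ratio=0.2):
    """Add a key frame when too few map points remain visible (illustrative rule)."""
    return n_inliers < min_ratio * n_map_points
```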

Description

Technical field
[0001] The invention belongs to the fields of computer vision and augmented reality, and in particular relates to a real-time camera tracking method in unknown scenes.
Background technique
[0002] The goal of vision-based camera tracking is to estimate the pose of a camera (a 6-degree-of-freedom parameter) relative to its surroundings from an input image sequence or real-time video. It is useful for many other computer vision applications such as 3D reconstruction, video registration, and image enhancement. Traditionally, this problem has been addressed by offline structure-from-motion methods. However, in some practical applications, such as augmented reality and autonomous navigation, a real-time camera pose is a necessary prerequisite. In these cases offline methods cannot meet the efficiency requirements, so online real-time camera tracking has received more attention in recent years.
[0003] In recent years, simultaneous localization and mapping techniq...
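
The "6 degrees of freedom" mentioned in [0002] are three rotation parameters and three translation parameters. A minimal sketch of that parameterization follows; the axis-angle (Rodrigues) choice and the numbers are illustrative, not taken from the patent.

```python
import cv2
import numpy as np

# A 6-DoF camera pose: 3 rotation parameters plus 3 translation parameters.
rvec = np.array([0.01, -0.02, 0.03])  # rotation as an axis-angle (Rodrigues) vector
tvec = np.array([0.10, 0.00, 0.50])   # translation (illustrative values)

R, _ = cv2.Rodrigues(rvec)            # 3x3 rotation matrix from the axis-angle vector

# World-to-camera transform as a 4x4 homogeneous matrix.
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = tvec
print(T)
```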


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N5/232; G06K9/64
Inventor: 梁晓辉, 乐一鸣, 刘洁, 隋秀丽
Owner: BEIHANG UNIV