
Wide-baseline visible light camera pose estimation method

A pose estimation method based on visible light technology, applicable to computing, navigation computing tools, special data processing applications, and related fields. It addresses the difficulty of applying existing methods directly to complex outdoor environments, and achieves robust detection and matching together with accurate calibration results.

Active Publication Date: 2014-12-10
NORTHWESTERN POLYTECHNICAL UNIV
Cites: 2 · Cited by: 64

AI Technical Summary

Problems solved by technology

For UAV autonomous landing systems, methods based on natural-scene image features are affected by unpredictable environmental factors such as weather and lighting, and are therefore difficult to apply directly in complex outdoor environments.

Method used


Image

  • Wide-baseline visible light camera pose estimation method

Examples


Embodiment Construction

[0027] The present invention will now be further described in conjunction with an embodiment:

[0028] 1. Camera internal parameter calibration

[0029] Zhang's calibration method is used to calibrate the internal parameters of each camera in the landing navigation system, namely the focal length, principal point coordinates, skew angle, distortion coefficients, and related parameters. This is a standard method for camera intrinsic calibration and is briefly described as follows: a black-and-white two-dimensional checkerboard is used as the calibration target, and the camera captures the target in 10 to 15 different poses; the resulting images are used for calibration. Corner detection is performed on each calibration image, and the geometric relationships of the checkerboard are used to establish the correspondence of each corner across the images from each viewing angle; because all corners on the checkerb...
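The calibration procedure above relies on fitting a homography between the planar checkerboard and each calibration image; that fitting step can be sketched in plain numpy. This is an illustrative reconstruction, not the patent's implementation (which would typically call a library routine such as OpenCV's calibrateCamera); the `estimate_homography` helper and the synthetic checkerboard data below are assumptions of this sketch.

```python
import numpy as np

def estimate_homography(world_pts, img_pts):
    """Estimate the 3x3 homography mapping planar world points (Z = 0)
    to image points via the Direct Linear Transform (DLT)."""
    A = []
    for (X, Y), (u, v) in zip(world_pts, img_pts):
        # Each correspondence contributes two linear equations in the
        # nine homography entries h1..h9 (stacked row-wise).
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    # The solution is the right singular vector for the smallest
    # singular value of A (the null-space direction of A h = 0).
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so the bottom-right entry is 1

# Synthetic checkerboard: project 6 x 9 inner corners with a known
# homography, then recover it from the correspondences.
H_true = np.array([[800.0, 5.0, 320.0],
                   [3.0, 790.0, 240.0],
                   [1e-4, 2e-4, 1.0]])
world = [(x, y) for x in range(6) for y in range(9)]
img = []
for X, Y in world:
    p = H_true @ np.array([X, Y, 1.0])
    img.append((p[0] / p[2], p[1] / p[2]))

H_est = estimate_homography(world, img)
print(np.max(np.abs(H_est - H_true)))   # near machine precision on exact data
```

With real images the detected corner positions are noisy, so in Zhang's method the linear estimate is only an initialization that is refined by nonlinear minimization of the reprojection error.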



Abstract

The invention relates to a wide-baseline visible light camera pose estimation method. First, the Zhang calibration method is used to calibrate the intrinsic parameters of the cameras with a planar calibration board. Eight datum points on the landing runway are selected within the common field of view of the cameras, and their world coordinates are accurately measured offline with a total station. During calibration, cooperation identification lamps are placed at the datum points, and the camera poses are accurately computed by detecting these lamps. The method takes into account the complex natural-scene characteristics of the UAV landing site and the physical light-sensing characteristics of the cameras, and glare flashlights are designed and used as the cooperation identification lamps for the visible light cameras. The eight datum points are arranged on the landing runway, and their spatial coordinates are measured with the total station to a spatial accuracy on the order of 10^-6 m. The resulting calibration is accurate, with a reprojection error on the image below 0.5 pixel.
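The pose-computation step described in the abstract, recovering each camera's rotation and translation from datum points whose world coordinates are known (from the total station) and whose image positions come from detecting the cooperation lamps, can be illustrated with a linear Direct Linear Transform (DLT) solver. This is a hedged sketch on synthetic data, not the patent's actual algorithm; `dlt_pose`, `rodrigues`, and all numeric values below are assumptions of this illustration.

```python
import numpy as np

def rodrigues(axis, theta):
    """Rotation matrix from an axis-angle pair (Rodrigues' formula)."""
    a = np.asarray(axis, float)
    a /= np.linalg.norm(a)
    Kx = np.array([[0, -a[2], a[1]],
                   [a[2], 0, -a[0]],
                   [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * Kx + (1 - np.cos(theta)) * (Kx @ Kx)

def dlt_pose(K, world_pts, img_pts):
    """Linear pose estimate (DLT) from >= 6 non-coplanar world points
    with known intrinsics K; returns rotation R and translation t."""
    Kinv = np.linalg.inv(K)
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, img_pts):
        x, y, _ = Kinv @ np.array([u, v, 1.0])   # normalized image coords
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z, -x])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)                     # [R|t] up to scale and sign
    P /= np.linalg.norm(P[2, :3])                # third row of R has unit norm
    if P[2] @ np.append(world_pts[0], 1.0) < 0:  # enforce positive depth
        P = -P
    R, t = P[:, :3], P[:, 3]
    U, _, V2t = np.linalg.svd(R)                 # project R onto SO(3)
    return U @ V2t, t

# Synthetic check: project 8 non-coplanar datum points with a known pose,
# then recover that pose from the correspondences.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R_true = rodrigues([0.2, -0.1, 0.3], 0.4)
t_true = np.array([0.1, -0.2, 5.0])
world = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0.5),
         (0, 0, 1), (2, 1, 0.2), (1, 2, 1.5), (2, 2, 0.8)]
img = []
for X in world:
    p = K @ (R_true @ np.asarray(X, float) + t_true)
    img.append((p[0] / p[2], p[1] / p[2]))

R_est, t_est = dlt_pose(K, world, img)
print(np.max(np.abs(R_est - R_true)), np.max(np.abs(t_est - t_true)))
```

On exact synthetic correspondences the linear solve recovers the pose essentially to machine precision; with noisy lamp detections such a linear estimate would normally be refined by nonlinear reprojection-error minimization.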

Description

technical field

[0001] The invention relates to a method for estimating the pose of a camera, and in particular to a method for estimating the pose of wide-baseline visible light cameras in a ground guidance system for the autonomous landing of an unmanned aerial vehicle (UAV).

Background technique

[0002] Vision-based UAV autonomous landing navigation is a new navigation technology that has developed rapidly in recent years. Compared with traditional inertial and GPS-based navigation technologies, it has the advantage of high precision. The ground guidance system for the autonomous landing of UAVs refers specifically to a guidance system in which the measurement cameras are installed on the ground runway. An important task in establishing such a system is to accurately calibrate the internal and external parameters of the cameras under large-scene, wide-baseline conditions.

[0003] The document "Robust Multi-View Camera Calibration for Wide-Baseline Camera Networks, WACV, 2011:321-328"...
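The quality measure used to judge such calibrations, the reprojection error (quoted as below 0.5 pixel in the abstract), is simple to compute once the intrinsic and external parameters are available. A minimal sketch, with an assumed `reprojection_error` helper and synthetic data:

```python
import numpy as np

def reprojection_error(K, R, t, world_pts, img_pts):
    """Mean pixel distance between measured image points and the
    projections of the corresponding world points under (K, R, t)."""
    errs = []
    for X, (u, v) in zip(world_pts, img_pts):
        p = K @ (R @ np.asarray(X, float) + t)
        errs.append(np.hypot(p[0] / p[2] - u, p[1] / p[2] - v))
    return float(np.mean(errs))

# Synthetic check: with a perfectly consistent pose the error is
# numerically zero; real calibrations are judged against thresholds
# such as the 0.5-pixel figure in the abstract.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
world = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)]
img = []
for X in world:
    p = K @ (R @ np.asarray(X, float) + t)
    img.append((p[0] / p[2], p[1] / p[2]))

print(reprojection_error(K, R, t, world, img))  # ~0 for consistent data
```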

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F19/00, G01C11/04, G01C21/20, G01C25/00
Inventor: 张艳宁, 杨涛, 张晓强, 陈挺, 余瑞, 冉令燕, 卓涛
Owner NORTHWESTERN POLYTECHNICAL UNIV