
Aircraft landing pose estimation method based on vision-inertia tight coupling

A pose estimation technology for aircraft, applied in the field of integrated navigation, which can solve problems such as uncertain calculation time, long calculation time, and poor solution accuracy, and achieves the effects of strong robustness, low design and maintenance cost, and high pose accuracy.

Status: Pending; Publication date: 2019-02-15
XIAN AVIATION COMPUTING TECH RES INST OF AVIATION IND CORP OF CHINA

AI Technical Summary

Problems solved by technology

[0002] Traditional aircraft approach and landing navigation typically uses Kalman filtering (KF), the PnP method, or nonlinear optimization (ULO). When the constructed visual measurement model is a nonlinear time-varying system, even the extended Kalman filter (EKF) or particle filter (PF) can hardly improve the filtering accuracy significantly, and the computation time is long. The PnP algorithm is strongly affected by the accuracy of target detection in the image and depends on the number of image features; when the number of image features is small (e.g., fewer than 5), the pose solution accuracy is poor. The nonlinear optimization method searches for the optimal solution through repeated iterations until the objective function reaches its minimum, so its computation time is uncertain and cannot meet the strong real-time requirements of airborne applications.
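For context, the following is a minimal sketch of the conventional PnP baseline criticised above, written with OpenCV's solvePnP; the landmark coordinates, pixel detections and camera intrinsics are illustrative placeholders rather than values from the patent. With only a handful of 2D-3D correspondences, any detection error feeds directly into the solved pose, which is the weakness noted in the paragraph above.

# Hypothetical PnP baseline: solve the camera pose from a few known landmark
# points and their detected pixel locations. All numbers are made up for
# illustration; a real system would use surveyed runway / landing-pad landmarks.
import numpy as np
import cv2

# 3D landmark coordinates in a local landing-area frame (metres), all coplanar.
object_points = np.array([
    [0.0,  0.0,  0.0],
    [30.0, 0.0,  0.0],
    [30.0, 45.0, 0.0],
    [0.0,  45.0, 0.0],
    [15.0, 22.5, 0.0],
], dtype=np.float64)

# Corresponding pixel detections of those landmarks in the camera image.
image_points = np.array([
    [612.4, 388.1],
    [701.9, 385.6],
    [698.3, 512.7],
    [608.8, 515.2],
    [655.1, 450.4],
], dtype=np.float64)

# Pinhole intrinsics (fx, fy, cx, cy); lens distortion is ignored for brevity.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 480.0],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None,
                              flags=cv2.SOLVEPNP_ITERATIVE)
if ok:
    R, _ = cv2.Rodrigues(rvec)                 # landmark frame -> camera frame
    camera_position = (-R.T @ tvec).ravel()    # camera centre in the landmark frame
    print("camera position:", camera_position)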




Embodiment Construction

[0015] As mentioned above, the aircraft landing pose estimation method based on visual-inertial tight coupling of the present invention mainly includes the following processes:

[0016] 1. Framework of aircraft landing pose estimation method based on visual-inertial tight coupling

[0017] A complete vision-aided inertial navigation system includes image sensors, an inertial navigation unit, an onboard database, graphics and image processing components, and a navigation display terminal, and supports pose estimation during the approach and landing phases. The image sensor can be a visible-light camera (VIS), a short-wave infrared camera (SWIR), a long-wave infrared camera (LWIR), or a combination of these, and is used to obtain a downward-looking or forward-downward-looking image; the inertial measurement unit can be an inertial navigation system (INS), an attitude and heading reference system (AHRS), or the like, and is used to obtain the motion state of the aircraft; the onboard database...
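As a rough illustration of how these components might be organised in software, the sketch below defines container types for the image sensors, the inertial state and the landmark database; the class and field names are assumptions made for illustration and are not taken from the patent.

# Hypothetical organisation of the components listed in [0017]; names and field
# choices are illustrative assumptions, not the patent's own design.
from dataclasses import dataclass
from enum import Enum
from typing import Sequence
import numpy as np


class ImageSensorType(Enum):
    VIS = "visible-light camera"
    SWIR = "short-wave infrared camera"
    LWIR = "long-wave infrared camera"


@dataclass
class ImuState:
    """Motion state delivered by the INS / AHRS."""
    position_llh: np.ndarray   # latitude (rad), longitude (rad), height (m)
    velocity_ned: np.ndarray   # north, east, down velocity (m/s)
    attitude_rpy: np.ndarray   # roll, pitch, yaw (rad)


@dataclass
class Landmark:
    """One airport landmark point from the onboard database."""
    name: str
    position_llh: np.ndarray   # geographic coordinates of the landmark


@dataclass
class VisionAidedInsSystem:
    """Top-level container mirroring the components named in [0017]."""
    image_sensors: Sequence[ImageSensorType]   # VIS / SWIR / LWIR or a combination
    latest_imu_state: ImuState                 # from the inertial navigation unit
    landmark_database: Sequence[Landmark]      # onboard database of airport landmarks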



Abstract

The invention provides an aircraft landing pose estimation method based on vision-inertia tight coupling. The method comprises an actual feature point detection process (acquiring video, enhancing the image, detecting the target, and outputting actual feature points), a synthesized feature point generation process (reading the pose parameters of the inertial measurement unit, reading the geographic information of the airport landmark points, and calculating synthesized feature points), and a relative pose solution process (reading the actual and synthesized feature points, reading the geographic information of the airport landmark points, and calculating the relative pose between the aircraft and the landing platform).
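One way to read this three-stage pipeline is sketched below, assuming a grayscale camera, a pinhole projection model and a PnP refinement seeded by the inertial prediction; the detector, the use of cv2.projectPoints / cv2.solvePnP and all function names are illustrative assumptions, since the abstract does not disclose the exact algorithms.

# Hypothetical sketch of the three processes in the abstract; all choices below
# (detector, pinhole model, PnP refinement) are illustrative assumptions.
import numpy as np
import cv2


def detect_actual_features(frame: np.ndarray) -> np.ndarray:
    """Actual feature point detection: enhance the image, detect the landing
    target and output its feature points as an (N, 2) pixel array.
    `frame` is assumed to be a single-channel 8-bit image; corner detection on
    a thresholded mask is only a stand-in for the patent's detector."""
    enhanced = cv2.equalizeHist(frame)                        # simple image enhancement
    _, mask = cv2.threshold(enhanced, 200, 255, cv2.THRESH_BINARY)
    corners = cv2.goodFeaturesToTrack(mask, maxCorners=8,
                                      qualityLevel=0.01, minDistance=10)
    return corners.reshape(-1, 2) if corners is not None else np.empty((0, 2))


def synthesize_features(landmarks_xyz: np.ndarray, imu_rvec: np.ndarray,
                        imu_tvec: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Synthesized feature point generation: project the airport landmark points
    (here already converted to a local Cartesian frame) into the image plane
    using the camera pose predicted from the inertial measurement unit."""
    projected, _ = cv2.projectPoints(landmarks_xyz, imu_rvec, imu_tvec, K, None)
    return projected.reshape(-1, 2)


def solve_relative_pose(actual_pts: np.ndarray, landmarks_xyz: np.ndarray,
                        imu_rvec: np.ndarray, imu_tvec: np.ndarray,
                        K: np.ndarray):
    """Relative pose solution: refine the inertially predicted pose so that the
    landmark projections match the actually detected feature points.
    Assumes actual_pts[i] has already been associated with landmarks_xyz[i]
    (e.g. by nearest-neighbour matching against the synthesized points)."""
    rvec = imu_rvec.astype(np.float64).copy()
    tvec = imu_tvec.astype(np.float64).copy()
    ok, rvec, tvec = cv2.solvePnP(landmarks_xyz, actual_pts, K, None,
                                  rvec, tvec, useExtrinsicGuess=True,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    return (rvec, tvec) if ok else (imu_rvec, imu_tvec)

In this reading, the tight coupling shows up in the last step: the inertial prediction both generates the synthesized feature points and seeds the visual pose refinement, rather than the pose being estimated from the image alone.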

Description

Technical field

[0001] The invention relates to the field of integrated navigation, and in particular to an aircraft landing pose estimation method based on visual-inertial tight coupling.

Background technique

[0002] Traditional aircraft approach and landing navigation typically uses Kalman filtering (KF), the PnP method, or nonlinear optimization (ULO). When the constructed visual measurement model is a nonlinear time-varying system, even the extended Kalman filter (EKF) or particle filter (PF) can hardly improve the filtering accuracy significantly, and the computation time is long. The PnP algorithm is strongly affected by the accuracy of target detection in the image and depends on the number of image features; when the number of image features is small (e.g., fewer than 5), the pose solution accuracy is poor. The nonlinear optimization method searches for the optimal solution through repeated iterations until the objective function reaches its minimum, so its computation time is uncertain and cannot meet the strong real-time requirements of airborne applications.


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G01C21/16G01C11/36
CPCG01C11/00G01C21/165G01C11/36
Inventor 张磊牛文生余冠锋
Owner XIAN AVIATION COMPUTING TECH RES INST OF AVIATION IND CORP OF CHINA