Fixed wing aircraft vision-auxiliary landing navigation method under low visibility

A low-visibility navigation method technology, applied in fields such as navigation, navigation by speed/acceleration measurement, and mapping and navigation, which solves problems such as susceptibility to reflections from the surrounding terrain, GPS signals being easily interfered with or shielded, and low ILS navigation accuracy, and achieves the effects of a remarkable perspective effect under low visibility, low cost, and elimination of accumulated inertial errors.

Active Publication Date: 2019-02-15
XIAN AVIATION COMPUTING TECH RES INST OF AVIATION IND CORP OF CHINA

AI Technical Summary

Problems solved by technology

Among them, the ILS navigation accuracy is low, it is easily affected by reflections from the surrounding terrain, and its software, hardware and maintenance costs are high, so it is not suitable for mountainous airports or general airports.




Embodiment Construction

[0017] The vision-assisted landing navigation method for fixed-wing aircraft under low visibility of the present invention mainly includes the following aspects:

[0018] 1. Visual landing navigation method framework

[0019] The input data of this method come from the airborne inertial measurement unit (IMU), the airborne forward-looking infrared camera (FLIR) and the airborne navigation database; the output data are the corrected position and attitude. The whole algorithm comprises main parts such as video acquisition, runway region-of-interest (ROI) selection, runway detection, runway synthesis, relative pose calculation, visual-inertial fusion and pose correction; see figure 1 for the flow chart. The specific information processing flow is as follows (a sketch of the visual front end is given after this list):

[0020] 1) Infrared video data stream: after the infrared video captured by the FLIR is collected, the ROI is selected from the entire image with the aid of the inertial parameters, and then the image features of the f...
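As a rough illustration of the front end described above (inertia-assisted ROI selection followed by straight-line detection of the runway edge lines), the following is a minimal sketch assuming an OpenCV-based implementation; the helper names, the ROI margin and the edge/Hough thresholds are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch of the inertia-assisted ROI selection and runway line
# detection; not the patent's actual code.
import cv2
import numpy as np

def project_points(world_pts, rvec, tvec, K):
    """Project runway corners (from the navigation database) into the FLIR image
    using the camera pose predicted from the IMU position/attitude."""
    img_pts, _ = cv2.projectPoints(world_pts.astype(np.float32), rvec, tvec, K, None)
    return img_pts.reshape(-1, 2)

def select_roi(frame, predicted_corners, margin=40):
    """Inertia-assisted ROI selection: bound the predicted runway corners and
    expand by a margin to tolerate inertial drift."""
    x0, y0 = np.floor(predicted_corners.min(axis=0)).astype(int) - margin
    x1, y1 = np.ceil(predicted_corners.max(axis=0)).astype(int) + margin
    h, w = frame.shape[:2]
    x0, y0 = max(x0, 0), max(y0, 0)
    x1, y1 = min(x1, w), min(y1, h)
    return (x0, y0, x1, y1), frame[y0:y1, x0:x1]

def detect_runway_lines(roi_gray):
    """Straight-line detection inside the ROI (candidate edge lines of the runway)."""
    edges = cv2.Canny(roi_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=60, minLineLength=80, maxLineGap=10)
    return [] if lines is None else lines.reshape(-1, 4)
```

The detected edge lines would then be intersected to obtain the pixel coordinates of the four runway corner points used in the subsequent relative pose calculation.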


Abstract

The present disclosure provides a vision-assisted landing navigation method for fixed-wing aircraft under low visibility. The method comprises a visual feature extraction process, a relative position and attitude solution process, and a visual-inertial fusion process. The visual feature extraction process comprises the following steps: receiving position and attitude parameters output by an inertial measurement unit, collecting 2D images, extracting runway areas from the 2D images, performing straight-line detection in the runway areas, extracting the edge lines of the runway areas, and calculating the pixel coordinates of the four corner points of the runway. The relative position and attitude solution process comprises the following steps: inputting the geographic coordinates and pixel coordinates of the four corner points of the runway, and calculating the position and attitude of the camera relative to the runway in combination with the internal parameters of the camera. The visual-inertial fusion process comprises the following steps: defining the system states, establishing an inertial error transfer equation, obtaining measurement information, and performing nonlinear Kalman filtering to correct the position and attitude parameters.
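As a concrete illustration of the relative position and attitude solution described in the abstract, the sketch below solves a planar PnP problem from the four runway corner correspondences and the camera intrinsics. This is a hypothetical OpenCV-based example; the patent does not name a specific PnP solver, and the coordinate conventions shown are assumptions.

```python
# Hypothetical sketch of the relative position/attitude solution from the four
# runway corners; not the patent's actual code.
import cv2
import numpy as np

def camera_pose_relative_to_runway(runway_corners_m, corner_pixels, K, dist=None):
    """Solve the camera pose relative to the runway from the four corner
    correspondences (runway frame -> pixel coordinates) and the intrinsics K.
    Assumes the corners are expressed in a runway-fixed frame with z = 0
    (coplanar), which the IPPE solver requires."""
    obj = np.asarray(runway_corners_m, dtype=np.float32).reshape(-1, 3)  # runway frame, metres
    img = np.asarray(corner_pixels, dtype=np.float32).reshape(-1, 2)     # pixel coordinates
    ok, rvec, tvec = cv2.solvePnP(obj, img, K, dist, flags=cv2.SOLVEPNP_IPPE)
    if not ok:
        raise RuntimeError("PnP solution failed")
    R, _ = cv2.Rodrigues(rvec)               # rotation: runway frame -> camera frame
    cam_pos_runway = (-R.T @ tvec).ravel()   # camera position expressed in the runway frame
    return R, cam_pos_runway
```

The resulting relative position and attitude would then serve as the measurement that the nonlinear Kalman filter uses, together with the inertial error transfer equation, to correct the inertially propagated position and attitude in the visual-inertial fusion process.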

Description

Technical field
[0001] The invention relates to a landing navigation method, and in particular to a vision-assisted landing navigation method for fixed-wing aircraft under low visibility.
Background technique
[0002] Currently widely used assisted landing navigation technologies include the instrument landing system (ILS) and inertial/GPS (INS/GPS) integrated navigation. Among them, the ILS navigation accuracy is low, it is easily affected by reflections from the surrounding terrain, and its software, hardware and maintenance costs are high, so it is not suitable for mountainous airports or general airports. Although the accuracy of INS/GPS is high, the GPS signal is easily interfered with or shielded, so its navigation reliability is not high. Aiming at the problems of low precision and poor reliability of existing landing navigation, the invention exploits the infrared camera's remarkable perspective effect under low-visibility conditions and uses image processing technology to extract vis...


Application Information

IPC (8): G01C21/20; G01C21/16; G01C11/36
CPC: G01C11/00; G01C21/005; G01C21/165; G01C21/20; G01C11/36
Inventors: 张磊 (Zhang Lei), 牛文生 (Niu Wensheng), 刘硕 (Liu Shuo), 窦爱萍 (Dou Aiping), 吴志川 (Wu Zhichuan)
Owner: XIAN AVIATION COMPUTING TECH RES INST OF AVIATION IND CORP OF CHINA