Planetary landing image and distance measurement fusion relative navigation method

A relative navigation technology for planetary landing, applied in the field of deep space exploration

Active Publication Date: 2019-01-25
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

The invention can provide technical support and a reference for the navigation scheme design of planetary precision soft-landing missions and help solve related engineering problems.

Embodiment Construction

[0062] In order to better illustrate the purpose and advantages of the present invention, the content of the invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0063] As shown in figure 1, this example addresses a relative optical navigation method for the powered descent of a Mars landing. It combines the measurement information of the optical camera and the three-beam rangefinder and uses an extended Kalman filter for the filtering computation to achieve high-precision navigation during the powered descent (see the filter-cycle sketch following the measurement-model discussion below). The specific implementation of this example is as follows:

[0064] Step 1: Build the measurement model of the optical camera and rangefinder

[0065] The measurement model of the optical camera is shown in formula (1).

[0066]

[0067] In the formula, p_i and l_i are the pixel coordinates of the i-th feature point on the image plane, f is the focal length of the camera, is the three-axis posit...
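Since formula (1) did not survive extraction, the following is a hedged reconstruction of the standard pinhole-camera form that the variable definitions above appear to describe; it is an assumption, not a verbatim copy of the patent's equation:

    p_i = f \frac{x_i}{z_i}, \qquad l_i = f \frac{y_i}{z_i}    (1)

where [x_i, y_i, z_i]^T is assumed to be the three-axis position of the i-th feature point in the camera coordinate frame and f is the focal length.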
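As a rough illustration of the filter cycle mentioned in paragraph [0063], the Python sketch below runs one generic extended Kalman filter predict/update step on a stacked measurement vector of camera pixel coordinates and beam ranges. The state layout, the propagation function f, the measurement function h, and the noise matrices Q and R are illustrative assumptions, not details taken from the patent.

    import numpy as np

    def numerical_jacobian(fun, x, eps=1e-6):
        """Finite-difference Jacobian of fun at x (used for both F and H)."""
        y0 = fun(x)
        J = np.zeros((y0.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (fun(x + dx) - y0) / eps
        return J

    def ekf_step(x, P, f, h, Q, R, z):
        """One EKF cycle: f propagates the lander state over one time step,
        h maps a state to the stacked [pixel coords ..., beam ranges ...] vector,
        z is the actual measurement from the camera and three-beam rangefinder."""
        # Predict
        F = numerical_jacobian(f, x)
        x_pred = f(x)
        P_pred = F @ P @ F.T + Q
        # Update
        H = numerical_jacobian(h, x_pred)
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - h(x_pred))
        P_new = (np.eye(x.size) - K @ H) @ P_pred
        return x_new, P_new

In this example, such a cycle would be repeated at each camera/rangefinder sampling instant during the powered descent.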


Abstract

The invention discloses a planetary landing image and distance measurement fusion relative navigation method, which belongs to the technical field of deep space exploration. The method comprises the following steps: establishing a measurement model of the sensors; solving the position vector of a feature point from the observations of the optical camera and the rangefinder in the measurement model; and constructing a relative navigation system that takes the solved position vector of the feature point as the navigation system observation, then inputting the state equation and observation equation of the relative navigation system into a navigation filter to obtain the position, velocity and attitude of the planetary lander relative to the target landing point, thereby implementing relative optical navigation for planetary landing. The method avoids the dependence of optical navigation on a planetary terrain database and obtains the state information of the lander relative to the target landing point, thereby realizing relative optical navigation for planetary landing. It can provide technical support and a reference for the navigation scheme design of planetary precision soft-landing missions and help solve related engineering problems.
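To illustrate the feature-point solution step summarized in the abstract, the sketch below uses one common geometric construction: fit a local ground plane to the three rangefinder returns and intersect each feature's camera ray with that plane. This particular construction, the assumption of locally planar terrain, and the assumption that the camera and rangefinder share a common frame are illustrative choices, not details given in the abstract.

    import numpy as np

    def feature_positions(pixels, f, beam_dirs, beam_ranges):
        """Assumed geometry: pixels is an (N, 2) array of (p_i, l_i), f is the
        focal length, beam_dirs is a (3, 3) array of unit beam directions and
        beam_ranges holds the three measured ranges, all in the camera frame."""
        # Surface points hit by the three rangefinder beams.
        hits = beam_dirs * beam_ranges[:, None]
        # Local ground plane through the three hit points: n . x = d.
        n = np.cross(hits[1] - hits[0], hits[2] - hits[0])
        n /= np.linalg.norm(n)
        d = float(n @ hits[0])
        # Unit line-of-sight vectors of the features (boresight along +z).
        los = np.column_stack([pixels[:, 0], pixels[:, 1],
                               np.full(len(pixels), f)])
        los /= np.linalg.norm(los, axis=1, keepdims=True)
        # Ray/plane intersection gives each feature's position vector.
        t = d / (los @ n)
        return los * t[:, None]

The resulting position vectors would then serve as the observation fed to the navigation filter, as the abstract describes.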

Description

technical field

[0001] The invention relates to a relative optical navigation method for planetary landing, and belongs to the technical field of deep space exploration.

Background technique

[0002] Optical navigation is a commonly used navigation method for planetary landing. Optical navigation for planetary landing falls into three main categories. The first category is absolute optical navigation. During the landing process, the camera carried by the lander takes images of the landing area and extracts large natural landmarks from the images. Large natural landmarks are terrain features that have already been identified and marked in the planetary terrain database. Using large natural landmarks as the navigation reference, the absolute position, velocity and attitude of the lander in the planet-fixed coordinate system can be obtained. However, due to the low altitude of the lander during planetary landing, the limited field of view of the camera, and the landing ar...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/24
CPC: G01C21/24
Inventor: 崔平远, 秦同, 刘阳, 朱圣英, 徐瑞
Owner: BEIJING INSTITUTE OF TECHNOLOGY