
Autonomous optical navigation method for soft landing for deep space probe

An autonomous optical navigation technology for deep space probe soft landing, applied in the aerospace field, which addresses problems such as high system complexity and the inability of existing methods to support high-precision soft-landing tasks

Inactive Publication Date: 2010-06-30
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to solve the problem that existing autonomous optical navigation technology has high system complexity and cannot support the completion of high-precision soft-landing tasks. Aiming at the autonomy and real-time requirements of deep space soft-landing missions, an autonomous optical navigation method for deep space probe soft landing is proposed.


Embodiment Construction

[0039] In order to better illustrate the purpose and advantages of the present invention, the present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0040] The specific steps of this embodiment are as follows:

[0041] Step 1: read the pixel coordinate p and line coordinate l of the target landing point on the image plane captured by the optical navigation camera, and the distances d_i between the probe and the landing plane along the three laser-rangefinder installation directions.
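
Step 1 reads three range measurements taken along the rangefinder mounting directions, which the abstract describes by an installation azimuth angle and pitch angle per beam. A minimal sketch of how each beam's unit direction vector in the probe body frame might be built from those angles (the angle convention here is an assumption, not specified in this summary):

```python
import numpy as np

def beam_direction(azimuth_rad: float, pitch_rad: float) -> np.ndarray:
    """Unit vector of a rangefinder beam in the probe body frame.
    Assumed convention: azimuth measured in the body x-y plane from +x,
    pitch measured from the x-y plane toward -z (toward the surface)."""
    ca, sa = np.cos(azimuth_rad), np.sin(azimuth_rad)
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    return np.array([cp * ca, cp * sa, -sp])

# Example: three beams 120 degrees apart in azimuth, each pitched
# 70 degrees down toward the landing surface.
dirs = [beam_direction(np.deg2rad(a), np.deg2rad(70.0))
        for a in (0.0, 120.0, 240.0)]
```

Each range measurement d_i then locates one surface point at d_i times the corresponding unit vector.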

[0042] Use the optical navigation camera to track the pre-selected target landing point and obtain the pixel coordinate p and line coordinate l of the target landing point on the image plane captured by the camera. The relationship between the (p, l) coordinates and the position of the target landing point relative to the probe can be expressed by the followin...
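
The projection relation itself is truncated above. As a generic illustration of the kind of relationship involved (an assumed pinhole-camera model with hypothetical parameters, not the patent's actual formula), the mapping from the landing point's camera-frame position to (pixel, line) coordinates can be sketched as:

```python
import numpy as np

# Hypothetical camera parameters (not given in this summary):
F = 0.05               # focal length, metres
PITCH = 1e-5           # pixel pitch on the sensor, metres
P0, L0 = 512.0, 512.0  # principal point in (pixel, line) coordinates

def project(r_cam):
    """Pinhole model: map the landing point's camera-frame position
    r_cam = (x, y, z), with z along the boresight and z > 0 in front
    of the camera, to (pixel, line) image coordinates."""
    x, y, z = r_cam
    return P0 + F * x / (PITCH * z), L0 + F * y / (PITCH * z)

# A point 2 m off-boresight at 100 m range lands 100 pixels
# from the principal point under these assumed parameters.
p, l = project((2.0, 0.0, 100.0))
```

Inverting this relation (given the ranges d_i) is what lets the method recover the probe's position relative to the landing point.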



Abstract

The invention relates to an autonomous optical navigation method for the soft landing of a deep space probe, and belongs to the field of aerospace. The method comprises the following steps: first, read the pixel and line coordinates of the target landing point on the image plane captured by the optical navigation camera, and the distances between the probe and the landing plane along the three laser-rangefinder installation directions; second, determine the attitude of the probe relative to the landing plane of the target celestial body from the three measured distances and the known installation azimuth and pitch angles of the rangefinders; third, determine the position of the probe relative to the target landing point from the measured distances d_i and the pixel and line coordinates of the landing point; finally, perform filter estimation of the position, velocity, attitude and angular velocity of the probe relative to the landing area. The method features high reliability, low cost and strong real-time performance, and can determine the position and attitude of the probe relative to the target landing point with high precision.
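
The second step, recovering the probe's attitude relative to the landing plane from the three range measurements, can be illustrated with a small sketch: the three beam endpoints d_i·u_i all lie on the landing plane, so the plane normal follows from a cross product of two in-plane difference vectors. The beam geometry, orientation convention, and tilt extraction below are illustrative assumptions, not the patent's derivation:

```python
import numpy as np

def landing_plane_attitude(dirs, dists):
    """Unit normal of the landing plane in the probe body frame, plus the
    tilt angle between that normal and the body +z axis. dirs: three unit
    beam direction vectors; dists: the three measured ranges d_i. Each
    endpoint d_i * u_i lies on the landing plane."""
    pts = [d * np.asarray(u, dtype=float) for u, d in zip(dirs, dists)]
    n = np.cross(pts[1] - pts[0], pts[2] - pts[0])
    n /= np.linalg.norm(n)
    if np.dot(n, pts[0]) > 0:  # orient the normal toward the probe
        n = -n
    tilt = np.arccos(np.clip(n[2], -1.0, 1.0))
    return n, tilt

# Example: beams 120 deg apart in azimuth, pitched 70 deg toward the
# surface; equal ranges mean the probe is level over the plane (tilt ~ 0).
az = np.deg2rad([0.0, 120.0, 240.0])
el = np.deg2rad(70.0)
dirs = [np.array([np.cos(el) * np.cos(a), np.cos(el) * np.sin(a), -np.sin(el)])
        for a in az]
n, tilt = landing_plane_attitude(dirs, [100.0, 100.0, 100.0])
```

With the plane normal known in the body frame, the probe's attitude relative to the landing plane follows directly; the subsequent filtering step then fuses these geometric measurements over time.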

Description

Technical Field

[0001] The invention relates to an autonomous navigation method for a deep space probe, in particular to a method for autonomous navigation of a deep space probe using optical information during the soft-landing process, and belongs to the field of aerospace.

Background Technique

[0002] The soft landing of a probe on a target celestial body will be one of the most complex tasks in future deep space exploration. To ensure successful completion of the landing task, the probe must be controlled to reach the target landing point accurately while its relative velocity is zero and its attitude is perpendicular to the target surface. Therefore, the position, velocity and attitude of the probe relative to the target celestial body must be determined accurately. Due to the long communication delay, the traditional deep-space-network-based navigation mode can no longer meet the needs of high-precision soft landing. Due...


Application Information

IPC IPC(8): G01C21/24G01S17/08G01C11/00
Inventor ZHU Shengying, CUI Pingyuan, XU Rui, SHANG Haibin, QIAO Dong
Owner BEIJING INSTITUTE OF TECHNOLOGY