
Non-cooperative target relative navigation motion estimation method and system based on multi-source information fusion

A multi-source information fusion technology for non-cooperative targets, applied in the field of non-cooperative target relative navigation motion estimation.

Active Publication Date: 2018-09-04
BEIJING INST OF SPACECRAFT SYST ENG


Problems solved by technology

Compared with visual cameras, laser scanners offer higher-precision depth information, a large field of view, and simple data processing. However, they can only obtain discrete point cloud data.




Embodiment Construction

[0077] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0078] The basic idea of the present invention is as follows: a non-cooperative target relative navigation motion estimation method and system based on multi-source information fusion. First, a joint calibration method for the vision-laser measurement system is given to obtain the relative pose relationship between the two sensor coordinate systems; next, the image information of the visual camera and the point cloud information of the 3D laser scanner are acquired, and the image information and the point cloud information are fused by interpolation; finally, kinematic parameters are obtained through feature points to complete the 3D motion estimation of the target.
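The first fusion step described above, reprojecting the laser point cloud into the camera coordinate system using the calibrated extrinsics, can be sketched as follows. This is a minimal illustration, not the patent's implementation: the rotation `R`, translation `t`, and pinhole intrinsics `K` are placeholder values standing in for the results of the joint calibration.

```python
import numpy as np

# Hypothetical extrinsics from joint calibration: rotation R and
# translation t mapping laser-frame points into the camera frame.
R = np.eye(3)
t = np.array([0.10, 0.02, 0.05])  # metres (illustrative values)

# Illustrative pinhole intrinsics (fx, fy, cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def reproject_laser_points(points_laser):
    """Transform laser-frame 3D points into the camera frame and
    project them to pixel coordinates; returns (pixels, depths)."""
    points_cam = points_laser @ R.T + t      # laser frame -> camera frame
    depths = points_cam[:, 2]                # laser-measured depth per point
    proj = points_cam @ K.T                  # homogeneous pinhole projection
    pixels = proj[:, :2] / proj[:, 2:3]      # divide by depth -> (u, v)
    return pixels, depths

pts = np.array([[0.5, -0.2, 4.0],
                [0.1,  0.3, 5.0]])
uv, z = reproject_laser_points(pts)
```

The resulting pixel coordinates `uv` tell which image locations the sparse laser depths `z` correspond to, which is the prerequisite for the interpolation-based fusion in the next step.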

[0079] The binocular vision camera is installed on the installation platform to ensure that the spatial position and posture of the binocular ...



Abstract

The invention relates to a non-cooperative target relative navigation motion estimation method and system based on multi-source information fusion. The method comprises the following steps: giving a joint calibration method for the laser and visual measurement system; obtaining the reprojection of the laser scanning point cloud in the camera coordinate system based on the joint calibration; acquiring the depth compensation information of the target point and the compensated depth information by quadratic interpolation, thereby realizing information fusion between the laser scanning point cloud and the vision camera image; and finally obtaining the motion estimation of the target on the basis of the fused information. The method provides a way to fuse the information of a vision camera and a laser scanner: the high-precision depth information of the laser scanning point cloud compensates for the poor accuracy of the visual image in the depth direction, while the low resolution of laser scanning is avoided, so that the high precision of laser depth measurement and the high image resolution of the visual measurement system are integrated. Moreover, the interpolation calculation used for information fusion is a simple algebraic operation, which makes it easy to realize and apply in engineering.

Description

Technical Field

[0001] The invention relates to a non-cooperative target relative navigation motion estimation method and system based on multi-source information fusion, and belongs to the technical field of navigation motion estimation.

Background Technique

[0002] In recent years, with the rapid development of space technology, spacecraft for various mission requirements have been sent into space one after another, and their reliability and safety have received more and more attention. Today, on-orbit servicing has gradually become an important means to prolong the service life of spacecraft, upgrade spacecraft equipment, and clean up space junk, and it is also an important issue to be solved in future space exploration. In particular, on-orbit service technology for the non-cooperative spacecraft developed in recent years has broad application prospects.

[0003] The first problem that needs to be solved in the on-orbit service technology of non-cooperative spacecraft is ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20
CPC: G01C21/20
Inventors: 王大轶, 邓润然, 葛东明, 吕原, 史纪鑫, 朱卫红, 邹元杰, 柳翠翠
Owner: BEIJING INST OF SPACECRAFT SYST ENG