Spatial non-cooperative target pose estimation method based on model and point cloud global matching

A non-cooperative target pose estimation technology, applied in the field of spatial non-cooperative target pose estimation based on model and point cloud global matching. It addresses problems of existing methods, such as the inability to guarantee a consistent target coordinate system, poor reliability, and a large amount of calculation, and achieves a high frame rate, high reliability, and fast estimation.

Inactive Publication Date: 2016-09-28
NANJING UNIV OF SCI & TECH
Cites 2 · Cited by 75

AI Technical Summary

Problems solved by technology

Since this method needs to identify geometric features of non-cooperative targets, and the target's imaging in the field of view may not contain effective geometric features, its reliability cannot be guaranteed.



Examples


Embodiment

[0106] To fully demonstrate that the method can continuously and accurately obtain the target pose and that it adapts to the camera's low resolution and noise, the pose measurement experiment is carried out as follows:

[0107] (1) Experimental initial conditions and parameter settings

[0108] The simulation experiment uses a virtual camera to image the model point cloud and generate a data point cloud. The virtual camera's parameters match the actual SR4000 camera: resolution 144×176, focal length 10 mm, pixel size 0.04 mm. Imaging of the model point cloud starts at 10 m. The pose of the camera point cloud is set as follows: translate 10 m along the Z axis and rotate about the Z axis from -180° to 180° in 10° steps; all other pose components are 0.
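The pose sweep above can be sketched as a set of homogeneous transforms; this is an illustrative reconstruction, not code from the patent, and the helper name `make_pose` is an assumption:

```python
import numpy as np

def make_pose(angle_deg, tz=10.0):
    """4x4 homogeneous transform: rotation about Z by angle_deg, translation tz along Z."""
    a = np.deg2rad(angle_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    T[2, 3] = tz
    return T

# 37 test poses: Z rotation from -180 deg to 180 deg in 10 deg steps, 10 m along Z.
# (The stated 10 mm focal length over a 0.04 mm pixel pitch gives 250 pixels.)
poses = [make_pose(a) for a in range(-180, 181, 10)]
```

Each pose would then be applied to the model point cloud before projecting it through the virtual camera.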

[0109] The error calculation is: the parameters of the camera point cloud generated by simulation are used as the real v...
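A common way to compute the pose error against the simulated ground truth, consistent with the setup above, is to compare the estimated and true transforms directly; this sketch is an assumption about the error metric, since the paragraph is truncated:

```python
import numpy as np

def pose_error(T_true, T_est):
    """Rotation error (degrees) and translation error between two 4x4 poses."""
    # Relative rotation between the true and estimated orientations.
    R_rel = T_true[:3, :3].T @ T_est[:3, :3]
    # Angle of the relative rotation via its trace; clip guards rounding.
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    rot_err_deg = np.degrees(np.arccos(cos_theta))
    # Euclidean distance between the translation components.
    trans_err = np.linalg.norm(T_true[:3, 3] - T_est[:3, 3])
    return rot_err_deg, trans_err
```

For example, an estimate rotated 10° about Z relative to the truth yields a 10° rotation error and zero translation error.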



Abstract

The invention discloses a spatial non-cooperative target pose estimation method based on model and point cloud global matching. The method comprises the following steps: a target scene point cloud is acquired with a depth camera and, after filtering, serves as the data point cloud to be registered, while a three-dimensional distance transform is applied to the target model point cloud; a deblurred principal-direction transform is applied to the initial data point cloud to be registered and the target model point cloud to determine a translation domain; search and registration are carried out over the translation and rotation domains with a global ICP algorithm to obtain the initial transformation matrix from the model coordinate system to the camera coordinate system, i.e., the initial pose of the target; the pose transformation matrix of the previous frame is applied to the data point cloud of the current frame, which is then registered against the model with the ICP algorithm to obtain the pose of the current frame; and the rotation angle and translation amount are extracted from the pose transformation matrix. The method has good noise robustness and can output the target pose in real time; geometric features such as normals and curvatures of the data point cloud need not be computed, registration is fast, and precision is high.
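The frame-to-frame step in the abstract warm-starts ICP with the previous frame's pose transform. A minimal point-to-point ICP sketch of that idea follows; it uses brute-force nearest neighbours and Kabsch alignment, the function names are illustrative, and it deliberately omits the distance transform and global translation/rotation search that the patent's initialization stage describes:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, T_init=None, iters=20):
    """Point-to-point ICP; T_init (e.g. the previous frame's pose) warm-starts it."""
    T = np.eye(4) if T_init is None else T_init.copy()
    for _ in range(iters):
        p = src @ T[:3, :3].T + T[:3, 3]
        # Brute-force nearest neighbours (fine for small demo clouds).
        idx = np.argmin(((p[:, None, :] - dst[None, :, :]) ** 2).sum(-1), axis=1)
        R, t = best_rigid_transform(p, dst[idx])
        Td = np.eye(4)
        Td[:3, :3], Td[:3, 3] = R, t
        T = Td @ T                   # accumulate the incremental correction
    return T
```

With small inter-frame motion, as in the tracking scenario, even an identity warm start recovers the relative pose; in practice a k-d tree would replace the brute-force correspondence search.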

Description

technical field

[0001] The invention relates to a space non-cooperative target pose acquisition technology, and in particular to a space non-cooperative target pose estimation method based on model and point cloud global matching.

Background technique

[0002] The measurement of the relative position and attitude parameters of space objects is one of the key technologies for relative navigation between spacecraft. Commonly used measurement sensors include monocular vision, binocular vision, radar, GPS, and inertial navigation. Vision-based position and attitude measurement has the advantages of simple structure, stable performance, and good anti-interference, and has long been a research hotspot. In the close-range stage, methods based on optical imaging and image processing outperform other methods.

[0003] Space non-cooperative targets generally refer to space spacecraft that cannot provide effective cooperative information, whose s...

Claims


Application Information

IPC(8): G06T7/00
CPC: G06T2207/10028
Inventor: 赵高鹏, 杨滨华, 刘鲁江, 何莉君, 章婷婷, 薄煜明, 王建宇, 曹飞
Owner NANJING UNIV OF SCI & TECH