
Spatial non-cooperative target relative pose estimation method based on deep learning

A technology for estimating the relative pose of spatial non-cooperative targets, applied in the field of deep-learning-based relative pose estimation. It addresses the inability of traditional methods to cope with severe occlusion, illumination changes, and complex environment models, and achieves real-time, autonomous operation with improved model performance and faster inference.

Active Publication Date: 2020-10-30
Applicant: BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0005] To meet the real-time and autonomy requirements of non-cooperative target pose determination and the practical demands of low power consumption and low cost on micro-satellites, and to overcome the inability of traditional visual measurement methods to cope with severe occlusion, changing illumination, and complex model structures, the present invention provides a deep-learning-based method for estimating the relative pose of a spatial non-cooperative target. The method takes synthetic images and real images captured by a monocular camera as input and, by designing a convolutional neural network, obtains the position and attitude of the spatial non-cooperative target in real time, supporting multiple space missions including space capture.
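The following is a minimal sketch of the pipeline described above, written in PyTorch. All module names, layer sizes, and the keypoint count are hypothetical stand-ins added for illustration; the patent text does not disclose its network definitions at this level of detail.

```python
# Hypothetical sketch of the two-stage pipeline: a detection network that
# finds the target's 2D bounding box, and a keypoint network that regresses
# keypoints plus a unit rotation quaternion. Layer sizes are placeholders.
import torch
import torch.nn as nn

class DetectionNet(nn.Module):
    """Stage 1 stand-in: predict a 2D bounding box (x1, y1, x2, y2)."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 4))

    def forward(self, img):
        return self.backbone(img)

class KeypointNet(nn.Module):
    """Stage 2 stand-in: regress 2D keypoints and a rotation quaternion."""
    def __init__(self, n_kpts: int = 8):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.kpt_head = nn.Linear(16, n_kpts * 2)   # (u, v) per keypoint
        self.quat_head = nn.Linear(16, 4)           # raw quaternion

    def forward(self, crop):
        feat = self.backbone(crop)
        quat = self.quat_head(feat)
        # Normalise so the regressed attitude is a valid unit quaternion.
        return self.kpt_head(feat), quat / quat.norm(dim=-1, keepdim=True)

img = torch.randn(1, 3, 256, 256)   # a synthetic or monocular-camera image
box = DetectionNet()(img)           # 2D bounding box around the target
kpts, quat = KeypointNet()(img)     # keypoints + attitude quaternion
```

In the method as described, the detection stage would crop the image around the predicted bounding box before the keypoint stage; the sketch feeds the full image only to keep the example short.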




Embodiment Construction

[0058] The present invention is further described below in conjunction with the accompanying drawings and examples. It should be understood that the following examples are intended to facilitate understanding of the present invention and do not limit it in any way.

[0059] The invention relates to a deep-learning-based method for estimating the relative pose of a spatial non-cooperative target. The method takes synthetic images and images captured by a camera as input and, by designing a convolutional neural network, obtains the position and attitude of the spatial non-cooperative target, supporting multiple space missions including space capture. The present invention mainly includes the following steps: first, considering the lack of public data sets for spatial image pose estimation, a three-dimensional model of the non-cooperative target is constructed with 3D modeling software, and a data set of the non-cooperative target is generated and divided into training ...
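The data-set step can be pictured with the short sketch below: random relative poses are sampled, each pose labels one rendering of the 3D model, and the samples are split into training and test sets. The sampling ranges and the 80/20 split are illustrative assumptions, not values from the patent.

```python
# Hypothetical pose sampling for a synthetic data set. Each (quaternion,
# position) pair would label one image rendered from the 3D target model.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Uniform random unit quaternions (Shoemake's method) for target attitude.
u1, u2, u3 = rng.random((3, n))
quats = np.stack([
    np.sqrt(1 - u1) * np.sin(2 * np.pi * u2),
    np.sqrt(1 - u1) * np.cos(2 * np.pi * u2),
    np.sqrt(u1) * np.sin(2 * np.pi * u3),
    np.sqrt(u1) * np.cos(2 * np.pi * u3),
], axis=1)

# Relative positions within an assumed operating range of 5-50 m depth.
positions = rng.uniform([-5.0, -5.0, 5.0], [5.0, 5.0, 50.0], size=(n, 3))

# Divide the labelled samples into training and test sets (assumed 80/20).
idx = rng.permutation(n)
train_idx, test_idx = idx[: int(0.8 * n)], idx[int(0.8 * n):]
```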



Abstract

The invention discloses a deep-learning-based method for estimating the relative pose of a spatial non-cooperative target. The method comprises the steps of: building a data set of the non-cooperative target with software and augmenting it; designing a target detection network based on a convolutional neural network that judges whether the target is non-cooperative and detects a 2D bounding box around it; segmenting the non-cooperative target within the 2D bounding box, locating its center by Hough voting, estimating the depth from the center to the camera, and converting the 2D pixel coordinates into 3D coordinates using the depth value; designing a key point extraction network that extracts key points such as corner points and three-axis end points of the non-cooperative target and regresses them to a rotation quaternion representing rotation; and fine-tuning the estimated relative pose through iterative optimization. The method can handle severe occlusion, sensor noise, and the low estimation accuracy caused by object symmetry, while meeting real-time speed requirements.
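The 2D-to-3D conversion mentioned in the abstract is a standard pinhole-camera back-projection: given the target-centre pixel and its estimated depth, the camera-frame position follows from the intrinsics. A minimal sketch, with example intrinsic values (fx, fy, cx, cy) that are assumptions rather than figures from the patent:

```python
# Back-project a pixel (u, v) with depth z to 3D camera-frame coordinates
# under the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
import numpy as np

def backproject(u: float, v: float, z: float,
                fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Example: centre pixel (352, 240) whose depth is estimated at 12.5 m,
# with assumed intrinsics for a 640x480 camera.
center_3d = backproject(352.0, 240.0, 12.5,
                        fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(center_3d)  # [0.667, 0.0, 12.5] -- target centre in the camera frame
```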

Description

Technical Field

[0001] The invention belongs to the field of spacecraft navigation, and in particular relates to a deep-learning-based method for estimating the relative pose of a spatial non-cooperative target.

Background Technique

[0002] Attitude determination means determining the attitude of a body with star sensors and gyroscopes. Most current missions involve the maintenance of failed spacecraft and the capture of out-of-control spacecraft, whose research objects are spatial non-cooperative targets. Such targets often tumble rapidly and uncontrollably in space, so their position and attitude must be obtained despite unknown appearance characteristics, no response, and no markers. Images of the non-cooperative target are collected by a monocular camera, providing a large number of real-time images, and the collected image information is then used to estimate the structure and motion of the target. The existing...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/70G06T7/11G06T17/00G06F30/15G06F30/27G06N3/04
CPCG06T7/70G06T7/11G06T17/00G06F30/15G06F30/27G06N3/045Y02T90/00Y02T10/40
Inventor 胡庆雷郇文秀郑建英郭雷
Owner BEIHANG UNIV