
Spatial non-cooperative target pose estimation method based on deep learning

A pose estimation technology for spatial non-cooperative targets, applied in neural learning methods, computation, computer components, etc. It addresses problems such as large attitude errors and small data sets, and achieves the effect of meeting real-time requirements.

Active Publication Date: 2021-04-13
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0005] Thaweerath Phisannupawong et al. represented spacecraft attitude with quaternions and used GoogLeNet as the backbone network to estimate the attitude. This method requires a smaller data set, but its attitude error is larger.



Examples


Embodiment 1

[0087] In this embodiment, the spatial non-cooperative target is a spacecraft with a known three-dimensional model, and pose estimation is performed on the target spacecraft. The implementation steps are as follows:

[0088] (1) Divide the pose category intervals, and produce the training set, verification set, and test set.

[0089] (1-1) Divide the attitude category intervals: set the angle classification interval threshold δ = 6° and divide attitude categories at 6° intervals, with estimated minimum angle θ_min = -30° and estimated maximum angle θ_max = 30°. The angle interval [-33°, -27°) is attitude category 0, [-27°, -21°) is category 1, [-21°, -15°) is category 2, [-15°, -9°) is category 3, [-9°, -3°) is category 4, [-3°, 3°) is category 5, [3°, 9°) is category 6, and [9°, 15°) is attitude ca...
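The interval division in step (1-1) maps an angle to a category index. A minimal sketch of that mapping, assuming δ = 6° and intervals starting at -33° as listed above (the function name and range check are illustrative, not from the patent):

```python
import math

DELTA = 6.0    # angle classification interval threshold, degrees
LOWER = -33.0  # left edge of attitude category 0: [-33, -27)

def attitude_category(theta_deg: float) -> int:
    """Return the attitude category index for an angle in [-33, 33)."""
    if not (LOWER <= theta_deg < -LOWER):
        raise ValueError("angle outside the estimated range")
    # Category k covers [LOWER + k*DELTA, LOWER + (k+1)*DELTA)
    return math.floor((theta_deg - LOWER) / DELTA)
```

For example, θ_min = -30° falls in category 0, 0° in category 5, and 10° in category 7, consistent with the intervals above.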



Abstract

The invention discloses a spatial non-cooperative target pose estimation method based on deep learning. The method comprises the steps of:
  1. dividing pose category intervals, generating spatial non-cooperative target image data, marking pose category labels, pose numerical-value labels, and position labels, and obtaining a labeled data set of the spatial non-cooperative target comprising a training set, a test set, and a verification set;
  2. constructing a neural network for spatial non-cooperative target pose estimation based on the AlexNet network, removing the fully connected layers at the end of the network, and then connecting four fully connected layers in parallel;
  3. designing the loss functions of the four branches;
  4. inputting the training set and the verification set into the constructed neural network, training the network with the designed loss functions, and saving the neural network model when the loss function converges to a global minimum;
  5. performing pose estimation on the spatial non-cooperative target with the trained neural network model.
According to the invention, pose estimation of a spatial non-cooperative target can be realized through a single camera and a single image.
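The four-branch head described in step 2 above can be sketched as follows. This is a minimal illustration using NumPy, assuming a flattened AlexNet convolutional feature vector of size 9216 (256×6×6); the branch names and output sizes are hypothetical, since the patent text does not specify them:

```python
import numpy as np

rng = np.random.default_rng(0)

FEAT_DIM = 9216   # assumption: flattened AlexNet conv output (256*6*6)
N_CLASSES = 11    # assumption: one category per 6-degree interval

def linear(x, w, b):
    # One fully connected layer: y = x @ W + b
    return x @ w + b

def make_head(out_dim):
    # Small random weights stand in for trained parameters.
    return rng.standard_normal((FEAT_DIM, out_dim)) * 0.01, np.zeros(out_dim)

# Four fully connected branches attached in parallel to the shared
# backbone features; output sizes are illustrative, not from the patent:
heads = {
    "pose_category": make_head(N_CLASSES),  # classification branch
    "pose_value": make_head(3),             # regression of attitude angles
    "position": make_head(3),               # regression of target position
    "auxiliary": make_head(1),              # placeholder fourth branch
}

features = rng.standard_normal(FEAT_DIM)    # stand-in for backbone output
outputs = {name: linear(features, w, b) for name, (w, b) in heads.items()}
```

The parallel structure means each branch sees the same shared feature vector and is trained with its own loss (step 3), rather than the branches being stacked sequentially.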

Description

Technical field

[0001] The invention belongs to the technical field of spatial non-cooperative target pose estimation, and in particular relates to a deep-learning-based method for spatial non-cooperative target pose estimation.

Background technique

[0002] With the development of space technology, on-orbit servicing has become an important means of ensuring the stable operation of spacecraft in complex space environments. When on-orbit servicing performs close-range operations such as docking and maintenance, relative attitude measurement of space targets is one of the key technologies to be solved. Space targets can be divided into two types: cooperative targets and non-cooperative targets. Cooperative targets can communicate by radio or carry cooperative markers that help determine attitude, while non-cooperative targets cannot provide cooperative information. In recent years, with the increase of inactive satellites in low Earth orbit and spa...

Claims


Application Information

IPC(8): G06K9/62; G06T7/73; G06N3/04; G06N3/08
CPC: G06T7/75; G06N3/084; G06T2207/20081; G06T2207/20084; G06N3/045; G06F18/2415
Inventor: 佘浩平, 杨兴昊, 李海超, 宋建梅
Owner: BEIJING INSTITUTE OF TECHNOLOGY