
3D Pose Estimation Method of Space Object Based on Image Sequence

A technology for estimating the three-dimensional attitude of space objects, applicable to image analysis, image enhancement, and image data processing, with the effect of simplifying the estimation task and achieving good results.

Publication Date: 2021-09-28 (Inactive)
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0008] The present invention aims to solve the problem that the prior art cannot quickly determine the three-dimensional attitude of a space target using a method with strong anti-interference capability and good robustness while being subject to little interference, and proposes a three-dimensional pose estimation method for space targets based on image sequences.



Examples


Specific Embodiment 1

[0034] Specific Embodiment 1: This embodiment is described with reference to Figure 2. The method for estimating the three-dimensional pose of a space object based on image sequences described in this embodiment proceeds as follows:

[0035] Step 1: Preprocess the observation image.

[0036] Step 2: Acquire images of the three-dimensional attitude of the space target to obtain a target matching image library.

[0037] Step 3: Use the scale-invariant feature (SIFT) algorithm to match the observation image preprocessed in Step 1 against the images in the target matching image library from Step 2, and screen for the most similar image.

[0038] Step 4: Reversely calculate the three-dimensional attitude parameter values of the space object and output its attitude angles.

[0039] This embodiment uses the feature detection method based on invariant technol... proposed by Professor David G. Lowe of the University of British Columbia.
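The patent itself contains no source code. The following is a minimal sketch of how the Step 3 matching and the Step 4 attitude read-out could be implemented with OpenCV's SIFT; the 0.75 ratio threshold, the brute-force matcher, and all function names are illustrative assumptions, not values taken from the patent.

```python
import cv2


def good_match_count(desc_obs, desc_lib, ratio=0.75):
    """Count SIFT matches that pass Lowe's ratio test between two descriptor sets."""
    if desc_obs is None or desc_lib is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(desc_obs, desc_lib, k=2)
    return sum(1 for p in pairs
               if len(p) == 2 and p[0].distance < ratio * p[1].distance)


def estimate_attitude(observed_gray, library):
    """library: list of (image, (yaw, pitch, roll)); return the attitude of the best match."""
    sift = cv2.SIFT_create()
    _, desc_obs = sift.detectAndCompute(observed_gray, None)
    best_angles, best_score = None, -1
    for lib_img, angles in library:
        _, desc_lib = sift.detectAndCompute(lib_img, None)
        score = good_match_count(desc_obs, desc_lib)
        if score > best_score:
            best_score, best_angles = score, angles
    return best_angles
```

In this sketch, the Step 4 "reverse calculation" reduces to reading back the attitude parameters stored with the most similar library image, which is consistent with paragraph [0044] below stating that each library image can independently determine the attitude of the target.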

Specific Embodiment 2

[0040] Specific Embodiment 2: This embodiment further describes Specific Embodiment 1. The preprocessing of the observation image in Step 1 consists of noise reduction or enhancement processing of the observation image.

[0041] In this embodiment, the purpose of preprocessing is to obtain a better feature point extraction result when the SIFT algorithm is applied.
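As a concrete illustration of this step, here is a minimal preprocessing sketch assuming OpenCV; Gaussian denoising and CLAHE contrast enhancement are example choices, since the patent only states that noise reduction or enhancement is applied.

```python
import cv2


def preprocess_observation(path):
    """Denoise and enhance an observation image before SIFT feature extraction."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(path)
    denoised = cv2.GaussianBlur(gray, (5, 5), 0)                 # noise reduction
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))  # local contrast enhancement
    return clahe.apply(denoised)
```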

Specific Embodiment 3

[0042] Specific Embodiment 3: This embodiment further describes Specific Embodiment 1. The specific process of obtaining the target matching image library described in Step 2 is:

[0043] Obtain a 3D model of the space target, define the initial three-dimensional attitude parameter values (0, 0, 0) for the 3D model, vary the attitude parameter values from this initial setting to construct a model library, and thereby obtain the target matching image library.

[0044] In this embodiment, each image in the target matching image library carries the three-dimensional attitude parameter information of the space object, so each image can independently determine the three-dimensional attitude parameters of the space object in its corresponding state.
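A minimal sketch of how such a library could be generated, assuming the 3D model can be rendered at arbitrary attitudes; the caller-supplied render_fn, the 10-degree sampling step, and the (yaw, pitch, roll) ordering are assumptions and are not specified in the patent.

```python
import itertools


def build_match_library(render_fn, step_deg=10):
    """Build the target matching image library by sampling attitude angles from (0, 0, 0).

    render_fn(yaw, pitch, roll) -> image is a caller-supplied renderer of the
    target's 3D model (for example, an offscreen render of the CAD model).
    Each entry keeps the rendered image together with its attitude parameters.
    """
    angles = range(0, 360, step_deg)
    return [(render_fn(yaw, pitch, roll), (yaw, pitch, roll))
            for yaw, pitch, roll in itertools.product(angles, angles, angles)]
```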


Abstract

The image sequence-based three-dimensional attitude estimation method for space objects belongs to the field of three-dimensional attitude estimation and monitoring of space objects. The invention solves the problem that the prior art cannot quickly determine the three-dimensional pose of a space object using a method with strong anti-interference capability and good robustness while being subject to little interference. The specific process of the invention is: Step 1, preprocess the observation image; Step 2, acquire images of the three-dimensional attitude of the space object to obtain a target matching image library; Step 3, use the scale-invariant feature algorithm to match the observation image preprocessed in Step 1 against the images in the target matching image library from Step 2, and screen for the most similar image; Step 4, reversely calculate the three-dimensional attitude parameter values of the space object and output its attitude angles. The invention is used for three-dimensional pose estimation and monitoring of space targets.

Description

Technical Field

[0001] The invention relates to a method for estimating the three-dimensional attitude of a space target, and belongs to the field of three-dimensional attitude estimation and monitoring of space targets.

Background Technique

[0002] The pose of a 3D object intuitively reflects its characteristics, so the 3D pose, as one of the important characteristics of a 3D object, has long been a research focus of researchers at home and abroad. If the three-dimensional attitude of a space target can be judged, its purpose can be roughly inferred and the target can be classified well. In recent years, with the rise of human space exploration activities, the frequency of space operations has become higher and higher, and space activities place increasingly high requirements on target control accuracy. After detecting a space target, how to judge whether it is a useful target or whether it is harmful to ourselves is very i...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/73
CPC: G06T2207/10016; G06T7/74
Inventors: 张亚洲, 张海莹, 周楠, 武京
Owner: HARBIN INST OF TECH