
Free floating target capturing method based on 3D vision and imitation learning

A target-capture and vision technology, applied in the field of space robots, that addresses problems such as poor precision, high cost, and the gap between research and practical application, achieving high autonomy, a simple implementation, and improved capture speed and accuracy

Active Publication Date: 2020-06-23
WUHAN UNIV
Cites: 12 | Cited by: 8

AI Technical Summary

Problems solved by technology

[0003] At present, most on-orbit robotic capture is performed by teleoperation or with on-orbit assistance from astronauts. These approaches have low autonomy, and the time delay in the control signal introduces instability, resulting in poor accuracy and high cost.
[0004] Autonomous space robots integrate multiple sensors, such as vision and force sensing, and can replace astronauts or ground operators in control and capture operations. All major spacefaring nations are therefore researching and demonstrating the key technologies of on-orbit capture by space robots, but most current research results remain at the conceptual design stage, still far from practical application.

Method used




Embodiment Construction

[0057] The free-floating target capture method provided by the present invention will be described in detail below in conjunction with the accompanying drawings; the description explains the present invention rather than limiting it.

[0058] First, the principle of the method of the present invention is introduced; it comprises the following steps:

[0059] Step 1: Vision-based pose estimation, providing real-time feedback of the position and attitude of the target;
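Step 1 recovers the target pose from depth images. As a minimal, hypothetical sketch (not the patent's implementation), the pinhole back-projection below converts one depth pixel into a 3D point in the camera frame; the function name and the intrinsics `fx, fy, cx, cy` are illustrative values, not taken from the patent.

```python
import numpy as np

def depth_pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a depth-image pixel (u, v) with depth in metres
    into a 3D point in the camera frame, using the pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical intrinsics for a depth camera:
p = depth_pixel_to_point(u=320, v=240, depth_m=1.5,
                         fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(p)  # pixel at the principal point lies on the optical axis: [0. 0. 1.5]
```

A full pose (position and attitude) would additionally require matching several such points against a model of the capture target, which the patent handles in its vision pipeline.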

[0060] Step 2: Trajectory prediction, dynamically predicting the position and attitude trajectory over a future time window from the historical position and attitude trajectory;
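The abstract names Kalman filtering as the estimator behind Step 2. As a hedged, one-dimensional sketch only (the patent's actual state vector, dimensions, and noise settings are not given here, so every name and parameter below is illustrative), a constant-velocity Kalman filter can absorb the position history and then roll the state forward to predict future positions:

```python
import numpy as np

def predict_trajectory(history, dt, n_future):
    """Run a constant-velocity Kalman filter over a 1-D position history,
    then propagate the state n_future steps ahead with no measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: [pos, vel]
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = 1e-4 * np.eye(2)                    # process-noise covariance (assumed)
    R = np.array([[1e-2]])                  # measurement-noise covariance (assumed)
    x = np.array([history[0], 0.0])         # initial state estimate
    P = np.eye(2)                           # initial state covariance
    for z in history[1:]:
        # predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # update step with measurement z
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    # pure prediction of the future trajectory
    future = []
    for _ in range(n_future):
        x = F @ x
        future.append(x[0])
    return future

# Target drifting at roughly 0.1 m per step:
hist = [0.0, 0.1, 0.2, 0.3, 0.4]
print(predict_trajectory(hist, dt=1.0, n_future=3))  # approaches 0.5, 0.6, 0.7
```

The same recursion extends to full 6-DOF pose by enlarging the state vector; attitude prediction typically needs extra care (e.g. quaternion handling), which this sketch omits.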

[0061] Step 3: Trajectory planning based on imitation learning: collect human capture data, build a skill model, transfer it to the robot, and determine the appropriate capture timing according to the trajectory predicted in Step 2.
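Step 3 builds a skill model from human demonstrations. As a loose illustration only, the sketch below fits a demonstrated 1-D reach profile with Gaussian radial basis functions, in the spirit of dynamic movement primitives; the patent does not specify this particular model, and every function and parameter here is hypothetical.

```python
import numpy as np

def fit_skill(demo, n_basis=10):
    """Fit a demonstrated trajectory (positions over normalised time
    s in [0, 1]) with Gaussian radial basis functions -- a minimal
    skill model in the spirit of dynamic movement primitives."""
    s = np.linspace(0.0, 1.0, len(demo))
    centres = np.linspace(0.0, 1.0, n_basis)
    width = float(n_basis ** 2)             # basis sharpness (assumed)
    Phi = np.exp(-width * (s[:, None] - centres[None, :]) ** 2)
    w, *_ = np.linalg.lstsq(Phi, np.asarray(demo), rcond=None)
    return centres, width, w

def replay(centres, width, w, n_steps=50):
    """Reproduce the learned profile at an arbitrary time resolution."""
    s = np.linspace(0.0, 1.0, n_steps)
    Phi = np.exp(-width * (s[:, None] - centres[None, :]) ** 2)
    return Phi @ w

demo = np.sin(np.linspace(0, np.pi / 2, 100))   # demonstrated reach profile
c, h, w = fit_skill(demo)
traj = replay(c, h, w, n_steps=100)
print(np.max(np.abs(traj - demo)))              # small reconstruction error
```

Transferring such a model to the robot would further require retargeting the learned profile onto the manipulator's joint or Cartesian space and triggering it at the capture instant chosen from the Step 2 prediction.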

[0062] Optionally, the capture system is based on the ROS (Robot Operating System) platform…



Abstract

The invention discloses a free-floating target capture method based on 3D vision and imitation learning. The method comprises three parts: first, the position and attitude of the target capture point are acquired in real time with a depth camera; second, the motion state is estimated from historical data with a Kalman filter, and the free-floating trajectory over a future time window is predicted from the motion information; finally, the capture timing and the pose of the capture point at that moment are determined, human capture data are collected to build a skill model, and the skill model is transferred to the robot to perform the capture. The pose estimation, trajectory prediction, and manipulator trajectory planning in this process are all accomplished on the basis of vision and imitation learning, achieving autonomous capture of a free-floating target and meeting capture requirements well.

Description

technical field [0001] The invention relates to the field of space robots, and in particular to a free-floating target capture method based on 3D vision and imitation learning. Background technique [0002] The pace of human development and utilization of outer space is constantly accelerating. The world launches 80 to 130 satellites every year, and about 10% of them fail; the resulting debris occupies precious orbital resources. Satellites that have completed their scheduled missions or exhausted their fuel, but whose main components still work normally, also constitute a serious safety hazard for other spacecraft in orbit. Using space robots to complete tasks such as orbital debris cleaning, on-orbit maintenance and upgrades, and space station assembly plays an important role in improving the utilization of space orbits, reducing costs, recovering economic losses, prolonging the life of spacecraft, improving on-orbit performance, and improving economic benefits.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20; G06T7/50; G06T7/70; G06T7/90; G06T17/00; G06N20/00
CPC: G06T17/00; G06T7/50; G06T7/90; G06T7/70; G06N20/00; G06T7/20
Inventor: 肖晓晖, 张勇, 赵尚宇, 汤自林
Owner WUHAN UNIV