
Autonomous motion planning method for large space manipulator based on multi-channel vision fusion

A motion planning and autonomous movement technology, applied in the field of manned spaceflight, that solves the difficulty of target capture and achieves stable, accurate, and precisely positioned capture.

Active Publication Date: 2013-06-19
BEIJING INST OF SPACECRAFT SYST ENG
0 Cites · 9 Cited by

AI Technical Summary

Problems solved by technology

[0004] Vision-based autonomous motion planning has been researched, but, on the one hand, there is no research targeting space applications. On the other hand, existing vision-based autonomous planning mostly uses single-channel vision (monocular or binocular), which is limited to target recognition within a small range. For a large space manipulator with a total length of more than ten meters to complete autonomous movement and autonomous capture, it needs capabilities such as wide-range target recognition, recognition of moving targets, and precise close-range positioning; these tasks cannot be completed with single-channel vision.



Examples


Embodiment Construction

[0017] The multi-channel vision system of the present invention includes a global vision system, a local vision system, and a wrist vision system. The global vision system is installed on the cabin base; the local vision system is installed on the manipulator body, for example on a joint or an arm link; and the wrist vision system is installed at the end of the manipulator. Both the local vision system and the wrist vision system move with the manipulator body. Each channel consists of a camera, an image compression module, and a pose information processing module; the global, local, and wrist cameras differ in their fields of view. These components are mature technologies.

[0018] In the present invention, the focal length of the global camera in the global vision system is 8 mm. When the distance to the target is within [5 m, 20 m], the position measurement accuracy is better than 120 mm and the attitude measurement accuracy is better than 5°. The focal length of the local camera is 20 mm. ...
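The camera parameters given in this embodiment can be collected in a small data structure. This is an illustrative sketch only; the field names are not from the patent, and values the excerpt does not state (the local camera's working range and accuracy) are deliberately left as `None` rather than guessed.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class CameraSpec:
    """Per-channel camera parameters as stated in paragraph [0018].

    Fields set to None are not given in the excerpt; all names here
    are illustrative, not taken from the patent text.
    """
    focal_length_mm: float
    working_range_m: Optional[Tuple[float, float]]  # (min, max) target distance
    position_accuracy_mm: Optional[float]           # worst-case position error
    attitude_accuracy_deg: Optional[float]          # worst-case attitude error


# Global camera: 8 mm focal length, valid for targets 5-20 m away,
# position error < 120 mm, attitude error < 5 degrees.
GLOBAL_CAMERA = CameraSpec(8.0, (5.0, 20.0), 120.0, 5.0)

# Local camera: only the 20 mm focal length is stated in the excerpt.
LOCAL_CAMERA = CameraSpec(20.0, None, None, None)
```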



Abstract

The invention provides an autonomous motion planning method for a large-scale space manipulator based on multi-channel vision fusion. In the fusion, the global vision system has lower priority than the local vision system, and the local vision system has lower priority than the wrist vision system. When only the global vision system can obtain the pose of the target object, and neither the local nor the wrist vision system can, motion planning uses the target pose obtained by the global vision system. As the manipulator body moves, once the local vision system can obtain the target pose, motion planning uses the local vision system's pose information until the manipulator body reaches the tracking midpoint. When the manipulator body reaches the tracking midpoint, the wrist vision system can by then obtain the target pose, and motion planning switches to the pose information obtained by the wrist vision system. The invention realizes autonomous movement, autonomous tracking, and autonomous capture of target objects over a large spatial range.
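The priority rule described in the abstract (wrist > local > global, with the wrist channel trusted only after the tracking midpoint) can be sketched as a simple selection function. This is a minimal illustration under assumed names; the patent does not give an implementation, and the `Channel`, `VisionReading`, and `select_pose` identifiers are hypothetical.

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import List, Optional, Tuple

# A pose as (x, y, z, roll, pitch, yaw); the representation is assumed.
Pose = Tuple[float, float, float, float, float, float]


class Channel(IntEnum):
    GLOBAL = 0  # lowest priority, widest field of view
    LOCAL = 1
    WRIST = 2   # highest priority, used for close-range capture


@dataclass
class VisionReading:
    channel: Channel
    pose: Optional[Pose]  # None when this channel cannot see the target


def select_pose(readings: List[VisionReading], reached_tracking_midpoint: bool):
    """Pick the pose source for motion planning per the abstract's rule:
    prefer wrist (only after the tracking midpoint), then local, then global.
    Returns (channel, pose), or (None, None) if no channel sees the target.
    """
    available = {r.channel: r.pose for r in readings if r.pose is not None}
    if reached_tracking_midpoint and Channel.WRIST in available:
        return Channel.WRIST, available[Channel.WRIST]
    if Channel.LOCAL in available:
        return Channel.LOCAL, available[Channel.LOCAL]
    if Channel.GLOBAL in available:
        return Channel.GLOBAL, available[Channel.GLOBAL]
    return None, None
```

For example, when only the global channel sees the target, planning falls back to the global pose; once the manipulator passes the tracking midpoint and the wrist channel has the target, the wrist pose takes over.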

Description

Technical Field

[0001] The invention relates to an autonomous motion planning method for a large-scale space manipulator based on multi-channel vision fusion, and belongs to the field of manned spaceflight.

Background Technique

[0002] With the rapid development of space science and technology, a large number of space tasks urgently need to be completed, such as assembly and maintenance of the space station, assisting astronauts on extravehicular activities, and sampling the lunar surface. Completing these tasks is inseparable from the robotic arm as a necessary tool, so robotic arm technology for space applications has become a new research field and a hot research topic. Internationally, research on space manipulators began relatively early and has been successfully applied; for example, the space shuttle manipulator developed by Canada and the manipulators of the International Space Station have successfully comple...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): B25J9/16
Inventors: 肖涛, 王耀兵, 胡成威, 姜水清, 史文华, 金俨, 张晓东
Owner: BEIJING INST OF SPACECRAFT SYST ENG