Device and method for active grasping of manipulator based on multimodal fusion

A grasping-device and manipulator technology, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It addresses problems such as the lack of an automatic real-time interaction and learning process, the difficulty of grasping, and sensor disturbance, and achieves improved positioning and active grasping capability, avoidance of strong-light interference, and a higher grasping success rate.

Active Publication Date: 2020-07-14
SHANGHAI JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Traditional space activities rely on preset equipment instructions, direct operation by space station staff, or remote operation by ground staff. The lack of an automatic real-time interaction and learning process with the environment makes it difficult to accomplish complex tasks such as grasping moving objects in a microgravity environment.
Existing research on automatically grasping moving objects in a microgravity environment mainly focuses on tactile perception combined with passive compliance mechanisms to absorb the impact of a moving object during grasping and thereby improve the success rate and reliability of the grasp. There are few studies that fuse tactile, visual, and other multimodal information to realize active grasping by a manipulator, yet the correlation and complementarity among the sensor modalities are of great significance for improving grasping efficiency and robustness.


Image

Three drawing sheets accompany the patent, each titled "Device and method for active grasping of manipulator based on multimodal fusion".


Embodiment Construction

[0034] The present invention is described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit it in any form. It should be noted that those of ordinary skill in the art can make several changes and improvements without departing from the concept of the present invention; these all fall within the protection scope of the present invention.

[0035] The invention addresses the problem that a binocular vision system has difficulty accurately acquiring information about the moving object to be grasped under adverse space environmental factors such as poor lighting and electromagnetic fields. By introducing a lidar, it monitors surrounding objects in a microgravity environment in real time, and it uses a recurrent neural network with long short-term memory (the RNN-LSTM algorithm) to fuse the information of radar images and v...
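The patent does not disclose the network details, so the following is only a minimal sketch of what RNN-LSTM fusion of lidar and vision features could look like: per-frame feature vectors from the two modalities are concatenated (early fusion) and fed through a single hand-written LSTM cell. All dimensions, weights, and the fusion scheme are illustrative assumptions, not the patented method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FusionLSTMCell:
    """Minimal LSTM cell that fuses per-frame lidar and vision features
    by concatenating them into one input vector (hypothetical sizes)."""

    def __init__(self, lidar_dim, vision_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        in_dim = lidar_dim + vision_dim
        # One stacked weight matrix covering the input, forget,
        # candidate, and output gates.
        self.W = rng.standard_normal((4 * hidden_dim, in_dim + hidden_dim)) * 0.1
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, lidar_feat, vision_feat, h, c):
        x = np.concatenate([lidar_feat, vision_feat])   # early fusion
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_dim
        i = sigmoid(z[0:H])        # input gate
        f = sigmoid(z[H:2 * H])    # forget gate
        g = np.tanh(z[2 * H:3 * H])  # candidate cell state
        o = sigmoid(z[3 * H:4 * H])  # output gate
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

# Fuse a short sequence of simulated lidar/vision feature frames.
cell = FusionLSTMCell(lidar_dim=8, vision_dim=16, hidden_dim=32)
h = np.zeros(32)
c = np.zeros(32)
rng = np.random.default_rng(1)
for _ in range(5):
    h, c = cell.step(rng.random(8), rng.random(16), h, c)
print(h.shape)  # fused state summarizing the sequence: (32,)
```

In a real system the fused hidden state `h` would feed a downstream head that regresses the target pose; here it is simply printed.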



Abstract

The invention provides a manipulator active grasping device and method based on multimodal fusion. The device comprises a base (1), a mechanical arm (2), a lidar (3), a binocular vision system (4), and a mechanical gripper (5); the lidar (3) and one end of the mechanical arm (2) are each fixedly mounted on the base (1), and the binocular vision system (4) and the mechanical gripper (5) are each fixedly mounted at the other end of the mechanical arm. The method comprises the following steps: step 1, sensing the object to be grasped to obtain sensing information; step 2, locating the object to be grasped according to the sensing information to obtain positioning information; and step 3, grasping the object according to the positioning information. The device and method fully consider the complex environment of space operation, effectively improve the ability to grasp moving objects, and have broad application prospects.

Description

Technical field [0001] The present invention relates to the technical field of space robot positioning and grasping, in particular to a manipulator active grasping device and method based on multimodal fusion, and more particularly to robot positioning and active grasping technology for a microgravity environment that combines CMOS-camera binocular vision, lidar, and tactile perception. Background technique [0002] At present, major countries around the world are accelerating development in the aerospace field, and life science experiments and space operations for space exploration are becoming increasingly numerous. Traditional space activities rely on preset equipment instructions, direct operation by space station staff, or remote operation by ground staff; the lack of automatic real-time interaction and learning with the environment makes it difficult to accomplish complex tasks such as grasping moving objects in a microgravity environment. The existing resea...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16
CPC: B25J9/1679; B25J9/1694; B25J9/1697
Inventors: 王伟明, 马进, 薛腾, 韩鸣朔, 刘文海, 潘震宇, 邵全全
Owner: SHANGHAI JIAOTONG UNIV