
Mechanical arm autonomous moving grabbing method based on visual-tactile fusion under complex illumination condition

A technology involving complex lighting conditions and autonomous movement, applied to manipulators, program-controlled manipulators, manufacturing tools, and related fields. It solves the problem that the precise three-dimensional spatial position of a target object cannot otherwise be obtained, prevents instability of the robot body, and improves object positioning accuracy and grasping accuracy.

Active Publication Date: 2021-11-26
SOUTHEAST UNIV
Cites: 7 | Cited by: 8

AI Technical Summary

Problems solved by technology

[0005] In order to solve the above problems, the present invention discloses a method for autonomous movement and grasping by a robotic arm based on visual-tactile fusion under complex lighting conditions. The method solves the problem that existing technology cannot obtain the precise three-dimensional spatial position of a target object, thereby enabling precise operation. At the same time, the introduction of multi-sensor modules improves the robotic arm's perception of, and compliant interaction with, the external environment.




Embodiment Construction

[0069] The present invention will be further explained below in conjunction with the accompanying drawings and specific embodiments. It should be understood that the following specific embodiments are only used to illustrate the present invention and are not intended to limit the scope of the present invention.

[0070] Embodiments of the present invention provide a visual-tactile fusion-based method for autonomous movement and grasping by a robotic arm under complex lighting conditions. The overall system block diagram is shown in Figure 1, and the system includes:

[0071] The communication module is used to transmit control instructions, images, and pose information. It comprises an upper computer system installed on the remote console and an industrial computer system installed on the mobile platform; the two systems are connected over WiFi and communicate via the SSH protocol. The upper computer sends control instructions to the industrial computer, and the industrial computer...
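The patent text is truncated here and gives no implementation details for the instruction channel. Below is a minimal sketch of how the upper computer might send one control instruction to the industrial computer over the WiFi/SSH link described above, assuming a Python environment with the paramiko library; the host address, credentials, and command string are hypothetical placeholders, not taken from the patent.

```python
import paramiko

# Upper computer (remote console) side: open an SSH session to the
# industrial computer on the mobile platform over the shared WiFi link.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("192.168.1.10", username="robot", password="secret")  # hypothetical host/credentials

# Send one control instruction and read back the industrial computer's
# response; the command itself is illustrative only.
stdin, stdout, stderr = client.exec_command("arm_ctl --move-to target")
print(stdout.read().decode())

client.close()
```

In practice the same link would also carry the images and pose information that the communication module is described as transmitting.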



Abstract

The invention discloses a mechanical arm autonomous moving and grabbing method based on visual-tactile fusion under complex illumination conditions. The method mainly comprises approach control toward the target position and feedback control based on environmental information. Under complex illumination conditions, weighted fusion is performed on the visible-light and depth images of a pre-selected area, recognition and localization of the target object are completed by a deep neural network, and the mobile mechanical arm is driven to continuously approach the target object. In addition, the pose of the mechanical arm is adjusted according to contact force information from the sensor module, the external environment, and the target object. Meanwhile, visual and tactile information about the target object is fused, and the optimal grasping pose and an appropriate grasping force are selected. By means of this method, object localization precision and grasping accuracy are improved, collision damage and instability of the mechanical arm are effectively prevented, and harmful deformation of the grasped object is reduced.
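The abstract does not specify the fusion rule; the following is a minimal sketch of one plausible reading of the weighted-fusion step, assuming OpenCV is available and that the depth map has already been normalized to an 8-bit image. The function name and weighting value are illustrative assumptions, not taken from the patent.

```python
import cv2

def fuse_rgb_depth(rgb, depth, alpha=0.6):
    """Weighted fusion of a visible-light image and a depth image.

    rgb   -- HxWx3 uint8 visible-light image of the pre-selected area
    depth -- HxW   uint8 depth map, normalized to 0-255
    alpha -- weight on the visible-light image; under poor illumination
             a smaller alpha shifts weight toward depth (0.6 is an
             illustrative default, not a value from the patent)
    """
    depth_bgr = cv2.cvtColor(depth, cv2.COLOR_GRAY2BGR)  # match channel count
    return cv2.addWeighted(rgb, alpha, depth_bgr, 1.0 - alpha, 0.0)

# Hypothetical usage: favor depth in a dim scene, then pass the fused
# image to the detection network for recognition and localization.
# fused = fuse_rgb_depth(rgb_frame, depth_frame, alpha=0.4)
```

A fused image of this kind would then feed the deep neural network that performs recognition and localization, as described above.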

Description

Technical Field

[0001] The invention belongs to the technical field of robot control, and in particular relates to a method for autonomous movement and grasping by a mechanical arm based on visual-tactile fusion under complex lighting conditions.

Background

[0002] In recent years, with the rapid development of sensor technology, navigation technology, and deep learning technology, robot technology has also made great progress. In particular, mobile robotic arm technology has gradually come into wide use in automatic inspection, agricultural picking, warehouse sorting, and other fields. A mobile robotic arm has autonomous navigation and autonomous operation capabilities; compared with a traditional industrial robotic arm, it offers higher flexibility and mobility and can autonomously complete certain tasks in complex environments in place of humans.

[0003] In related technologies, the difficulty often lies in the identification and positioning of target objects; in recent years...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16
CPC: B25J9/1664, B25J9/1674
Inventors: 宋爱国, 魏林琥, 秦超龙, 赵玉
Owner: SOUTHEAST UNIV