
Method for man-machine cooperation sharing control remote welding

A shared-control, human-machine collaboration technology, applied to welding equipment, auxiliary welding equipment, welding/cutting auxiliary equipment, etc. It addresses the problems that direct control operations cannot be carried out continuously under time delay and that autonomous weld-seam tracking cannot be realized when the welding environment is complex, and it achieves the effects of enhanced environmental adaptability, improved autonomy, and improved fault-recovery capability.

Inactive Publication Date: 2012-02-08
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the problems of current remote-control welding that, under time delay, direct control operation cannot be carried out continuously or even at all, and that autonomous weld-seam tracking cannot be realized when the welding environment is complex and the contour dimensions of the weld groove are irregular. A method of human-machine cooperation with shared control for remote welding is therefore proposed.



Examples


Specific embodiment 1

[0016] Specific embodiment 1: this embodiment is described with reference to Figure 1 to Figure 3. The method of this embodiment is realized by the following steps: Step 1: the macro zoom camera 16 collects two-dimensional video images and transmits them to the macro vision display 1 through the video line; at the central monitoring human-machine interface 2 of the local end, the operator 3 adjusts the field of view through the focus control and the second controllable pan-tilt 15;

[0017] Step 2: the operator 3 operates the space mouse 4 to issue a control command, which is transmitted through the industrial Ethernet 7 to the remote robot controller 8 to control the movement of the remote robot 10; the robot 10 guides the welding torch 17 to a position 30-40 mm above the welding seam. At the same time, the robot controller 8 transmits the pose matrix of the welding torch 17 to the central monitoring human-machine interface 2 of the ...
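As a rough illustration only (the patent does not give data formats or code), the torch pose reported by the robot controller 8 in step 2 can be modelled as a 4x4 homogeneous transformation matrix, and the 30-40 mm standoff above the seam can be checked from its translation part. The function names, frame convention, and the seam height parameter below are assumptions.

```python
# Hypothetical sketch: model the torch pose as a 4x4 homogeneous transform
# and verify the 30-40 mm standoff above the seam described in step 2.
import numpy as np

def torch_pose(rotation: np.ndarray, position_mm: np.ndarray) -> np.ndarray:
    """Assemble a 4x4 pose matrix from a 3x3 rotation and a position in mm (assumed convention)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position_mm
    return T

def standoff_ok(torch_T: np.ndarray, seam_z_mm: float) -> bool:
    """Return True if the torch sits 30-40 mm above the seam plane (assumed z-up frame)."""
    height = torch_T[2, 3] - seam_z_mm
    return bool(30.0 <= height <= 40.0)

# Example: identity orientation, torch 35 mm above a seam at z = 0.
T = torch_pose(np.eye(3), np.array([120.0, 250.0, 35.0]))
print(standoff_ok(T, seam_z_mm=0.0))  # True
```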

Specific embodiment 2

[0025] Specific embodiment 2: this embodiment is described with reference to Figure 3. In the degree-of-freedom weighted fusion algorithm of this embodiment, the direct control command of the operator 3 and the visual sensing control command in the robot controller 8 are superimposed according to predetermined weight values, and both control commands act on all degrees of freedom of the robot 10 to jointly change the pose of the welding torch 17. Let the drive matrix of the manual control command of the operator 3 be ΔT₁⁶, with weight value K₁ (0 ≤ K₁ ≤ 1), and let the drive matrix of the visual sensing control command be ΔT₂⁶, with weight value K₂ (0 ≤ K₂ ≤ 1); the fused drive matrix is then ΔT⁶ = K₁ΔT₁⁶ + K₂ΔT₂⁶ (0 ≤ K₁ ≤ 1, 0 ≤ K₂ ≤ 1) ...
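A minimal sketch of this weighted fusion, under the assumption that each 6-DOF drive command can be represented as an increment vector (dx, dy, dz, dRx, dRy, dRz); the variable names, units, and example weights are illustrative, not taken from the patent.

```python
# Hedged sketch of degree-of-freedom weighted fusion: manual and visual-sensing
# drive increments are superimposed with weights K1 and K2, each acting on all
# six degrees of freedom of the robot.
import numpy as np

def fuse_commands(dT1: np.ndarray, dT2: np.ndarray, k1: float, k2: float) -> np.ndarray:
    """Return K1*dT1 + K2*dT2 for two 6-DOF increments (assumed vector form)."""
    assert 0.0 <= k1 <= 1.0 and 0.0 <= k2 <= 1.0
    return k1 * dT1 + k2 * dT2

# Example: the operator nudges the torch sideways while the vision command corrects height.
manual = np.array([0.5, 0.0, 0.0, 0.0, 0.0, 0.0])   # illustrative mm/deg per control cycle
visual = np.array([0.0, 0.0, -0.2, 0.0, 0.0, 0.0])
print(fuse_commands(manual, visual, k1=0.6, k2=0.4))
```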

Specific embodiment 3

[0026] Specific embodiment 3: this embodiment is described with reference to Figure 2. The degree-of-freedom segmentation algorithm of this embodiment divides the six degrees of freedom of the robot so that the direct control command and the visual sensing control command each control different degrees of freedom. The laser stripe of the laser vision controller 9 strikes the surface of the workpiece 14 and is reflected onto the laser vision sensor working head 18; the groove information of the weld is extracted through image processing to obtain the position information (X, Y, Z) of the weld feature point, where X is the horizontal direction, Y is the travel (speed) direction, and Z is the height. Combined with the attitude (Rx, Ry, Rz) of the robot 10 at this time, where Rx is the travel angle, Ry is the working angle, and Rz is the spin angle, the drive matrix of the laser vision sensor control command is obtained; let this drive matrix be ΔT₂ⁿ, with weight value K₂ (0 ≤ K₂ ≤ 1) ...
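The excerpt is truncated, so the exact split of axes is not given; the sketch below only illustrates the idea of segmentation, assuming (as an example) that the laser vision command drives the seam-offset axes X and Z while the operator drives the remaining degrees of freedom. The DOF assignment, names, and values are assumptions.

```python
# Hedged sketch of degree-of-freedom segmentation: each of the six DOFs is driven
# by exactly one source, either the manual command or the visual sensing command.
import numpy as np

DOFS = ("X", "Y", "Z", "Rx", "Ry", "Rz")
VISION_DOFS = {"X", "Z"}  # assumed split, not specified in the truncated excerpt

def segment_commands(manual: np.ndarray, visual: np.ndarray) -> np.ndarray:
    """Compose one 6-DOF increment, taking each axis from its assigned source."""
    out = np.zeros(6)
    for i, name in enumerate(DOFS):
        out[i] = visual[i] if name in VISION_DOFS else manual[i]
    return out

manual = np.array([0.3, 1.0, 0.1, 0.0, 0.5, 0.0])   # illustrative operator increment
visual = np.array([0.2, 0.0, -0.4, 0.0, 0.0, 0.0])  # illustrative vision correction
print(segment_commands(manual, visual))             # X, Z from vision; the rest from the operator
```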


Abstract

The invention relates to a method for man-machine cooperation and shared control of remote welding, i.e. a remote-control welding method. The purpose of the invention is to solve two problems of current remote-control welding: under time delay, direct control operation cannot be carried out continuously or even at all; and when the welding environment is complex and the contour dimensions of the weld groove are irregular, autonomous tracking of the weld seam cannot be realized. Step 1: the macro zoom camera collects two-dimensional video images, and the operator adjusts the field of view in the central monitoring human-machine interface; Step 2: the robot guides the welding torch to the vicinity of the top of the weld; Step 3: the operator tracks the weld; Step 4: the workpiece is fixed on the working platform to form a remote welding environment; Step 5: the robot moves to the starting point of the weld; Step 6: the arc-length distance between the welding torch and the workpiece is set through the central monitoring human-machine interface; Step 7: the operator sets the shared control algorithm through the central monitoring human-machine interface. The invention is used for remote welding.

Description

Technical field

[0001] The invention relates to a remote welding method.

Background technique

[0002] Welding is indispensable in the maintenance of nuclear power plant equipment, underwater construction in marine engineering, and future space station construction. The limitations of these extreme environments make it difficult for an operator to be present on site and directly operate the welding torch or equipment, so traditional welding methods cannot be applied and remote welding must be carried out. Remote welding means that a person in a safe environment away from the site remotely monitors and controls the welding equipment and welding process according to various kinds of sensor information from the site, so as to complete the welding operation. Because of its adaptability to the environment, a robot is used as the actuator in research and practical applications. When performing remote welding, the operator usually operates the robot remotely and ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B23K37/00
Inventor: 李海超, 高洪明, 张广军, 陈洪堂, 吴林
Owner: HARBIN INST OF TECH