Method for controlling robot based on visual sense

A control method and robot technology, applied to the program control of manipulators, manipulators, manufacturing tools, etc., which can solve the problem that existing methods cannot meet the requirements of manipulator control.

Status: Inactive | Publication Date: 2012-02-15
Owner: SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] The operation of the robotic arm requires a series of complex instructions, and a...




Embodiment Construction

[0069] The present invention is described in further detail below in conjunction with an embodiment and the accompanying drawings, but embodiments of the present invention are not limited to this example. Figure 1 shows the frame model diagram.

[0070] The vision-based robot control method includes the following steps (a minimal code sketch of the pipeline follows the list):

[0071] S1. Acquire gesture images of the human hand through a camera;

[0072] S2. Extract the feature points of the human hand from the gesture images;

[0073] S3. Perform three-dimensional reconstruction on the feature points to obtain the positional relationship of the hand feature points in three-dimensional space;

[0074] S4. Transform the coordinate points corresponding to the hand feature points into the robot base coordinate system;

[0075] S5. Use the pose relationship of the human hand in the robot base coordinate system to perform an inverse-solution calculation and obtain the joint angles of the robot;

[0076] S6. Use the calculated joint angles to drive the robot.
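The sketch below illustrates steps S2-S5 for one pair of gesture images, assuming a calibrated stereo camera pair with projection matrices P_left and P_right, an extrinsic transform T_base_camera from the camera frame to the robot base frame, and caller-supplied hand-feature detection and inverse-kinematics routines. All function and variable names are illustrative assumptions, not APIs defined by the patent; image capture (S1) and joint driving (S6) depend on the specific camera and robot driver and are left to the caller.

```python
import numpy as np
import cv2  # OpenCV is used here only for stereo triangulation


def hand_gesture_to_joint_angles(img_left, img_right, P_left, P_right,
                                 T_base_camera, detect_hand_features,
                                 inverse_kinematics):
    """Sketch of steps S2-S5 for one pair of gesture images (hypothetical API)."""
    # S2: extract 2D hand feature points in each view, shape (N, 2)
    pts_l = np.asarray(detect_hand_features(img_left), dtype=np.float64)
    pts_r = np.asarray(detect_hand_features(img_right), dtype=np.float64)

    # S3: three-dimensional reconstruction by stereo triangulation;
    # cv2.triangulatePoints takes 2xN arrays and returns 4xN homogeneous points
    pts4d = cv2.triangulatePoints(P_left, P_right, pts_l.T, pts_r.T)
    pts3d_cam = (pts4d[:3] / pts4d[3]).T          # (N, 3) in the camera frame

    # S4: transform the feature points into the robot base coordinate system
    pts_hom = np.hstack([pts3d_cam, np.ones((len(pts3d_cam), 1))])
    pts3d_base = (T_base_camera @ pts_hom.T).T[:, :3]

    # S5: inverse-solution calculation from the hand pose in the base frame
    # (here the hand pose is reduced to the centroid of its feature points)
    target = pts3d_base.mean(axis=0)
    return inverse_kinematics(target)
```

In this reading, the pipeline is a pure function from a pair of images to joint angles, which makes each step easy to replace with the detector, calibration, or robot model of a concrete embodiment.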



Abstract

The invention provides a method for controlling a robot based on visual sense. The method comprises the following steps: (1) acquiring a gesture image of a human hand with a camera; (2) extracting feature points of the human hand from the gesture image; (3) performing three-dimensional reconstruction on the feature points to obtain the positional relationship of the hand feature points in three-dimensional space; (4) transforming the coordinate points corresponding to the hand feature points into the base coordinate system of the robot; (5) performing an inverse-solution calculation using the pose of the human hand in the robot base coordinate system to obtain the joint angles of the robot; and (6) driving the robot with the calculated joint angles. The method has the following advantages: (1) the control is intuitive, since the robot's grasp pose corresponds directly to the gesture of the human hand; (2) the control is flexible and requires no contact with cumbersome interaction devices; (3) the imitation-based approach helps the operator work more accurately and safely; (4) the operation may be interrupted and resumed, and the operator may be replaced midway; and (5) the operator does not need to move about over a wide area, which reduces the operator's workload.
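For concreteness, the sketch below works through steps (4) and (5) numerically: a hand position measured in the camera frame is re-expressed in the robot base frame with a homogeneous transform, and a closed-form inverse solution then yields joint angles. The 2-link planar arm, the extrinsic transform, and all numbers are illustrative assumptions; the patent does not fix a particular robot model or calibration.

```python
import numpy as np


def to_base_frame(p_cam, T_base_cam):
    """Map a 3D point from camera coordinates into robot base coordinates."""
    return (T_base_cam @ np.append(p_cam, 1.0))[:3]


def planar_2link_ik(x, y, l1=0.4, l2=0.3):
    """Closed-form inverse solution for a 2-link planar arm (elbow-down branch)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)  # guard against small numerical overshoot
    theta2 = np.arccos(c2)
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2),
                                           l1 + l2 * np.cos(theta2))
    return theta1, theta2


# Assumed extrinsic calibration: camera 0.5 m above the base, optical axis
# along the base x-axis (not a value taken from the patent).
T_base_cam = np.array([[ 0.0,  0.0, 1.0, 0.2],
                       [-1.0,  0.0, 0.0, 0.0],
                       [ 0.0, -1.0, 0.0, 0.5],
                       [ 0.0,  0.0, 0.0, 1.0]])
p_hand_cam = np.array([0.1, 0.2, 0.3])          # hand centroid in camera frame (m)
p_hand_base = to_base_frame(p_hand_cam, T_base_cam)
theta1, theta2 = planar_2link_ik(p_hand_base[0], p_hand_base[1])
print("hand in base frame:", p_hand_base)
print("joint angles (deg):", np.degrees([theta1, theta2]))
```

Running this with the assumed calibration maps the camera-frame point to (0.5, -0.1, 0.3) in the base frame and returns one valid pair of joint angles for the assumed arm; a real embodiment would substitute the kinematic model of the actual robot.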

Description

Technical field

[0001] The invention belongs to the field of robot human-computer interaction, and in particular relates to a vision-based robot control method.

Background technique

[0002] With the increasing application of robots, especially as robots enter people's daily lives, research on peer-to-peer interaction between humans and robots has received more and more attention. In peer-to-peer interaction, what humans and robots need is increasingly a partnership, rather than a simple relationship between a person and a tool.

[0003] Both teleoperated robot systems and fully autonomous robot systems have shortcomings, which gives human-robot collaborative systems great potential. In this type of system, human and robot members cooperate to complete the target task according to their respective abilities. The advantage of this team mode is that the intelligence of human and robot can be integrated to complete the target task effectively.

[0004] Resear...

Claims


Application Information

IPC(8): B25J9/16
Inventor: 张平 (Zhang Ping), 杜广龙 (Du Guanglong)
Owner: SOUTH CHINA UNIV OF TECH