
Robotic arm grasping method based on semantic laser interaction

A robotic arm and laser technology, applied to manipulators, program-controlled manipulators, image analysis, etc., which solves the problems of unfriendly human-computer interaction, inconvenient operation and poor user experience, achieving a novel interaction method, improved ease of use and more convenient operation.

Active Publication Date: 2022-06-17
HARBIN INST OF TECH AT WEIHAI +1

AI Technical Summary

Problems solved by technology

[0004] The present invention aims to solve the technical problems that the existing arm-mounted wheelchair-type robot for assisting the elderly and the disabled, when operated by a handle, is inconvenient to operate, burdensome, imprecise and inefficient, and that its human-computer interaction is not friendly enough and its user experience is poor. It provides a robotic arm grasping method based on semantic laser interaction that is more convenient to operate, less burdensome, more precise and more efficient, with friendlier human-computer interaction and a better user experience.


Examples

Embodiment 1

[0125] Referring to Figure 4, multiple objects (such as cups, spoons and water bottles) are placed on the table, and the RGB-D camera 30 photographs the objects on the table in real time. The elderly user sits in the electric wheelchair 10, holds the laser pointer 60 and presses its switch so that it emits a laser beam onto the target object (such as a cup), forming a laser spot on it. In this embodiment the laser spot serves as the human-computer interaction medium: the system recognizes the target object according to the laser spot on it, and the robotic arm 20 automatically grasps that object (that is, it grasps the object on the table and moves it to a fixed spatial position), realizing the function of picking up objects by laser pointing. As shown in Figure 5, the specific control method is as follows:

[0126] Step S101, the RGB-D camera 30 is used to photograph the area where the objects on the table are located, to acqu...
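Embodiment 1 hinges on locating the laser spot in the RGB image and converting it into a 3D point via the registered depth map. The following is a minimal sketch of how that could be done with OpenCV, assuming a red laser pointer, an OpenCV 4.x installation and known pinhole intrinsics fx, fy, cx, cy; the function name and the colour thresholds are illustrative assumptions, not details taken from the patent.

```python
import cv2
import numpy as np

def detect_laser_spot(rgb, depth, fx, fy, cx, cy):
    """Locate a bright red laser spot in an RGB frame and back-project it to 3D.

    rgb   : HxWx3 uint8 BGR image from the RGB-D camera
    depth : HxW float depth map in meters, registered to the RGB image
    fx, fy, cx, cy : pinhole intrinsics (assumed known from calibration)
    Returns (x, y, z) in the camera frame, or None if no spot is found.
    """
    hsv = cv2.cvtColor(rgb, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 0/180 in OpenCV's HSV space; thresholds are illustrative only.
    mask = cv2.inRange(hsv, (0, 100, 220), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 100, 220), (180, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None

    # Treat the largest blob as the laser spot and use its centroid.
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    u, v = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

    z = float(depth[v, u])
    if z <= 0:                      # missing depth reading at the spot
        return None
    # Pinhole back-projection from pixel (u, v) and depth z to a camera-frame point.
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)
```

The back-projected point lies in the camera frame; it would then be transformed into the base frame of the robotic arm 20 using the hand-eye calibration between the camera 30 and the arm before any grasp is planned.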

Embodiment 2

[0153] Referring to Figure 4, multiple objects (such as cups, spoons and water bottles) are placed on the table, and the RGB-D camera 30 photographs the objects on the table in real time. The elderly user sits in the electric wheelchair 10, holds the laser pointer 60 and presses its switch so that the laser irradiates the water bottle, the first target object, forming a laser spot on it. After a period of time t1, the laser output by the laser pointer 60 is moved to irradiate a certain point on the table, which serves as the second target, and stays there for a period of time t2. This embodiment mainly recognizes the two targets according to the two laser spots appearing at different positions, so that the robotic arm 20 automatically grasps the first target object and moves it to the indicated fixed position on the table (that is, the object is moved to another fixed spatial position on the desktop). As shown in Figure 7, the spec...
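Embodiment 2 distinguishes the two selections by the dwell times t1 and t2 of the laser spot. One way such dwell-based selection could be tracked is sketched below, fed with a per-frame 3D spot position from a detector like the one above; the class name, dwell threshold and motion tolerance are assumptions for illustration, not details from the patent.

```python
import time

class DwellSelector:
    """Confirm laser-pointer selections that hold still for a minimum dwell time."""

    def __init__(self, dwell_s=1.5, move_tol_m=0.03, n_targets=2):
        self.dwell_s = dwell_s        # required dwell time (plays the role of t1 / t2)
        self.move_tol_m = move_tol_m  # how far the spot may drift and still count as one target
        self.n_targets = n_targets    # number of selections to collect (two in this embodiment)
        self.targets = []             # confirmed 3D selections, in order
        self._anchor = None           # (point, start_time) of the current candidate

    def update(self, spot_xyz):
        """Feed the latest spot position (or None); returns the target list once complete."""
        now = time.monotonic()
        if spot_xyz is None:
            self._anchor = None
            return self._done()
        # Ignore the spot while it still rests on the selection that was just confirmed.
        if self.targets and self._dist(spot_xyz, self.targets[-1]) <= self.move_tol_m:
            self._anchor = None
            return self._done()
        # The spot jumped to a new place: restart the dwell timer there.
        if self._anchor is None or self._dist(spot_xyz, self._anchor[0]) > self.move_tol_m:
            self._anchor = (spot_xyz, now)
            return self._done()
        # The spot has held still long enough: confirm it as the next target.
        if now - self._anchor[1] >= self.dwell_s:
            self.targets.append(self._anchor[0])
            self._anchor = None
        return self._done()

    def _done(self):
        return list(self.targets) if len(self.targets) == self.n_targets else None

    @staticmethod
    def _dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
```

Requiring the spot to hold still for a minimum time filters out the transient spots swept across the scene while the user moves the pointer from the first target to the second.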

Embodiment 3

[0200] Referring to Figure 4, multiple objects (such as cups, spoons and water bottles) are placed on the table, and the RGB-D camera 30 photographs the objects on the table in real time. The elderly user sits in the electric wheelchair 10, holds the laser pointer 60 and presses its switch so that the laser irradiates the water bottle, the first target object, forming a laser spot on it. After the spot stays there for a period of time t1, the laser output by the laser pointer 60 is moved to a certain point on the cup, the second target object, and stays there for a period of time t2. This embodiment identifies the two target objects according to the laser spots on them, so that the robotic arm 20 automatically grasps the first target object, the water bottle, moves it to a fixed position above the second target object, the cup, and pours water into the cup. As shown in Figure 8, the specific control method is as foll...
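Once the water bottle's grasp pose and the cup's position are known, the pick-and-pour behaviour of Embodiment 3 reduces to a fixed sequence of arm motions. The sketch below strings those motions together against a hypothetical `Arm` controller exposing move_to, close_gripper, rotate_wrist and open_gripper; these calls, the hover offset and the pour angle are illustrative assumptions, not an interface disclosed by the patent.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0

def above(p: Pose, dz: float) -> Pose:
    """Same orientation as p, lifted dz meters along the base-frame z axis."""
    return Pose(p.x, p.y, p.z + dz, p.roll, p.pitch, p.yaw)

def pick_and_pour(arm, grasp_pose: Pose, cup_xyz, hover=0.15, pour_angle=1.9):
    """Grasp the bottle at grasp_pose, pour above the cup, then put the bottle back."""
    # Approach from above, descend to the grasp pose and close the gripper.
    arm.move_to(above(grasp_pose, hover))
    arm.move_to(grasp_pose)
    arm.close_gripper()

    # Lift the bottle and carry it to a fixed position above the cup.
    arm.move_to(above(grasp_pose, hover))
    cx, cy, cz = cup_xyz
    arm.move_to(Pose(cx, cy, cz + hover, grasp_pose.roll, grasp_pose.pitch, grasp_pose.yaw))

    # Tilt the wrist to pour, then tilt back upright.
    arm.rotate_wrist(pour_angle)
    arm.rotate_wrist(-pour_angle)

    # Return the bottle to where it was picked up and release it.
    arm.move_to(above(grasp_pose, hover))
    arm.move_to(grasp_pose)
    arm.open_gripper()
```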


Abstract

The invention relates to a robotic arm grasping method based on semantic laser interaction, which solves the technical problems that the existing arm-mounted wheelchair-type robot for assisting the elderly and the disabled, when operated by a handle, is inconvenient to operate, burdensome, imprecise and inefficient, and that its human-computer interaction is not friendly enough and its user experience is poor. The method includes the following steps: acquire an image of the objects; identify the laser spot in the image; identify the target object according to the laser spot; determine the grasping pose of the target object; when the laser spot disappears, move the robotic arm according to the grasping pose to grasp the target object; and move the target object with the robotic arm. The invention is widely applicable to controlling a robotic arm to grasp objects and to operating arm-mounted wheelchair-type robots for assisting the elderly and the disabled.
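Read as a control loop, the steps listed in the abstract map onto a simple pipeline: acquire a frame, detect the spot, resolve the object under it, plan a grasp, and trigger the arm only once the spot disappears. The sketch below illustrates that sequencing only; the camera and arm objects and the detect_laser_spot, segment_object_at and plan_grasp helpers are assumptions, not functions disclosed by the patent.

```python
def laser_grasp_loop(camera, arm, detect_laser_spot, segment_object_at, plan_grasp):
    """One illustration of the abstract's control flow, built on assumed helpers."""
    grasp_pose = None
    while True:
        rgb, depth = camera.read()                         # acquire the image
        spot = detect_laser_spot(rgb, depth)               # identify the laser spot
        if spot is not None:
            obj = segment_object_at(rgb, depth, spot)      # target object under the spot
            if obj is not None:
                grasp_pose = plan_grasp(obj, depth)        # grasping pose for that object
        elif grasp_pose is not None:
            arm.grasp(grasp_pose)                          # spot gone: execute the grasp
            arm.move_to_fixed_place()                      # move the target object away
            grasp_pose = None
```

The detector argument could be, for example, the Embodiment 1 sketch with its camera intrinsics pre-bound via functools.partial; the same loop structure also covers the two-spot embodiments if the detection step is replaced by a dwell-based selector.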

Description

Technical field

[0001] The invention relates to the technical field of robots for assisting the elderly and the disabled, and in particular to a robotic arm grasping method based on semantic laser interaction.

Background technique

[0002] As is well known, China is in a stage of rapid population aging, and the number of elderly and disabled people in society who need nursing care keeps increasing.

[0003] With reference to the Chinese invention patent applications with application publication numbers CN109048918A, CN109262632A and CN107595505A, arm-mounted wheelchair-type robots for assisting the elderly and the disabled combine the functions of carrying, grasping and moving objects. The human-computer interaction of this type of robot is mainly handle operation, but the handle operation me...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16, G06T7/73
CPC: B25J9/1697, B25J9/1679, G06T7/73, G06T2207/10024, G06T2207/20081, G06T2207/20084
Inventor: 刘亚欣, 钟鸣, 王思瑶, 姚玉峰
Owner: HARBIN INST OF TECH AT WEIHAI