Method for controlling robot by body sense, robot and control device thereof

A somatosensory control and robot technology, applied in the field of visual recognition, which addresses problems such as the inability of existing methods to guarantee real-time performance, the inability to recognize the operator's posture promptly and accurately, and the resulting inability to control the robot to imitate the operator's posture promptly and accurately.

Inactive Publication Date: 2019-01-11
南昌与德通讯技术有限公司
Cites: 11 · Cited by: 19

AI Technical Summary

Problems solved by technology

Prior art 2 performs a global computation over all human body joints by solving an integer linear programming problem. Because integer linear programming is NP-hard, the average processing time for such a problem ranges from several minutes to several hours, so real-time performance cannot be guaranteed.
[0005] In summary, when existing human body detection methods are used to control a robot to follow a person's movements, the operator's posture cannot be recognized promptly and accurately in scenes with many people, so the robot cannot be controlled promptly and accurately to imitate the operator's posture.

Method used



Examples


Embodiment Construction

[0020] To make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the various implementations of the present invention are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will understand, however, that many technical details are provided in each implementation merely to help the reader understand the present application; the technical solution claimed in this application can still be realized without these details and with various changes and modifications to the following implementations.

[0021] The first embodiment of the present invention relates to a method for somatosensory control of a robot. The core of this embodiment is a method for somatosensory control of a robot, including: acquiring a color image and a depth image containing the operator, and inputting the color image into the conv...
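
As an illustration of the registration step described in the abstract below, in which the two-dimensional joint coordinates are mapped into the registered depth image to obtain three-dimensional coordinates, the following is a minimal sketch assuming a pinhole camera model; the intrinsics, depth scale and function name are assumptions for illustration and are not taken from the patent.

```python
# A minimal sketch, assuming a pinhole camera model, of mapping a 2D joint pixel
# detected on the color image to 3D camera coordinates using the registered depth
# image. The intrinsics (fx, fy, cx, cy), the depth scale and the function name
# are illustrative assumptions; the patent does not specify these values.
import numpy as np

def backproject_joint(u, v, depth_registered, fx, fy, cx, cy, depth_scale=0.001):
    """Map a 2D joint pixel (u, v) to 3D camera coordinates in meters."""
    z = depth_registered[int(v), int(u)] * depth_scale  # depth at the joint pixel
    x = (u - cx) * z / fx                               # back-project along x
    y = (v - cy) * z / fy                               # back-project along y
    return np.array([x, y, z])

# Example with made-up numbers: a joint at pixel (320, 240) in a 640x480 depth map
depth = np.full((480, 640), 1500, dtype=np.uint16)      # 1.5 m everywhere, in millimeters
print(backproject_joint(320, 240, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5))
# -> approximately [0.0014, 0.0014, 1.5]
```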



Abstract

The embodiment of the invention relates to the technical field of visual recognition, and discloses a method for controlling a robot by body sense, a robot and a control device thereof. The method for controlling the robot by body sense includes the following steps: a color image and a depth image containing an operator are obtained; the color image is input into a convolutional neural network to obtain the two-dimensional coordinates of the operator's joint points on the color image and the connection relation between the joint points; the depth image and the color image are registered to obtain the three-dimensional coordinates onto which the operator's two-dimensional joint coordinates are mapped in the depth image; the angles of the operator's joints are calculated from the three-dimensional coordinates and the connection relation between the joint points; and, according to the three-dimensional coordinates and the joint angles, the robot is controlled to follow the operator's actions. With this method, robot and control device, the operator's posture can be recognized promptly and accurately even when multiple people are present, and the robot can imitate the operator's posture promptly and accurately.
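
As a worked illustration of the step in which joint angles are calculated from the three-dimensional coordinates and the connection relation between joint points, the sketch below computes the angle at one joint from the two limb segments meeting there; the joint names, sample coordinates and formula choice are assumptions, since the patent text does not specify them.

```python
# A minimal sketch, assuming 3D joint positions are already available, of computing the
# angle at one joint from the two limb segments that meet there. The joint names, sample
# coordinates and function name are hypothetical; the patent only states that joint angles
# are calculated from the 3D coordinates and the connection relation between joint points.
import numpy as np

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint` between the segments joint->parent and joint->child."""
    v1 = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    v2 = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: an elbow angle from shoulder, elbow and wrist positions (meters, camera frame)
shoulder, elbow, wrist = [0.0, 0.0, 1.5], [0.25, 0.0, 1.5], [0.25, -0.25, 1.5]
print(joint_angle(shoulder, elbow, wrist))  # -> approximately 90.0
```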

Description

Technical field

[0001] Embodiments of the present invention relate to the field of visual recognition technology, and in particular to a method for controlling a robot by somatosensory control, a robot, and a control device thereof.

Background technique

[0002] At present, applications of visual monitoring and recognition are becoming increasingly widespread. Extraction of the human skeleton in fields such as security, transportation and entertainment is the basis of many behavior recognition and somatosensory interaction systems. Since the successful application of deep networks, this technology has improved greatly on Microsoft's COCO dataset. Following the large-scale adoption of deep learning, common methods for detecting human bodies are as follows:

[0003] Prior art 1: a pedestrian detection frame is used to select a single pedestrian in the image, and the pose of that single person within the frame is then estimated. However, in prior art 1, when there are too many people in th...


Application Information

IPC (8): B25J9/16, G06F3/01, G06K9/00
CPC: G06F3/011, B25J9/16, G06V40/10
Inventor: 刘艺成, 周宸, 李元媛, 费小平, 郭汉超
Owner: 南昌与德通讯技术有限公司