
Reconstruction of human emulated robot working scene based on multiple information integration

A multi-information fusion technology for reconstructing robot operation scenes, applied to manipulators, manufacturing tools, etc. It addresses the problems of large transmission delay, limited viewing angle, and the inability of existing displays to truly reflect the robot's operation situation, achieving the effect of maintaining continuity of the displayed scene.

Status: Inactive
Publication Date: 2008-05-21
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

However, such systems have the following disadvantages: 1) they cannot provide 3D information, so operation at an actual 3D position is difficult to realize; 2) the viewing angle is limited, because the cameras installed at the robot's work site are fixed in position and cannot provide a comprehensive view; 3) the time delay is large, because the video image files transmitted over the network carry a large volume of data, and with limited network bandwidth there is a large delay in transmission.
Prediction-and-simulation systems have a further disadvantage: what such a system provides to the operator is only a prediction and simulation of the robot and its work site, which cannot truly reflect the robot's actual operation situation.


Image

  • Reconstruction of human emulated robot working scene based on multiple information integration

Examples


Embodiment Construction

[0026] The overall working process of the humanoid robot working scene is as follows:

[0027] In the first step, the robot starts running and teleoperation control begins. The computer program is started to display the established scene model. Initialization data are used to determine the initial positions of the robot model and its operation target model, and to determine the initial angles between the links of the robot model. What this step generates is the initialization interface of the scene.
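
The data involved in this initialization step can be sketched roughly as follows. This is a minimal Python illustration; the class names (SceneModel, RobotModel, TargetModel) and the shape of the initialization data are assumptions for the sketch, not names taken from the patent.

```python
# A minimal sketch of the initialization step. SceneModel, RobotModel and
# TargetModel are hypothetical placeholder classes, not names from the patent.
from dataclasses import dataclass
from typing import List


@dataclass
class RobotModel:
    base_position: List[float]    # initial position of the robot model in the scene
    joint_angles: List[float]     # initial angles between the links of the robot model


@dataclass
class TargetModel:
    position: List[float]         # initial position of the operation target model


@dataclass
class SceneModel:
    robot: RobotModel
    target: TargetModel

    def render(self) -> None:
        # Stand-in for rendering the 3D initialization interface.
        print(f"robot base {self.robot.base_position}, "
              f"joints {self.robot.joint_angles}, target {self.target.position}")


def initialize_scene(init_data: dict) -> SceneModel:
    """Build the initial scene from the initialization data (step 1)."""
    scene = SceneModel(
        robot=RobotModel(base_position=init_data["robot_position"],
                         joint_angles=init_data["joint_angles"]),
        target=TargetModel(position=init_data["target_position"]),
    )
    scene.render()  # display the initialization interface of the scene
    return scene


if __name__ == "__main__":
    # Example initialization data; real values would come from the robot work site.
    initialize_scene({
        "robot_position": [0.0, 0.0, 0.0],
        "joint_angles": [0.0] * 12,
        "target_position": [1.2, 0.5, 0.0],
    })
```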

[0028] In the second step, the scene data processing module receives the operation commands issued by the teleoperator in real time, interprets them, and generates predicted trajectory data. The prediction data are used to drive each model in the virtual scene, forming a three-dimensional virtual scene. The scene generated in this step shows the ideal motion picture of the robot executing the commands issued by the operator.
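
A rough sketch of how such a scene data processing module might turn an operator command into predicted trajectory data and drive the virtual scene is given below. The command format and the linear interpolation used as the "prediction" are illustrative assumptions, not details given in the patent.

```python
# Minimal sketch of step 2: an operator command is interpreted into a predicted
# joint trajectory, and the predicted data drive the virtual scene frame by frame.
# The command format and the interpolation scheme are illustrative assumptions.
from typing import Dict, List


def interpret_command(command: Dict, current_angles: List[float],
                      steps: int = 50) -> List[List[float]]:
    """Turn a teleoperation command into predicted trajectory data.

    Here the command simply names target joint angles; the prediction is a
    linear interpolation from the current configuration to the target.
    """
    target = command["target_joint_angles"]
    return [[c + (i / steps) * (t - c) for c, t in zip(current_angles, target)]
            for i in range(1, steps + 1)]


def drive_virtual_scene(trajectory: List[List[float]],
                        render=lambda angles: print("frame:", angles)) -> None:
    """Apply each predicted configuration to the robot model and re-render,
    producing the ideal motion picture of the commanded action."""
    for angles in trajectory:
        render(angles)


if __name__ == "__main__":
    cmd = {"target_joint_angles": [0.3, -0.2, 0.5]}
    predicted = interpret_command(cmd, current_angles=[0.0, 0.0, 0.0], steps=5)
    drive_virtual_scene(predicted)
```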

[0029] In the third step, the sensor of the robot i...



Abstract

The invention discloses a humanoid robot working scene that associates real-time video-like images with the operator's control commands and with feedback information. The scene can display a model of the humanoid robot work site and a model of the operation environment; it can receive position information from the humanoid robot work site and sensing information produced during robot operation, and it uses this information to drive the motion of the humanoid robot model and the environment model so that the scene is displayed like a real-time video image. The scene also performs prediction according to the commands sent by the operator, generating the operating data of the humanoid robot and of every model under ideal conditions; when the real-time feedback information is missing or cannot be obtained, the data generated by the prediction simulation drive the models.
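
The key mechanism in the abstract, switching between feedback-driven and prediction-driven display so that the shown motion stays continuous, could look roughly like the sketch below. The class name, the staleness threshold, and the joint-angle representation are all assumptions made for illustration.

```python
# Minimal sketch of the driving-data selection described in the abstract:
# real-time feedback drives the models when available; when feedback is missing
# or cannot be obtained, predicted data take over so the display stays continuous.
import time
from typing import List, Optional


class SceneDriver:
    def __init__(self, feedback_timeout: float = 0.2):
        self.feedback_timeout = feedback_timeout   # seconds before feedback counts as missing
        self._last_feedback_time = 0.0
        self._last_feedback: Optional[List[float]] = None

    def on_feedback(self, joint_angles: List[float]) -> None:
        """Called whenever position/sensing information arrives from the work site."""
        self._last_feedback = joint_angles
        self._last_feedback_time = time.monotonic()

    def select_driving_data(self, predicted: List[float]) -> List[float]:
        """Choose the data that will drive the models for the next frame."""
        fresh = (time.monotonic() - self._last_feedback_time) < self.feedback_timeout
        if self._last_feedback is not None and fresh:
            return self._last_feedback     # real feedback available: use it
        return predicted                   # feedback missing: fall back to prediction


if __name__ == "__main__":
    driver = SceneDriver()
    driver.on_feedback([0.1, 0.0, 0.2])
    print(driver.select_driving_data(predicted=[0.12, 0.01, 0.22]))  # feedback used
    time.sleep(0.3)
    print(driver.select_driving_data(predicted=[0.15, 0.02, 0.25]))  # prediction used
```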

Description

Technical field:

[0001] The invention belongs to the field of robots and is mainly used for three-dimensional reconstruction of a humanoid robot operation scene. It is suitable for robot teleoperation control and can display three-dimensional images of the humanoid robot and the objects in the operation scene in real time, providing visual presence for humanoid robot teleoperation.

Background technique:

[0002] A humanoid robot is a robot that has human appearance characteristics and can simulate basic human actions. Teleoperation is an important technology for robotic applications. Through a teleoperation platform, operators can monitor and control remote robots to complete various tasks, so that humanoid robots can replace humans in environments that are inaccessible or that endanger human health or safety.

[0003] The image display of the working environment is a key technology of teleoperation control. At present, the imag...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): B25J19/00
Inventor 黄强张雷卢月品高峻峣李敏
Owner BEIJING INSTITUTE OF TECHNOLOGYGY