Virtual-real fusion man-machine collaborative simulation method and system

A human-machine collaboration and virtual-real fusion technology, applied in the field of virtual-real fusion human-machine collaborative simulation, which can solve the problems of uncertainty, danger, the need to build large numbers of models, and high cost, and achieves the effects of ensuring human safety, more realistic simulation, and cost savings.

Active Publication Date: 2020-07-24
NANJING INST OF TECH

AI Technical Summary

Problems solved by technology

At present, there are two main simulation methods in human-machine collaboration research. One is to use a virtual simulation environment for human-machine collaborative simulation, that is, to establish a three-dimensional model of the robot and a three-dimensional model of the human body in the computing environment, and to carry out research on human-machine interaction and human-machine collaboration by driving the models' movement. However, this method requires the establishment of a large number of models, and the mod ...



Examples


Specific Embodiment 1

[0061] With reference to Figure 1, the present invention proposes a virtual-real fusion human-machine collaborative simulation method, the simulation method comprising:

[0062] S1. Build a virtual robot model, drive the movement of the virtual robot model, generate an augmented reality scene, and send the generated augmented reality scene to the associated vision device.

[0063] S2. Collect and generate a three-dimensional pose sequence of the human body.

[0064] S3. Receive the user's hand position information and corresponding force application data returned by the data glove matched with the vision device.

[0065] S4. According to the human body 3D pose sequence and the result returned by the data glove, calculate the 3D pose sequence of the human body, arms, and hands, together with the corresponding force application information.

[0066] S5. Combine the 3D pose sequence of the human body, arms, and hands, the corresponding force application information, and the motion simulation results of the virtual...
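As a rough illustration only, the Python sketch below strings steps S2-S5 together into a single simulation step; the class and function names (HandState, BodyPoseFrame, fuse_body_and_hand, simulation_step) are assumptions for exposition and are not defined by the patent.

```python
# Hypothetical sketch of steps S2-S5; every name here is illustrative, not from the patent.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HandState:
    position: Tuple[float, float, float]  # hand position returned by the data glove (S3)
    force: float                          # corresponding force application data (S3)

@dataclass
class BodyPoseFrame:
    joints: List[Tuple[float, float, float]]  # one frame of the human 3D pose sequence (S2)

def fuse_body_and_hand(pose_seq: List[BodyPoseFrame], hand: HandState) -> dict:
    """S4: combine the body pose sequence with the glove result into a single
    body/arm/hand pose-and-force description (placeholder computation)."""
    return {"poses": pose_seq, "hand_position": hand.position, "hand_force": hand.force}

def simulation_step(pose_seq: List[BodyPoseFrame], hand: HandState, robot_sim_result: dict) -> dict:
    """S5 (partial): merge the fused human state with the virtual robot's motion
    simulation result so that collision and force analysis can run on both."""
    human_state = fuse_body_and_hand(pose_seq, hand)
    return {"human": human_state, "robot": robot_sim_result}

if __name__ == "__main__":
    frame = [BodyPoseFrame(joints=[(0.0, 0.0, 1.7)])]       # a single illustrative pose frame
    glove = HandState(position=(0.3, 0.1, 1.2), force=2.5)  # a light grasp
    print(simulation_step(frame, glove, robot_sim_result={"joint_angles": [0.0] * 6}))
```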

Specific Embodiment 2

[0104] With reference to Figure 2, and based on the foregoing method, the present invention also provides a virtual-real fusion human-machine collaborative simulation system, which includes a human body three-dimensional posture acquisition device, a vision device (such as augmented reality glasses), a data glove, and a graphics workstation.

[0105] The human body three-dimensional posture acquisition device is used to collect and generate a human body three-dimensional posture sequence and send it to the graphics workstation.

[0106] The graphics workstation is used to build a virtual robot model, drive the virtual robot model to move, generate an augmented reality scene, and send the generated augmented reality scene to the vision device.

[0107] The vision device and the data glove are worn on the user and connected to the graphics workstation, the vision device is used to display the augmented reality scene including the virtual robot model sent by the graphics workstation ...
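Purely as an illustrative sketch of how these four components might be wired together in software, the following Python stub (all class and method names are assumptions, not from the patent) shows the workstation pushing an AR scene to the vision device and pulling back pose and glove data:

```python
# Illustrative wiring of the four system components; all class and method names are assumptions.
class PoseAcquisitionDevice:
    def capture_pose_sequence(self):
        # Collect and generate the human 3D posture sequence for the workstation.
        return [{"frame": 0, "joints": []}]

class DataGlove:
    def read(self):
        # Return hand position information and force application data.
        return {"hand_position": (0.0, 0.0, 0.0), "force": 0.0}

class VisionDevice:
    def display(self, ar_scene):
        # Show the augmented reality scene containing the virtual robot model.
        print("displaying AR scene:", ar_scene["name"])

class GraphicsWorkstation:
    def __init__(self, vision, glove, acquisition):
        self.vision, self.glove, self.acquisition = vision, glove, acquisition

    def run_once(self):
        # Build/drive the virtual robot model, generate the AR scene, push it to the
        # vision device, then pull back pose and glove data for the co-simulation step.
        ar_scene = {"name": "virtual_robot_scene", "robot_joint_angles": [0.0] * 6}
        self.vision.display(ar_scene)
        return self.acquisition.capture_pose_sequence(), self.glove.read()

workstation = GraphicsWorkstation(VisionDevice(), DataGlove(), PoseAcquisitionDevice())
pose_sequence, glove_data = workstation.run_once()
```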

Specific Embodiment 3

[0128] The present invention can also simulate the interactive behavior between a virtual human and a physical robot: the collection of three-dimensional pose information and force information of the human is replaced with the collection of three-dimensional pose information and force information of the physical robot, and, combined with the motion model of the created virtual human, a method similar to the one described above is used to realize the collaborative simulation process between the virtual human and the physical robot. The 3D pose information and force information of the physical robot can be collected by the aforementioned methods, or obtained by computing over the data of the many sensors and controllers installed on the physical robot; the motion model of the created virtual human needs to be endowed with physical attributes and motion attributes, so as to create a motion model similar to the aforementioned virtual robot model.
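The symmetry described here, swapping the tracked human for an instrumented physical robot while keeping the downstream co-simulation unchanged, could be organised roughly as in the following hedged Python sketch (all names and sensor fields are illustrative assumptions):

```python
# Hedged sketch of the data-source swap in Embodiment 3; names and sensor fields are assumptions.
def collect_state(source):
    """Return (3D pose info, force info) from either a tracked human or an instrumented
    physical robot; the downstream co-simulation pipeline stays the same."""
    return source.pose_3d(), source.force()

class PhysicalRobotSource:
    def __init__(self, sensor_readings):
        self.sensor_readings = sensor_readings  # e.g. joint encoders, force/torque sensors

    def pose_3d(self):
        return self.sensor_readings.get("joint_positions", [])

    def force(self):
        return self.sensor_readings.get("end_effector_force", 0.0)

class VirtualHumanModel:
    def __init__(self, mass_kg, joint_limits):
        # Physical and motion attributes, analogous to the virtual robot model.
        self.mass_kg = mass_kg
        self.joint_limits = joint_limits

robot_pose, robot_force = collect_state(
    PhysicalRobotSource({"joint_positions": [0.1] * 6, "end_effector_force": 5.0}))
virtual_human = VirtualHumanModel(mass_kg=70.0, joint_limits={"elbow": (0.0, 2.6)})
```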

[0129] Aspects of the invention ...



Abstract

The invention discloses a virtual-real fusion man-machine collaborative simulation method. The method comprises the following steps: calculating a three-dimensional pose sequence of the human body, arms, and hands and the corresponding force application information according to the three-dimensional pose information of the human body and the result returned by a data glove; and, combining the three-dimensional pose sequence of the body, arms, and hands, the corresponding force application information, and the motion simulation result of a virtual robot model, obtaining the position interaction information and force interaction information between the virtual robot and the human body three-dimensional pose sequence through detection based on a collision detection algorithm and a physical simulation algorithm, and carrying out simulation analysis of the man-machine cooperation process. The method makes it possible to verify and experiment with various control algorithms in man-machine cooperation research, and to experiment with the interaction, collision, force, action coordination, and other behaviors between a person and the robot. The method is closer to a real scene, truly reflects human motion, avoids building a complex model, guarantees the safety of the human body, lets the person genuinely feel the cooperation process, and is more accurate in simulation.
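The abstract does not name a specific collision detection or physical simulation algorithm. As one minimal, assumed illustration, a bounding-sphere proximity test plus a penalty-based contact force could produce the kind of position and force interaction information described above; all geometry, radii, and the stiffness constant below are invented for the example.

```python
import math

def sphere_collision(center_a, radius_a, center_b, radius_b):
    """Minimal stand-in for collision detection: test two bounding spheres and
    return (colliding, penetration_depth)."""
    distance = math.dist(center_a, center_b)
    penetration = (radius_a + radius_b) - distance
    return penetration > 0.0, max(penetration, 0.0)

def contact_force(penetration_depth, stiffness=500.0):
    """Toy penalty-based physical simulation: force proportional to penetration."""
    return stiffness * penetration_depth

# Invented geometry: a hand point near a robot link, both approximated by spheres.
hand_center, hand_radius = (0.30, 0.10, 1.20), 0.05
link_center, link_radius = (0.32, 0.10, 1.22), 0.06
colliding, depth = sphere_collision(hand_center, hand_radius, link_center, link_radius)
if colliding:
    print(f"contact detected, estimated force ~ {contact_force(depth):.1f} N")
```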

Description

Technical field

[0001] The present invention relates to the technical fields of augmented reality and robotics, and in particular to a virtual-real fusion human-machine collaborative simulation method and system.

Background technique

[0002] Robots, especially industrial robots, are important tools in the manufacturing industry. For safety reasons, robots used to be enclosed and work in isolation. However, it is no longer possible to rely on robots working alone to complete complex and detailed tasks; robots must cooperate with humans to meet more complex production requirements. Therefore, in recent years, human-machine collaborative operation has become a development trend in robot applications.

[0003] In research and experiments on man-machine collaboration technology, simulation is one of the important research means. At present, there are two main simulation methods in human-machine collaboration research. One is to use a virtual simulation enviro...

Claims


Application Information

IPC(8): G05B17/02
CPC: G05B17/02
Inventors: 高海涛, 朱松青, 关鸿耀, 韩亚丽, 许有熊, 黄树新
Owner: NANJING INST OF TECH