
Method for collecting imitation learning data by using virtual reality technology

A virtual reality and data collection technology, applied to the field of collecting imitation learning data with virtual reality, which addresses the difficulty of the demonstration stage, low efficiency, and the dependence of model performance on demonstration quality, and achieves the effects of improving model training efficiency and facilitating data collection.

Pending Publication Date: 2022-08-02
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

[0004] For an agent with complex behavior, however, it is difficult or even impossible to control it through the keyboard during the demonstration phase, so the quality of the demonstrations is poor.
Because model performance depends heavily on demonstration quality, imitation learning becomes infeasible for training complex agents; the desired effect can then only be reached through a large amount of training time, which is inefficient.


Examples


Embodiment 1

[0034] Referring to Figure 1, a method for collecting imitation learning data using virtual reality technology

[0035] includes the following steps:

[0036] Step 1: Acquire scene image data, and build a virtual scene in a 3D engine by imitating the real scene;

[0037] Step 2: Set up at least one operable virtual model object in the virtual scene to serve as an agent;

[0038] Step 3: According to the specific goal to be achieved, use the Unity plug-in ML-Agents to write the code for the agent's state input, reward setting and action output;

[0039] Step 4: Perform reinforcement learning training: configure the reinforcement learning training parameters, run the training, and check the results;

[0040] Step 5: Perform imitation learning training: configure the imitation learning training parameters, use virtual reality trackers to carry out human demonstrations, and complete the imitation learning data collection (a sketch of such a collection loop is given after these steps);

[0041] Step 6: Carry out imitation learning training o...
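Step 5 is where the virtual reality tracker replaces the keyboard as the demonstration device. The following is a minimal, hypothetical sketch of such a collection loop: each frame, the tracker pose of the human demonstrator is read, mapped to an agent action, paired with the current observation, and appended to a demonstration buffer that is saved to disk. All names here (read_tracker_pose, get_observation, pose_to_action, the output file) are illustrative assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of collecting imitation learning demonstrations from a VR
# tracker: each frame pairs the agent's observation with the action implied by
# the human demonstrator's tracker pose, and stores the pair for later training.
import json
import random

def read_tracker_pose():
    """Stub for a VR tracker query; a real system would return the controller's
    position and rotation as reported by the VR runtime."""
    return {"pos": [random.uniform(-1, 1) for _ in range(3)],
            "rot": [random.uniform(-1, 1) for _ in range(3)]}

def get_observation():
    """Stub for the agent's state input (e.g. positions of ball and racket in a
    tennis scene); a real system would read this from the 3D engine."""
    return [random.uniform(-1, 1) for _ in range(8)]

def pose_to_action(pose):
    """Map the tracker pose to the agent's action space (illustrative mapping)."""
    return pose["pos"] + pose["rot"]

demonstrations = []
for frame in range(1000):                          # one demonstration episode
    obs = get_observation()                        # state input
    action = pose_to_action(read_tracker_pose())   # human action via VR tracker
    demonstrations.append({"obs": obs, "action": action})

with open("demo_episode.json", "w") as f:          # hypothetical output file
    json.dump(demonstrations, f)
```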

Embodiment 2

[0045] Referring to Figure 2, on the basis of Embodiment 1, an example of building a tennis scene is used, which specifically includes the following steps:

[0046] Step 1: Obtain the tennis court scene image data, and build a tennis court virtual scene in a 3D engine by imitating the real scene;

[0047] Step 2: Set up two operable virtual model objects in the virtual tennis scene to serve as agents, and share the policy model parameters according to the agent rules;

[0048] Specifically, when the rules of multiple agents are the same, the same policy model is used; otherwise, different policy models are used (a sketch of this sharing rule follows).
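A minimal sketch of how this sharing rule might be organized, assuming a hypothetical Policy stand-in for the PPO policy network: agents whose rules match reuse one policy object (shared parameters), while agents with different rules get separate policies.

```python
# Hypothetical sketch of the policy-sharing rule: agents with identical rules
# reuse one policy (shared parameters); agents with different rules do not.
class Policy:
    """Stand-in for a PPO policy; a real implementation would hold network weights."""
    def __init__(self, rule_id):
        self.rule_id = rule_id

_policies = {}

def policy_for(rule_id):
    """Return the shared policy for this rule id, creating it on first use."""
    if rule_id not in _policies:
        _policies[rule_id] = Policy(rule_id)
    return _policies[rule_id]

# The two tennis agents follow the same rules, so they share one policy object:
assert policy_for("tennis_player") is policy_for("tennis_player")
```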

[0049] It should be noted that the policy model adopts the Proximal Policy Optimization (PPO) algorithm, where:

[0050] The ratio of the action probability under the current policy to the action probability under the previous policy is used to constrain the objective function, ensuring that large policy updates do not occur.

[0051] Clipped...
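For reference, the standard clipped surrogate objective of PPO has the following form. The patent text is truncated at this point, so this is the usual textbook formulation rather than a quotation of the patent:

$$r_t(\theta) = \frac{\pi_\theta(a_t \mid s_t)}{\pi_{\theta_{\text{old}}}(a_t \mid s_t)}, \qquad L^{\text{CLIP}}(\theta) = \mathbb{E}_t\!\left[\min\!\big(r_t(\theta)\,\hat{A}_t,\ \operatorname{clip}(r_t(\theta),\, 1-\epsilon,\, 1+\epsilon)\,\hat{A}_t\big)\right]$$

where $\hat{A}_t$ is the advantage estimate and $\epsilon$ is the clipping range. Taking the minimum with the clipped ratio is exactly the mechanism described in [0050]: it prevents the new policy from moving too far from the previous one in a single update.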


Abstract

The invention discloses a method for collecting imitation learning data using virtual reality technology, belonging to the technical field of virtual reality, and comprising the following steps: 1, acquiring scene image data and constructing a virtual scene in a three-dimensional engine by imitating the real scene; 2, setting at least one operable virtual model object in the virtual scene, in equal proportion, to serve as an agent. Compared with the traditional method of manual demonstration with a keyboard, combining imitation learning with virtual reality technology provides a feasible scheme for training agents with complex behavior, facilitates the collection of imitation learning data, and improves model training efficiency.

Description

technical field

[0001] The invention relates to the technical field of virtual reality, and in particular to a method for collecting imitation learning data using virtual reality technology.

Background technique

[0002] In recent years, with continuous breakthroughs in artificial intelligence and the maturing of related algorithms, AI agents have gradually penetrated various fields and shown good application results. Unity Machine Learning Agents (ML-Agents) is an open-source Unity plugin that allows users to train agents in game and simulation environments using reinforcement learning, imitation learning, neuroevolution or other machine learning methods, controlling and training the agents through a simple and easy-to-use Python API.

[0003] Reinforcement learning maximizes returns by interacting with the environment and can train agents far beyond human level, but the training time is o...
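Since the background cites ML-Agents' Python API, here is a minimal sketch of how an agent's observations and actions can be exchanged with a Unity scene from Python using the mlagents_envs package. It assumes a Unity build of the scene (the file name below is a placeholder) and an ML-Agents release whose ActionSpec exposes continuous_size; the random actions simply stand in for a trained policy.

```python
# Minimal sketch of driving a Unity ML-Agents scene from Python via mlagents_envs.
# The build path is a placeholder; exact attribute names may differ slightly
# between ML-Agents releases.
import numpy as np
from mlagents_envs.environment import UnityEnvironment
from mlagents_envs.base_env import ActionTuple

env = UnityEnvironment(file_name="TennisScene")   # placeholder Unity build
env.reset()

behavior_name = list(env.behavior_specs)[0]       # the agent's behavior
spec = env.behavior_specs[behavior_name]

for _ in range(100):
    decision_steps, terminal_steps = env.get_steps(behavior_name)
    obs = decision_steps.obs[0]                    # state input from Unity
    # Random continuous actions stand in for a trained PPO policy.
    actions = ActionTuple(
        continuous=np.random.uniform(
            -1, 1, size=(len(decision_steps), spec.action_spec.continuous_size)))
    env.set_actions(behavior_name, actions)        # action output back to Unity
    env.step()

env.close()
```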

Claims


Application Information

IPC(8): G06N20/00, G06T19/00
CPC: G06N20/00, G06T19/006
Inventors: 王春鹏, 石翔慧, 盖新宇, 张岩
Owner: SHANDONG UNIV