
Visual movement control method for transforming simulation result to real world

A motion control technique for transferring simulation results to the real world, with applications in the field of visual control

Inactive Publication Date: 2017-12-08
SHENZHEN WEITESHI TECH

AI Technical Summary

Problems solved by technology

[0006] To solve the problem of visual-feedback motion control in single or complex environments, the purpose of the present invention is to provide a visual motion control method that converts simulation results to the real world, and to propose a new framework for converting simulation results to real-world data.

Method used


Image

  • Visual movement control method for transforming simulation result to real world


Embodiment Construction

[0040] It should be noted that, where no conflict arises, the embodiments of the present application and the features within those embodiments may be combined with each other. The present invention is further described in detail below in conjunction with the drawings and specific embodiments.

[0041] Figure 1 is a system flowchart of the visual motion control method for converting simulation results to the real world according to the present invention. It mainly includes data generation and training methods.

[0042] Among them, data generation creates an end-to-end controller for the visual motion control scheme using a multi-stage task method, without depending on real-world data. Specifically: 1) generate several shortest operation paths during the simulation process; 2) use these path data to train the machine velocity; 3) use a PID controller to map the machine velocity from 2) to machine torque; 4) use the domain randomization method ...
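The velocity-to-torque mapping in step 3) can be illustrated with a minimal PID controller sketch. The gains, time step, and the one-joint toy dynamics below are illustrative assumptions, not values from the patent:

```python
class PIDController:
    """Minimal PID sketch: maps a joint-velocity error to a commanded
    joint torque. All gains are illustrative placeholders."""

    def __init__(self, kp=2.0, ki=0.1, kd=0.05, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def torque(self, target_velocity, measured_velocity):
        error = target_velocity - measured_velocity
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Usage: drive a toy first-order joint model toward a 1.0 rad/s target.
pid = PIDController()
velocity = 0.0
for _ in range(500):
    tau = pid.torque(target_velocity=1.0, measured_velocity=velocity)
    velocity += 0.01 * tau  # toy joint dynamics: velocity integrates torque
```

After 500 steps the simulated velocity settles close to the 1.0 rad/s target, which is the behavior the controller in step 3) would provide for each joint.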



Abstract

The invention presents a visual movement control method for transforming simulation results to the real world. Its main contents are data generation and training methods. The method proceeds as follows: first, a series of linear paths is generated for training using inverse kinematics; next, the image color, position and other information produced during training are sampled according to a prescribed distribution; artificial background noise (impurities) is then added so the data better approximate the real world; finally, visual movement control is trained by means of a convolutional network and a long short-term memory network. The method overcomes the difficulty of collecting real-world data at scale, provides a path generation method based on Cartesian space, and improves both the accuracy of visual movement control and the scalability of its training.
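As a rough illustration of the abstract's first steps, the sketch below generates a straight-line Cartesian path (the inverse-kinematics conversion of each waypoint to joint angles is omitted) and injects artificial background noise into a synthetic image, a simple stand-in for the patent's noise-augmentation step. All function names and parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_cartesian_path(start, goal, n_steps=20):
    """Waypoints on the straight line from start to goal in Cartesian
    space; each waypoint would then be passed to an IK solver."""
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    alphas = np.linspace(0.0, 1.0, n_steps)
    return start + alphas[:, None] * (goal - start)

def add_background_noise(image, noise_std=10.0, n_blobs=5):
    """Add pixel noise and random bright 'impurity' patches so a
    synthetic image better approximates real-world clutter."""
    noisy = image + rng.normal(0.0, noise_std, image.shape)
    h, w = image.shape[:2]
    for _ in range(n_blobs):
        y, x = rng.integers(0, h - 4), rng.integers(0, w - 4)
        noisy[y:y + 4, x:x + 4] += rng.uniform(50, 100)
    return np.clip(noisy, 0, 255)

path = linear_cartesian_path([0.1, 0.0, 0.2], [0.4, 0.3, 0.2], n_steps=5)
print(path.shape)       # (5, 3)
synthetic = np.zeros((64, 64))
augmented = add_background_noise(synthetic)
print(augmented.shape)  # (64, 64)
```

The augmented images, paired with the path waypoints, would then form the training set for the convolutional and LSTM networks described in the abstract.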

Description

Technical field [0001] The invention relates to the field of visual control, and in particular to a visual motion control method for converting simulation results to the real world. Background technique [0002] Vision-based manipulator motion control is an important method that uses visual information to implement feedback control of manipulator motion, spanning machine vision, image processing, robot dynamics, control theory and other research fields. Meanwhile, the rise of deep learning in recent years, and of convolutional neural network methods in particular, has brought great convenience to feature extraction and content analysis, and traditional methods of hand-crafted image-feature feedback and content recognition are being replaced. However, since one of the foundations required for training neural networks is massive training data, and the amount of such manipulator motion control data is extremely limited in reality, it is necessary to generate a...

Claims


Application Information

Patent Timeline
No application data available
IPC(8): B25J9/16
CPC: B25J9/1671
Inventor: 夏春秋
Owner: SHENZHEN WEITESHI TECH