Viewpoint invariant visual servoing of robot end effector using recurrent neural network

A technology relating to end effectors and recurrent neural network models, applied in the fields of biological neural network models, probabilistic networks, neural architectures, and the like. It can solve problems such as failure under viewpoint changes, time-consuming data collection, and robot wear and tear, and can achieve the effect of improved robustness and/or accuracy.

Active Publication Date: 2020-02-07
GOOGLE LLC

AI Technical Summary

Problems solved by technology

Although such machine learning models are suitable for robots that capture images from the same or similar viewpoints, they may be inaccurate and/or fail when used with robots that capture images from different viewpoints.
[0004] As another example of a shortcoming, various approaches rely heavily or exclusively on training examples generated from data collected on real-world physical robots, which requires extensive use of physical robots to attempt robotic grasps or other manipulations.
This can be time-consuming (e.g., performing a large number of grasp attempts takes significant time), can be resource-intensive (e.g., the electricity required to operate the robots), can cause wear and tear on the robots being used, and/or can require substantial human intervention (e.g., placing objects to be grasped, correcting error conditions).



Examples


Embodiment Construction

[0034] The implementations described herein train and utilize a recurrent neural network model that can be used, at each time step, to: process a query image of a target object, an image of the current scene that includes the target object and the robot's end effector, and a previous action prediction; and generate, based on that processing, a predicted action indicating how the end effector should be controlled to move it toward the target object. The recurrent neural network model can be viewpoint invariant in that it can be used across a variety of robots having vision components at a variety of viewpoints, and/or can be used on a single robot even when the viewpoint of the robot's vision component changes dramatically. Furthermore, the recurrent neural network model can be trained on a large quantity of simulated data generated by one or more simulators performing simulated episodes in view of the recurrent neural network model. One or more portions of the recurrent neural network model can be further trained on a relatively smaller quantity of real training data.
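To make the per-time-step interface concrete, below is a minimal, hypothetical PyTorch sketch of such a recurrent servoing policy. The class name, layer sizes, and framework choice are illustrative assumptions, not the architecture disclosed in the patent; only the inputs (query image, current scene image, previous action) and outputs (a predicted end-effector action plus carried-over recurrent state) follow the description above.

```python
# Hypothetical sketch only -- layer sizes, names, and framework (PyTorch)
# are illustrative assumptions, not the patent's disclosed architecture.
import torch
import torch.nn as nn


class RecurrentServoPolicy(nn.Module):
    """Per-time-step policy: (query image, scene image, previous action) -> action."""

    def __init__(self, feat_dim: int = 256, action_dim: int = 4):
        super().__init__()
        # Shared convolutional encoder for the query image and the scene image.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # Recurrent core: the carried state lets the policy infer the unknown
        # viewpoint from how the scene responds to its own previous actions.
        self.rnn = nn.LSTMCell(2 * feat_dim + action_dim, feat_dim)
        self.action_head = nn.Linear(feat_dim, action_dim)

    def step(self, query_img, scene_img, prev_action, state=None):
        q = self.encoder(query_img)    # embedding of the target object
        s = self.encoder(scene_img)    # embedding of the current scene
        h, c = self.rnn(torch.cat([q, s, prev_action], dim=-1), state)
        return self.action_head(h), (h, c)  # predicted action, new state
```

At run time, the caller would invoke step() once per control cycle, feeding the predicted action back in as prev_action on the next cycle; the carried LSTM state is what allows the policy to adapt to an unknown or changing viewpoint from the history of observations and actions.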


Abstract

Training and/or using a recurrent neural network model for visual servoing of an end effector of a robot. In visual servoing, the model can be utilized to generate, at each of a plurality of time steps, an action prediction that represents a prediction of how the end effector should be moved to cause the end effector to move toward a target object. The model can be viewpoint invariant in that it can be utilized across a variety of robots having vision components at a variety of viewpoints and/or can be utilized for a single robot even when a viewpoint, of a vision component of the robot, is drastically altered. Moreover, the model can be trained based on a large quantity of simulated data that is based on simulator(s) performing simulated episode(s) in view of the model. One or more portions of the model can be further trained based on a relatively smaller quantity of real training data.
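As a rough illustration of the two-phase training regime the abstract describes (large-scale simulated episodes followed by limited real-data fine-tuning), here is a hypothetical sketch. It reuses the RecurrentServoPolicy sketch above; simulator.rollouts(...) and real_dataset are invented placeholders, and supervising with per-step target actions via a squared-error loss, as well as the choice of which portions to freeze during fine-tuning, are assumptions rather than details taken from the patent.

```python
# Illustrative two-phase training outline; see hedging note above.
import torch
import torch.nn.functional as F

policy = RecurrentServoPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)


def train_on_episodes(episodes):
    # Each episode: one query image plus per-step scenes, previous actions,
    # and target actions; the loss is summed over the episode's time steps.
    for query_img, scenes, prev_actions, targets in episodes:
        state, loss = None, 0.0
        for scene, prev_a, target in zip(scenes, prev_actions, targets):
            pred, state = policy.step(query_img, scene, prev_a, state)
            loss = loss + F.mse_loss(pred, target)
        opt.zero_grad()
        loss.backward()
        opt.step()


# Phase 1: a large quantity of simulated episodes, generated with the
# current model in the loop ("in view of the model").
train_on_episodes(simulator.rollouts(policy, n_episodes=100_000))

# Phase 2: further train only selected portions on a relatively small
# amount of real data (freezing the encoder here is an arbitrary choice).
for p in policy.encoder.parameters():
    p.requires_grad = False
train_on_episodes(real_dataset)
```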

Description

Background technique

[0001] Many robots are configured to utilize one or more end effectors to perform one or more robotic tasks, such as grasping and/or other manipulation tasks. For example, a robot may utilize a grasping end effector, such as an "impactive" gripper or an "ingressive" gripper (e.g., one that physically penetrates an object using pins, needles, etc.), to pick up an object from a first location, move the object to a second location, and drop the object at the second location. Some other examples of robotic end effectors that can grasp objects include "astrictive" end effectors (e.g., using suction or vacuum to pick up objects) and "contigutive" end effectors (e.g., using surface tension, freezing, or adhesives to pick up objects).

[0002] Various machine learning-based approaches have been proposed for robotic manipulation tasks such as grasping. Some of these approaches train a machine learning model (e.g., a feed-forward deep neural network) to generate one or more predictions…


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16; G05B13/02; G06N3/08
CPC: B25J9/1697; G05B13/027; G05B2219/33056; G05B2219/39391; G05B2219/40499; G05B2219/42152; G06N3/008; G06N3/084; G06N7/01; G06N3/044; G06N3/045; B25J9/163
Inventor: A. Toshev, F. Sadeghi, S. Levine
Owner: GOOGLE LLC