
Efficient robotic control based on input from remote client device

A technology relating to remote clients and robots, applied in the field of efficient robot control based on input from remote client devices. It addresses problems such as limits on how efficiently robots can operate, and achieves the effects of reducing delays and conserving network resources.

Pending Publication Date: 2022-03-18
GOOGLE LLC

AI Technical Summary

Problems solved by technology

This limits the ability of robotic manipulation systems to operate more efficiently.




Embodiment Construction

[0037] Figure 1A illustrates an example environment in which implementations described herein may be practiced. Figure 1A includes a first robot 170A and associated robot vision components 174A, a second robot 170B and associated robot vision components 174B, and additional vision components 194. The additional vision components 194 can be, for example, monocular vision cameras (e.g., generating 2D RGB images), stereo vision cameras (e.g., generating 2.5D RGB images), or laser scanners (e.g., generating 2.5D "point clouds"), and can be operably connected to one or more systems disclosed herein (e.g., system 110). Optionally, multiple additional vision components may be provided, with the vision data from each vision component utilized as described herein. Robot vision components 174A and 174B may be, for example, monocular vision cameras, stereo vision cameras, laser scanners, and/or other vision components, and vision data from them may be provided to and used by respective robots 170A and 17...



Abstract

User interface input from a remote client device is utilized when controlling a robot in an environment. Some implementations relate to generating training instances based on object manipulation parameters defined by instances of user interface input, and training a machine learning model to predict those object manipulation parameters. Those implementations can then utilize the trained machine learning model to reduce the number of instances in which input is requested from the remote client device, and/or to reduce the extent of input requested from the remote client device, when performing a given set of robotic operations. Implementations additionally or alternatively relate to reducing idle time of a robot by utilizing vision data that captures an object to be manipulated by the robot before the object is transported into a robot workspace within which the robot can reach and manipulate the object.
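The workflow described in the abstract can be sketched as a confidence-gated control loop: consult the trained model first, fall back to the remote client when the model is unsure, and record each remote interaction as a new training instance. Everything below (the names `ManipulationParams` and `control_step`, the threshold value, and the placeholder model and client functions) is a hypothetical illustration under assumed interfaces, not the patent's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class ManipulationParams:
    """Hypothetical object manipulation parameters (illustrative only)."""
    grasp_x: float
    grasp_y: float
    grasp_angle_deg: float


# Assumed tunable threshold; the source does not specify a value.
CONFIDENCE_THRESHOLD = 0.8


def trained_model(vision_data: dict) -> tuple[ManipulationParams, float]:
    # Stand-in for the trained machine learning model: maps vision data
    # to predicted manipulation parameters plus a confidence score.
    confidence = vision_data.get("model_confidence", 0.0)
    params = ManipulationParams(vision_data["cx"], vision_data["cy"], 0.0)
    return params, confidence


def request_remote_input(vision_data: dict) -> ManipulationParams:
    # Stand-in for soliciting user interface input from the remote
    # client device (a network round trip in a real system).
    return ManipulationParams(vision_data["cx"], vision_data["cy"], 90.0)


def control_step(
    vision_data: dict,
    training_instances: list,
) -> tuple[ManipulationParams, str]:
    """Act autonomously when the model is confident; otherwise ask the
    remote client and record a new training instance for later training."""
    params, confidence = trained_model(vision_data)
    if confidence >= CONFIDENCE_THRESHOLD:
        return params, "autonomous"  # no network round trip needed
    params = request_remote_input(vision_data)
    training_instances.append((vision_data, params))
    return params, "remote"
```

As the model improves and its confidence rises, more steps resolve autonomously, which is one way to read the claimed effects of reduced delay and conserved network resources.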

Description

Background

[0001] In industrial or commercial settings, robots are often preprogrammed to perform specific tasks repeatedly. For example, a robot may be preprogrammed to repeatedly apply fasteners to specific components on an assembly line. As another example, a robot may be preprogrammed to repeatedly grasp a particular assembly part and move it from a fixed first location to a fixed second location. When grasping an object, the robot can use grasping end effectors such as "impactive" end effectors (e.g., applying force to areas of the object using "claws" or other digits), "ingressive" end effectors (e.g., physically penetrating the object using pins, needles, etc.), "astrictive" end effectors (e.g., using suction or vacuum to pick up the object), and/or "contigutive" end effectors (e.g., using surface tension, freezing, or adhesive to pick up objects).

[0002] Such an approach works well in environments where constrained actions are repeatedly performed on constrained sets of comp...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): B25J9/16
CPC: B25J9/163; B25J9/1689; B25J9/1697; B25J9/1612; G05B2219/36422; G05B2219/39536
Inventors: Johnny Lee, Stefan Welker
Owner: GOOGLE LLC