
Affordance-aware, multi-resolution, free-form object manipulation planning

An object manipulation and planning technology, applied in character and pattern recognition, program control, and instrumentation, addressing problems such as reduced reliability and confidence of robots as job complexity grows, difficulty analyzing dynamic environments, and difficult interaction with moving, irregularly shaped objects.

Pending Publication Date: 2021-12-28
INTEL CORP


Problems solved by technology

As job complexity and environmental variability increase, robots can face increasing difficulty performing jobs reliably and with confidence. Additionally, dynamic environments can be difficult for robots to analyze. For example, objects may move and have irregular shapes, making interaction difficult.



Examples


Example 1

[0169] Example 1 includes a computing system comprising one or more sensors to generate sensor data including image data, a processor coupled to the one or more sensors, and a memory comprising a set of executable program instructions which, when executed by the processor, cause the computing system to: generate a semantically annotated image based on the image data from the sensor data, wherein the semantically annotated image is to identify a shape of an object and a semantic label of the object; associate a first set of motions with the object; and generate a plan based on an intersection of the first set of motions and a second set of motions to satisfy a command from a user through actuation of one or more end effectors, wherein the second set of motions is to be associated with the command.
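The planning step of Example 1 can be illustrated with a minimal sketch: the feasible plan is drawn from the intersection of the actions the recognized object affords and the actions the user's command implies. All names, labels, and action sets below are invented for illustration and are not from the patent.

```python
# Hypothetical sketch of Example 1's intersection-based planning.
# AFFORDANCES and COMMAND_ACTIONS are illustrative lookup tables only.

AFFORDANCES = {            # first set: actions associated with a semantic label
    "mug": {"grasp", "lift", "pour", "push"},
    "knife": {"grasp", "lift", "cut", "push"},
}

COMMAND_ACTIONS = {        # second set: actions associated with a user command
    "pour me water": {"grasp", "lift", "pour"},
    "move it aside": {"push"},
}

def generate_plan(semantic_label: str, command: str) -> set:
    """Return the feasible action set: afforded actions ∩ commanded actions."""
    first = AFFORDANCES.get(semantic_label, set())
    second = COMMAND_ACTIONS.get(command, set())
    return first & second

print(sorted(generate_plan("mug", "pour me water")))  # ['grasp', 'lift', 'pour']
```

The intersection guarantees that every action handed to the end effectors is both physically afforded by the object and relevant to the command.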

Example 2

[0170] Example 2 includes the computing system of Example 1, wherein the instructions, when executed, further cause the computing system to: apply a first label to a first subsection of the object, and apply a second label to a second subsection of the object, wherein the second label is different from the first label.
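Example 2's part-level labeling can be sketched as follows, assuming a mug whose handle and body afford different interactions; the object, part names, and labels are hypothetical and only demonstrate that distinct subsections carry distinct semantic labels.

```python
# Hypothetical illustration of Example 2: subsections of one object
# receive different semantic labels, enabling part-specific affordances.

def label_subsections(object_name, part_labels):
    """Attach a semantic label to each named subsection of an object."""
    return {f"{object_name}/{part}": label for part, label in part_labels.items()}

labels = label_subsections("mug", {"handle": "graspable", "body": "fillable"})
print(labels["mug/handle"], labels["mug/body"])  # graspable fillable
```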

Example 3

[0171] Example 3 includes the computing system of Example 1, wherein the instructions, when executed, further cause the computing system to: generate a surface patch from the semantically annotated image, the surface patch representing the object; reduce a resolution of the surface patch; and generate the plan based on the reduced-resolution surface patch.
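The multi-resolution step of Example 3 can be illustrated with a simple grid subsampling sketch: a surface patch, represented here as a 2-D grid of 3-D points, is reduced in resolution before planning. The representation and the subsampling strategy are assumptions for this sketch, not the patent's method.

```python
# Hypothetical sketch of Example 3: downsample a surface patch
# (a 2-D grid of 3-D points) by keeping every `step`-th row and column.

def reduce_resolution(patch, step=2):
    """Return a coarser patch containing every `step`-th sample per axis."""
    return [row[::step] for row in patch[::step]]

# 8x8 patch of (x, y, z) samples on a flat surface
patch = [[(x, y, 0.0) for x in range(8)] for y in range(8)]
coarse = reduce_resolution(patch, step=2)
print(len(coarse), len(coarse[0]))  # 4 4
```

Planning over the coarser patch reduces the search space while preserving the patch's overall geometry.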


Abstract

The invention relates to the affordance-aware, multi-resolution, free-form object manipulation planning. Systems, apparatuses and methods may provide for controlling one or more end effectors by generating a semantic labelled image based on image data, wherein the semantic labelled image is to identify a shape of an object and a semantic label of the object, associating a first set of actions with the object, and generating a plan based on an intersection of the first set of actions and a second set of actions to satisfy a command from a user through actuation of one or more end effectors, wherein the second set of actions are to be associated with the command.

Description

Technical Field

[0001] Embodiments generally relate to end effectors. In particular, embodiments relate to the control of end effectors of robots in dynamic environments.

Background

[0002] Robots are able to perform tasks autonomously to accomplish certain goals. For example, a human can instruct a robot to perform a job, and the robot can then perform the job without supervision. As job complexity and environmental variability increase, robots may face increasing difficulty performing jobs reliably and with confidence. Furthermore, dynamic environments can be difficult for robots to analyze. For example, objects may move and have irregular shapes, making interaction difficult.

Contents of the Invention

[0003] An aspect of the present disclosure provides a computing system for enhanced object manipulation planning. The computing system includes: one or more sensors for generating sensor data, the sensor data in...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16; B25J13/08; B25J15/08; G06K9/00; G06K9/34; G06K9/62; G06N3/04; G06N3/08
CPC: B25J9/1679; B25J9/1661; B25J9/1697; B25J13/087; B25J13/088; B25J15/08; G06N3/04; G06N3/08; G06F18/24; B25J9/1612; B25J19/023; G05B2219/39484
Inventor: David Israel Gonzalez Aguirre; Javier Felip Leon; Javier Sebastian Turek; Javier Perez-Ramirez; Ignacio J. Alvarez
Owner: INTEL CORP