
Target attitude deep learning tracking acquisition method, learning system and storage medium

A technology combining target posture and deep learning, applied in the field of robotics. It addresses the problems of heavy computation, low intelligence, and low motion-restoration fidelity in the prior art, and achieves high imitation fidelity, high intelligence, and a reduced need for manual participation.

Active Publication Date: 2021-04-16
SHENZHEN YUEJIANG TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a deep learning tracking acquisition method based on target posture, a learning system and a storage medium, aiming to solve the problems that arise when prior-art robots imitate human teaching actions: low motion-restoration fidelity, a large amount of computation for point sampling, heavy manual involvement, and a low degree of intelligence.

Method used




Embodiment Construction

[0011] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0012] In describing the present invention, it should be understood that terms such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" indicate orientations or positional relationships based on those shown in the drawings. They are used only for convenience in describing the present invention and simplifying the description, and do not indicate or imply that a referenced device or element must have a particular orientation, be constructed, and operate in a ...



Abstract

The present invention relates to the technical field of robots, and discloses a target-posture-based deep learning tracking acquisition method, a learning system and a storage medium for controlling a robot to learn teaching actions. The deep learning tracking acquisition method based on a target posture includes the following steps: selecting a plurality of reference points of the teaching action, tracking the movement of each reference point and recording the teaching tracking data; analyzing each item of teaching tracking data and, from the relationship between movement and time, fitting at least two functions: an attitude function describing how the target's posture changes over time, and a displacement function describing how the target's position changes over time; and generating a control program so that the robot can reproduce the teaching action according to the attitude function and the displacement function. By collecting the teaching action of the target and generating a driver program from it, the invention reduces the need for manual participation and offers the advantages of high intelligence and high imitation fidelity.
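The pipeline in the abstract (track reference points over time, fit an attitude function and a displacement function, then evaluate them to drive the robot) can be sketched as follows. This is a minimal illustration only: the function name `fit_teaching_trajectory`, the use of polynomial fitting, and the single-angle attitude representation are assumptions for the sketch, not the patent's actual implementation.

```python
import numpy as np

def fit_teaching_trajectory(times, positions, angles, deg=3):
    """Fit a displacement function (position vs. time) and an attitude
    function (orientation vs. time) to recorded teaching tracking data.

    times:     (N,) sample timestamps
    positions: (N, 3) tracked reference-point positions (x, y, z)
    angles:    (N,) tracked orientation angle of the target (radians);
               a full implementation would use quaternions or rotation
               matrices instead of a single angle.

    Returns two callables: displacement(t) -> (3,) and attitude(t) -> float.
    """
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)
    # One polynomial per Cartesian axis for the displacement function.
    pos_coeffs = [np.polyfit(times, positions[:, k], deg) for k in range(3)]
    # A single polynomial for the attitude function.
    att_coeffs = np.polyfit(times, np.asarray(angles, dtype=float), deg)

    def displacement(t):
        return np.array([np.polyval(c, t) for c in pos_coeffs])

    def attitude(t):
        return np.polyval(att_coeffs, t)

    return displacement, attitude

# Example: a straight-line teaching motion with a slow rotation.
t = np.linspace(0.0, 1.0, 20)
pos = np.stack([t, 2 * t, np.zeros_like(t)], axis=1)
ang = 0.5 * t
disp, att = fit_teaching_trajectory(t, pos, ang)
```

A generated control program would then sample `disp(t)` and `att(t)` at the robot controller's cycle rate to replay the taught motion.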

Description

technical field

[0001] The present invention relates to the technical field of robots, and in particular to a tracking acquisition method, a learning system and a storage medium based on deep learning of a target posture.

Background technique

[0002] A robot is a high-tech product with a preset or rule-based program inside. After receiving a signal or instruction, it can make judgments and take actions to a certain extent, such as moving, picking up objects, and swinging limbs. The task of the robot is mainly to assist or even replace human work in some settings. However, the actions and information judgments involved in real work scenes are often very complicated, and it is difficult to record them all in the robot in advance in the form of a program. Enabling robots to acquire knowledge and learn by themselves to improve their adaptability and intelligence, that is, robot learning, has therefore become a very hot research focus in the robot industry. [0003] In the prior art, the process of the robot simulating t...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T 7/246; G06K 9/00; B25J 9/16; G06N 20/00
CPC: G06T 7/246; B25J 9/163; B25J 9/1656; G06T 2207/20081; G06V 40/28
Inventors: 刘培超, 刘主福, 郎需林
Owner: SHENZHEN YUEJIANG TECH CO LTD