Target posture-based deep learning tracking acquisition method, learning system and storage medium

A target-posture deep-learning technology, applied in the field of robotics, that addresses problems such as low intelligence, poor motion reproduction, and the large amount of computation required for point acquisition, achieving the effects of high intelligence, reduced manual involvement, and high-fidelity reproduction of the imitated motion.

Active Publication Date: 2019-04-26
SHENZHEN YUEJIANG TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a target-posture-based deep learning tracking acquisition method, a learning system and a storage medium, aiming to solve the problems in the prior art that, when a robot imitates human teaching actions, the degree of motion reproduction is low, the amount of calculation for point acquisition is large, manual participation is high, and the degree of intelligence is low.



Embodiment Construction

[0011] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0012] In describing the present invention, it should be understood that the orientations or positional relationships indicated by terms such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positional relationships shown in the drawings, are only for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operate in a ...



Abstract

The invention relates to the technical field of robots, and discloses a tracking and collection method based on target-posture deep learning, together with a learning system and a storage medium, used to control a robot to learn teaching actions. The method comprises the following steps: a plurality of reference points of the teaching action are selected, the movement of each reference point is tracked, and teaching tracking data is recorded; each set of teaching tracking data is analyzed and fitted, from the relationship between movement and time, into at least two functions: an attitude function describing the change of the target's attitude over time and a displacement function describing the change of the target's position over time; and a control program is generated, according to the attitude function and the displacement function, so that the robot can reproduce the teaching action. Because the driving program is generated by collecting the teaching action of the target, the degree of manual participation required is reduced, and the method has the advantages of a high degree of intelligence and a high degree of fidelity in imitation.
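The abstract's pipeline is: track reference points over time, fit an attitude function and a displacement function to the recorded data, then sample those functions to drive the robot. The patent does not specify a fitting form, so the sketch below makes assumptions: per-axis polynomial fits via NumPy, and attitude represented as three Euler angles. All function names here are illustrative, not from the patent.

```python
import numpy as np

def fit_teaching_trajectory(t, positions, attitudes, deg=3):
    """Fit the two functions the abstract describes, per axis:
    a displacement function p(t) over xyz positions, and an
    attitude function r(t) over orientation components
    (assumed here to be Euler angles). Returns two callables."""
    t = np.asarray(t, dtype=float)
    positions = np.asarray(positions, dtype=float)
    attitudes = np.asarray(attitudes, dtype=float)

    # One low-degree polynomial per coordinate axis.
    disp_coeffs = [np.polyfit(t, positions[:, i], deg) for i in range(3)]
    att_coeffs = [np.polyfit(t, attitudes[:, i], deg) for i in range(3)]

    def displacement(tq):
        # Evaluate the fitted position at query time tq.
        return np.array([np.polyval(c, tq) for c in disp_coeffs])

    def attitude(tq):
        # Evaluate the fitted orientation at query time tq.
        return np.array([np.polyval(c, tq) for c in att_coeffs])

    return displacement, attitude

def sample_waypoints(displacement, attitude, t0, t1, n=50):
    """Sample the fitted functions into dense (time, position,
    attitude) waypoints, the kind of data a generated control
    program could feed to a robot controller."""
    ts = np.linspace(t0, t1, n)
    return [(float(tt), displacement(tt), attitude(tt)) for tt in ts]
```

The design choice of fitting continuous functions rather than replaying raw samples is what the abstract credits for reducing manual work: noisy or sparse tracking data becomes a smooth, resampleable trajectory.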

Description

Technical field

[0001] The present invention relates to the technical field of robots, and in particular to a tracking acquisition method, a learning system and a storage medium based on deep learning of target posture.

Background technique

[0002] A robot is a high-tech product with a preset or principled program inside. After receiving a signal or instruction, it can make judgments and take actions to a certain extent, such as moving, picking up objects, and swinging limbs. The task of a robot is mainly to assist, or even replace, human work in some settings. However, the actions and judgments involved in real work scenarios are often very complicated, and it is difficult to record them all in the robot in advance in the form of a program. Enabling the robot to acquire knowledge and learn by itself, so as to improve its adaptability and level of intelligence, that is, robot learning, has therefore become a very active research focus in the robot industry.

[0003] In the prior art, the process of the robot simulating t...

Claims


Application Information

IPC(8): G06T7/246, G06K9/00, B25J9/16, G06N20/00
CPC: G06T7/246, B25J9/163, B25J9/1656, G06T2207/20081, G06V40/28
Inventors: 刘培超, 刘主福, 郎需林
Owner: SHENZHEN YUEJIANG TECH CO LTD