Robot motion posture visual estimation method

A robot motion and pose estimation technique, applied in instrumentation, computation, and image data processing. It addresses the susceptibility of existing robot pose estimation to interference, and achieves the effects of eliminating cumulative error and reducing sensor hardware cost.

Pending Publication Date: 2020-11-17
NANJING NORMAL UNIVERSITY
Cites: 0 | Cited by: 1

AI Technical Summary

Problems solved by technology

[0003] The present invention provides a visual estimation method for robot motion posture, which greatly reduces sensor hardware cost and eliminates cumulative error, and addresses the fact that existing methods rarely estimate robot motion posture from vision alone.




Embodiment Construction

[0050] In order to illustrate the technical solution of the present invention more clearly, the technical solution of the present invention is described in further detail below in conjunction with the accompanying drawings:

[0051] As shown in Figure 1, a robot motion posture visual estimation method comprises the following steps:

[0052] Step (1.1), collecting continuous video images in a dynamic environment, selecting two of the collected images, detecting key points, computing each key point's feature value (descriptor), and saving the detected feature point positions;
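Step (1.1) does not name a particular detector or descriptor; ORB or SIFT would be typical choices in practice. Purely as a self-contained illustration of "detecting key points and computing feature values," here is a minimal Harris-style corner detector in NumPy (the window size, `k`, and threshold are arbitrary assumptions, not values from the patent):

```python
import numpy as np

def harris_keypoints(img, k=0.04, thresh_ratio=0.1):
    """Minimal Harris corner detector: returns (row, col) positions whose
    corner response exceeds thresh_ratio * max response."""
    img = img.astype(float)
    iy, ix = np.gradient(img)                       # image gradients
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy       # gradient products
    def box(a):
        # 3x3 box filter as a cheap local sum (a Gaussian window is more common)
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))
    sxx, syy, sxy = box(ixx), box(iyy), box(ixy)
    r = sxx * syy - sxy * sxy - k * (sxx + syy) ** 2  # Harris response
    ys, xs = np.where(r > thresh_ratio * r.max())
    return np.stack([ys, xs], axis=1)
```

A real system would also compute a descriptor (e.g. a BRIEF or SIFT vector) at each detected position so that points can be matched between the two selected frames.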

[0053] Step (1.2), segmenting the original image, obtaining and saving the position of the pixel area where the dynamic object is located in the image;
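The patent excerpt does not specify how the segmentation in step (1.2) is performed; a semantic-segmentation network producing a per-pixel class-label map is one common choice. Assuming such a label map already exists, extracting the pixel region occupied by dynamic objects might look like this (the class ids are hypothetical and depend on the model used):

```python
import numpy as np

# Hypothetical class ids for movable objects (e.g. person, vehicle);
# the real ids depend on whichever segmentation model is used.
DYNAMIC_CLASSES = (11, 13)

def dynamic_object_mask(label_map, dynamic_classes=DYNAMIC_CLASSES):
    """Return a boolean mask that is True wherever a dynamic object's pixels lie."""
    return np.isin(label_map, dynamic_classes)
```

The boolean mask plays the role of the "saved segmentation result" that the next step compares against the feature-point image.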

[0054] Step (1.3), comparing the image storing the feature point information with the image storing the segmentation result, and removing the feature points distributed in the pixel area where the dynamic object is located from the image storing the feature point information;



Abstract

The invention discloses a robot motion posture visual estimation method. The method relates to the technical field of robot and machine vision, and comprises the steps of collecting continuous video images in a dynamic environment, selecting two of the collected images, detecting key points, calculating characteristic values of the key points, and detecting and storing the positions of the characteristic points; segmenting an original image, and obtaining and storing the position of a pixel region where a dynamic object is located in the image; comparing the image storing the feature point information with the image storing a segmentation result, and removing feature points distributed at the position of the pixel area where the dynamic object is located in the image storing the feature point information; and performing feature matching between the adjacent images by using the residual feature points after elimination and optimization, calculating pose motion of a robot between the adjacent images, and outputting robot motion pose estimation. The method has the advantages of being high in pose estimation accuracy, small in calculated amount and high in background environment interference resistance.
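The final stage of the abstract (matching the surviving feature points between adjacent frames and computing the inter-frame pose) is not spelled out in this excerpt. A classical building block for that computation is epipolar geometry; the sketch below estimates a fundamental matrix from matched points with the unnormalized eight-point algorithm, purely as an illustration (the patent may use a different solver):

```python
import numpy as np

def eight_point_fundamental(pts1, pts2):
    """Estimate the fundamental matrix F from >= 8 point correspondences via
    the (unnormalized, for brevity) eight-point algorithm, so that
    x2^T F x1 ~ 0 for matched homogeneous points x1, x2."""
    x1, y1 = pts1[:, 0], pts1[:, 1]
    x2, y2 = pts2[:, 0], pts2[:, 1]
    # Each correspondence contributes one row of the epipolar constraint
    A = np.stack([x2 * x1, x2 * y1, x2,
                  y2 * x1, y2 * y1, y2,
                  x1, y1, np.ones_like(x1)], axis=1)
    _, _, vt = np.linalg.svd(A)
    F = vt[-1].reshape(3, 3)          # null vector of A, reshaped row-major
    u, s, v = np.linalg.svd(F)
    s[2] = 0.0                        # enforce the rank-2 constraint on F
    return u @ np.diag(s) @ v
```

In a full pipeline one would typically wrap this in RANSAC, use known camera intrinsics to work with an essential matrix instead, and decompose it into the rotation and translation that constitute the robot's inter-frame pose.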

Description

Technical Field

[0001] The invention relates to the technical field of robots and machine vision, and in particular to a method for visually estimating robot motion posture.

Background Technique

[0002] Mobile robots need to estimate their own motion posture in the working environment; this is especially important for functions such as path planning and obstacle avoidance. Although a robot can obtain motion pose estimates with high-precision inertial sensors, the hardware cost is relatively high. At present, robot motion pose estimation is subject to many sources of interference, and little existing work estimates robot motion posture using visual sensors alone.

[0003] The present invention provides a method for visually estimating the motion posture of a robot, which greatly reduces sensor hardware cost and eliminates cumulative error, and addresses the fact that existing robot motion posture methods rarely use vision alone for estimation.

Claims


Application Information

IPC(8): G06T7/246; G06T7/215; G06T7/73
CPC: G06T2207/10016; G06T7/215; G06T7/248; G06T7/74
Inventor 吴俊谢非吴奕之梅一剑吴启宇卢毅曹湘玉何逸周钟文叶欣雨
Owner NANJING NORMAL UNIVERSITY