
Robot motion estimation method based on dense optical flow

A robot motion estimation technology based on dense optical flow, in the field of robot vision. It addresses the low precision, limited camera utility, and poor robustness of prior methods, and achieves improved robustness, improved precision, and a simplified coordinate conversion relationship.

Active Publication Date: 2014-04-30
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] Aiming at the problems in the prior art that motion estimation accuracy is low, robustness is poor under uneven illumination and motion blur, and the camera's role is limited, the present invention provides a robot motion speed estimation method based on dense optical flow. It introduces a dense optical flow algorithm based on polynomial expansion, which improves the accuracy of motion estimation; it applies the RANSAC algorithm to the optical flow purification process, which improves the robustness of the algorithm; and it mounts the camera lens obliquely downward, so that the robot can detect the situation ahead in real time, which facilitates other tasks such as obstacle avoidance and path planning.




Detailed Description of the Embodiments

[0029] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0030] The flow chart of the dense-optical-flow-based robot motion speed measurement method is shown in Figure 1; the method specifically includes the following steps.

[0031] Step 1, calibrate the internal and external parameters of the camera.

[0032] (1) Print a target template of size A2; the spacing of the feature points on the target is 30 mm.

[0033] (2) Shoot the target from multiple angles. When shooting, make the target fill the frame as much as possible and ensure that every corner of the target is within the frame; capture a total of 9 target images.

[0034] (3) Detect the feature points in the image, i.e., each black intersection point on the target.

[0035] (4) Establish the camera coordinate system, the robot coordinate system, and the world coordinate system, as shown in the attached Figure 2. The camera le...



Abstract

The invention relates to a robot motion estimation method based on dense optical flow. The method comprises the steps of calibrating the internal and external parameters of a camera, collecting image sequences and performing distortion correction, solving the dense optical flow by the polynomial expansion method, purifying the optical flow, transforming coordinates between image coordinates and robot coordinates, and estimating the motion speed of the robot. The method solves the prior-art problems that motion estimation precision is low, robustness is poor under uneven illumination and motion blur, and the function of the camera is restricted. The dense optical flow algorithm based on polynomial expansion improves the precision of motion estimation. Applying the RANSAC algorithm to the optical flow purification process improves the robustness of the algorithm. A monocular camera is used with its lens inclined downward, so the robot can detect the situation ahead in real time, which facilitates tasks such as obstacle avoidance and path planning.
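The optical flow purification step can be sketched as a RANSAC fit of a single dominant image translation, discarding flow vectors that disagree (e.g., from moving objects or flow errors under motion blur). The helper below is a hypothetical illustration of the RANSAC idea, not the patent's exact formulation:

```python
import numpy as np

def ransac_translation(flow_vecs, iters=200, tol=0.5, rng=None):
    """Fit one dominant 2D translation to flow vectors, rejecting outliers.

    Each hypothesis is a single sampled flow vector (1-point model); inliers
    are vectors within `tol` pixels of it. Hypothetical RANSAC sketch.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers = np.zeros(len(flow_vecs), bool)
    for _ in range(iters):
        cand = flow_vecs[rng.integers(len(flow_vecs))]
        inliers = np.linalg.norm(flow_vecs - cand, axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine: average the consensus set.
    return flow_vecs[best_inliers].mean(axis=0), best_inliers

# 300 inlier vectors near (2, -1) px/frame plus 60 gross outliers.
rng = np.random.default_rng(42)
inl = rng.normal([2.0, -1.0], 0.1, (300, 2))
out = rng.uniform(-10, 10, (60, 2))
vecs = np.vstack([inl, out])

est, mask = ransac_translation(vecs)
print(np.round(est, 1))   # close to [2, -1], the dominant motion
```

The purified mean flow, converted through the calibrated image-to-robot coordinate transform and divided by the frame interval, yields the robot's motion speed estimate.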

Description

Technical Field

[0001] The invention belongs to the field of robot vision and relates to a method for estimating the moving speed of a robot from visual information.

Background

[0002] Measurement of the motion state is very important in the self-localization, path planning, autonomous navigation, and other application fields of mobile robots. In recent years, vision-based methods have become a research hotspot for mobile robot motion estimation. Visual odometry is a method for estimating the relative motion of a robot from a sequence of images captured by a camera. Compared with traditional robot motion estimation methods, vision-based motion estimation has the advantages of simple structure and high precision. Moreover, for omnidirectional mobile robots, traditional odometry-based dead reckoning is difficult to achieve, so the advantages of vision-based methods are even more obvious.

[0003] The patent application No. 201110449925.0 discloses a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T7/20; G01P3/38
Inventors: 李秀智, 赵冠荣, 贾松敏, 秦宝岭
Owner: BEIJING UNIV OF TECH