
A monocular vision positioning method for underwater robots

A monocular vision technology for underwater robots, applied in the field of visual positioning, which can solve problems such as difficult implementation and complicated calculation.

Active Publication Date: 2020-06-16
NORTHWESTERN POLYTECHNICAL UNIV

Problems solved by technology

However, the method requires that the model of the robot be known precisely, which is difficult to achieve in most cases. Moreover, the method's calculation is complicated, making it unsuitable for systems with high real-time requirements.



Examples


Embodiment

[0149] The definitions of the coordinate system, of each parameter, and of the positive direction all refer to Figure 1 and Figure 2. It is known that PQ = 2.0 m and the pixel width of the camera is 800 pixels. Depending on the specific experimental conditions, these parameters may vary.

[0150] The fitting function of the magnification mag is:

[0151] mag = 1.013 + 1.105×10⁻² dd + 1.175×10⁻² |ψ| − 7.832×10⁻² dd² − 2.426×10⁻² dd|ψ|
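As a quick sanity check, the fitted polynomial can be evaluated in code; this is a minimal sketch (the function name and argument conventions are assumptions), with the inputs taken from the two embodiments that follow:

```python
def mag_fit(dd, psi_deg):
    """Fitted magnification model mag(dd, |psi|) from paragraph [0151].

    dd is the dimensionless proportion value; psi_deg is the heading
    angle in degrees. Only |psi| enters the model.
    """
    a = abs(psi_deg)
    return (1.013
            + 1.105e-2 * dd
            + 1.175e-2 * a
            - 7.832e-2 * dd ** 2
            - 2.426e-2 * dd * a)

# Values from the two embodiments:
print(round(mag_fit(0.411, 5.0), 3))   # 1.013 (Embodiment 1)
print(round(mag_fit(0.109, 10.0), 3))  # 1.104 (Embodiment 2)
```

Both printed values match the mag figures quoted in Step 1 of the respective embodiments, which suggests the coefficients above are transcribed consistently.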

Embodiment 1

[0153] Assume that in the image currently collected by the camera, P′ = 156 pixel, Q′ = 315 pixel, and the inertial device mounted on the robot measures the current heading angle ψ = 5°. The coordinates of the robot in the global coordinate system are calculated below.

[0154] Step 1: dd = 0.411; calculated according to the fitting function, mag = 1.013;

[0155] Step 2: calculate λ* = 80.55;

[0156] Step 3: calculate DC = 4.97, AC = 8.22;

[0157] Step 4: calculate AM = 6.97, CM = 4.36;

[0158] Step 5: calculate OC = 2.92, OM = 1.43;

[0159] Step 6: calculate OA = 7.12, θ = 11.61°;

[0160] Step 7: at this time, ψ > 0 and OC < CM, so θ must be modified;

[0161] Step 8: calculate the coordinates of the robot in the global coordinate system as x = -6.97, y = 1.43.

[0162] Error analysis: In this example, the real coordinates of the robot in the global coordinate system are (-7.0, 1.5), so the relative positioning error is (0.42%, 4.7%).

Embodiment 2

[0164] Assume that in the image currently collected by the camera, P′ = 251 pixel, Q′ = 462 pixel, and the inertial device mounted on the robot measures the current heading angle ψ = 10°. The coordinates of the robot in the global coordinate system are calculated below.

[0165] Step 1: dd = 0.109; calculated according to the fitting function, mag = 1.104;

[0166] Step 2: calculate λ* = 116.51;

[0167] Step 3: calculate DC = 3.43, AC = 5.62;

[0168] Step 4: calculate AM = 5.01, CM = 2.55;

[0169] Step 5: calculate OC = 3.06, OM = 0.51;

[0170] Step 6: calculate OA = 5.03, θ = 5.81°;

[0171] Step 7: at this time, ψ > 0 and OC > CM, so there is no need to modify θ;

[0172] Step 8: calculate the coordinates of the robot in the global coordinate system as x = -5.01, y = -0.51.

[0173] Error analysis: In this example, the real coordinates of the robot in the global coordinate system are (-5.0, -0.5), so the relative positioning error is (0.2%, 2%).
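The relative positioning errors quoted in the error analyses follow from a component-wise comparison of the estimated and ground-truth coordinates. A minimal sketch (the helper name is an assumption), checked against Embodiment 2:

```python
def relative_error_pct(estimated, true):
    """Component-wise relative positioning error, in percent."""
    return tuple(abs((e - t) / t) * 100.0 for e, t in zip(estimated, true))

# Embodiment 2: estimate (-5.01, -0.51) vs. ground truth (-5.0, -0.5)
ex, ey = relative_error_pct((-5.01, -0.51), (-5.0, -0.5))
print(round(ex, 1), round(ey, 1))  # 0.2 2.0
```

The output matches the (0.2%, 2%) figure quoted in paragraph [0173].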


Abstract

The invention discloses a monocular vision positioning method for underwater robots. The method first deploys a marker, then optimizes and calculates the magnification mag from collected images; repeated collection yields a set of sample data, from which a functional relationship between the magnification mag, the heading angle ψ, and the proportion value dd is established. In actual application, the magnification is calculated from the heading angle and from the proportion value dd derived from the collected images, and the coordinates of the robot are then computed by a geometric method. While maintaining high positioning accuracy, the method greatly reduces the complexity of monocular vision positioning and improves real-time performance; it can therefore be applied to systems without an accurate mathematical model, to systems with strict real-time requirements, and to low-cost hardware platforms with limited processing capability.
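The abstract's offline step — establishing mag as a function of ψ and dd from repeated sample collections — amounts to fitting the five basis terms of the embodiment's fitting function. A minimal least-squares sketch assuming NumPy (the function name and the synthetic sample generation are illustrative, not from the patent):

```python
import numpy as np

def fit_mag_coefficients(dd, psi, mag):
    """Least-squares fit of mag ~ [1, dd, |psi|, dd^2, dd*|psi|]."""
    dd, psi, mag = map(np.asarray, (dd, psi, mag))
    A = np.column_stack([np.ones_like(dd), dd, np.abs(psi),
                         dd ** 2, dd * np.abs(psi)])
    coef, *_ = np.linalg.lstsq(A, mag, rcond=None)
    return coef

# Synthetic check: noiseless samples generated from the embodiment's
# published coefficients should be recovered exactly.
rng = np.random.default_rng(0)
true = np.array([1.013, 1.105e-2, 1.175e-2, -7.832e-2, -2.426e-2])
dd = rng.uniform(0.0, 1.0, 50)
psi = rng.uniform(-15.0, 15.0, 50)
mag = (true[0] + true[1] * dd + true[2] * np.abs(psi)
       + true[3] * dd ** 2 + true[4] * dd * np.abs(psi))
coef = fit_mag_coefficients(dd, psi, mag)
print(np.allclose(coef, true, atol=1e-8))  # True
```

In practice the samples would come from images of the marker collected at known poses, and some noise-robust weighting or regularization might be warranted; the patent excerpt does not specify the fitting procedure.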

Description

Technical field [0001] The invention relates to the technical field of visual positioning, and specifically to a monocular vision positioning method for underwater robots: a simple method for monocular vision positioning in a two-dimensional plane, especially suitable for the docking and recovery of autonomous underwater vehicles (AUVs) and other underwater robots. Background technique [0002] Being able to determine its own position relative to a specific target is a necessary condition for an underwater robot to successfully perform its tasks. However, the complex underwater environment brings many difficulties to precise positioning of the robot. Since GPS signals cannot be used for positioning underwater, high-precision inertial navigation systems combined with underwater acoustic positioning systems are currently the commonly used positioning methods for underwater operations. The cost of this type of system is very high, and the underwater acoustic po...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Patents (China)
IPC(8): G01C21/00, G01C21/20, G01C11/00, G01S13/88
CPC: G01C11/00, G01C21/005, G01C21/20, G01S13/881
Inventors: 高剑, 赵新元, 严卫生, 崔荣鑫, 张福斌
Owner: NORTHWESTERN POLYTECHNICAL UNIV