
Recovery and positioning method of underwater robot based on binocular vision

This underwater robot and binocular vision technology is applied to instruments, line-of-sight measurement, image analysis, and related fields. It addresses problems such as image noise and color distortion that hinder underwater robot recovery, and achieves the effects of fast response, a wide system frequency band, and smooth output.

Active Publication Date: 2022-05-03
JIANGSU UNIV OF SCI & TECH
Cites: 7 · Cited by: 0

AI Technical Summary

Problems solved by technology

In recent years, underwater optical vision has achieved rich research results. However, owing to interference factors such as dim light and abundant suspended matter in the underwater environment, the collected images suffer from serious noise and color distortion. This has a great impact on target positioning and thus affects the tasks of underwater robots and their recovery.



Examples


Embodiment Construction

[0024] The present invention will be further elaborated below in conjunction with the accompanying drawings of the description.

[0025] As shown in Figures 1 to 6, the binocular-vision-based underwater robot recovery positioning method of the present invention specifically comprises the following steps:

[0026] Step 1: Use two underwater CCD cameras to shoot the calibration plate to obtain the parameters of the binocular camera, including internal and external parameter matrices, distortion coefficients, and rotation and translation matrices between the cameras;

[0027] Zhang Zhengyou's planar calibration method is applied to calibrate the basic parameters of the camera. First, a 7×10 black-and-white checkerboard calibration board is printed and several images of it are taken underwater from different angles; the feature points in the images are detected to solve for the internal and external parameters of the camera under the ideal distortion-free case, and maximum likelihood estimation is used to refine them...
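The calibration step above follows the standard Zhang-style pipeline, which can be sketched with OpenCV. This is a minimal illustration, not the patent's implementation: the function names, the 6×9 inner-corner count (assumed for a 7×10-square board), and the fixed-intrinsics stereo flag are all assumptions.

```python
import numpy as np

def make_object_points(inner_rows, inner_cols, square_size):
    """3D coordinates of the checkerboard's inner corners in the board
    frame (Z = 0), spaced by the physical square size."""
    objp = np.zeros((inner_rows * inner_cols, 3), np.float32)
    objp[:, :2] = (np.mgrid[0:inner_cols, 0:inner_rows]
                   .T.reshape(-1, 2) * square_size)
    return objp

def calibrate_stereo(left_images, right_images,
                     inner_rows, inner_cols, square_size):
    """Zhang-style stereo calibration. cv2 is imported lazily so the
    helper above remains usable without OpenCV installed."""
    import cv2
    objp = make_object_points(inner_rows, inner_cols, square_size)
    obj_pts, pts_l, pts_r = [], [], []
    for img_l, img_r in zip(left_images, right_images):
        ok_l, c_l = cv2.findChessboardCorners(img_l, (inner_cols, inner_rows))
        ok_r, c_r = cv2.findChessboardCorners(img_r, (inner_cols, inner_rows))
        if ok_l and ok_r:
            obj_pts.append(objp); pts_l.append(c_l); pts_r.append(c_r)
    size = left_images[0].shape[::-1]   # (width, height) of grayscale frames
    # Per-camera intrinsics and distortion (maximum-likelihood refinement
    # happens inside calibrateCamera), then the inter-camera R and T.
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, pts_l, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, pts_r, size, None, None)
    _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, pts_l, pts_r, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T
```

A 7×10-square board has 6×9 inner corners, so `make_object_points(6, 9, square_size)` would be the matching call here.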



Abstract

The invention provides a binocular-vision-based recovery positioning method for an underwater robot. Two underwater CCD cameras photograph a calibration plate to obtain the parameters of the binocular camera, including the internal and external parameter matrices, the distortion coefficients, and the rotation and translation matrices between the cameras. The visual image captured by the underwater binocular camera is acquired as the input image to be analyzed; the input image is grayscaled and binarized to determine the connected domains in the image; the light sources are matched and morphological processing is performed on the underwater image to obtain the coordinates of the center point of the final light source; and the relative position of the AUV and the docking dock is calculated. The method applies short-distance, high-precision binocular vision positioning to the autonomous docking process of underwater AUV recovery, and uses a centroid detection algorithm and a connected-domain detection algorithm in place of the Hough circle detection method to speed up the calculation of the relative position information between the AUV and the docking dock. This improves the real-time performance and stability of positioning and ensures the success rate of AUV docking.
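The abstract's core pipeline — binarize the image, find connected domains, and take each domain's centroid as a light-source center — can be sketched in pure Python/NumPy. This is a minimal illustration under assumed names and a fixed threshold; the patent's actual thresholds, light-source matching, and morphological processing are not reproduced.

```python
import numpy as np
from collections import deque

def binarize(gray, thresh=128):
    """Threshold a grayscale image into a 0/1 mask."""
    return (gray >= thresh).astype(np.uint8)

def connected_components(mask):
    """Label 4-connected foreground regions by BFS flood fill.
    Returns (label image, number of regions)."""
    labels = np.zeros(mask.shape, dtype=np.int32)
    h, w = mask.shape
    current = 0
    for y in range(h):
        for x in range(w):
            if mask[y, x] and labels[y, x] == 0:
                current += 1
                labels[y, x] = current
                q = deque([(y, x)])
                while q:
                    cy, cx = q.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            q.append((ny, nx))
    return labels, current

def centroids(labels, n):
    """Per-region centroid (x, y) — the light-source center estimate."""
    out = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        out.append((float(xs.mean()), float(ys.mean())))
    return out

# Toy example: two bright blobs in a dark 6x8 frame.
gray = np.zeros((6, 8), dtype=np.uint8)
gray[1:3, 1:3] = 255
gray[3:5, 5:8] = 255
labels, n = connected_components(binarize(gray))
print(n, centroids(labels, n))  # → 2 [(1.5, 1.5), (6.0, 3.5)]
```

Centroid-over-connected-domain is cheaper than Hough circle detection because it is a single linear pass over the pixels, which is the real-time advantage the abstract claims.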

Description

technical field [0001] The invention relates to a binocular-vision-based recovery positioning method for an underwater robot, belonging to the technical field of underwater robot recovery. Background technique [0002] An Autonomous Underwater Vehicle (AUV) works unmanned and untethered in the marine environment. The recovery and reuse of AUVs is one of the important topics in AUV research and development. In recent years, underwater optical vision has achieved rich research results. However, owing to interference factors such as dim light and abundant suspended matter in the underwater environment, the collected images suffer from serious noise and color distortion, which make underwater scenes difficult to describe. This has a great impact on target positioning and thus affects the tasks of underwater robots and their recovery. [0003] Therefore, the underwater optical vision target detection and positioning system is studied...
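The final step of the method — computing the relative position of the AUV and the docking dock from the detected light-source centers — is not detailed in this excerpt. Under the standard rectified-stereo model it reduces to triangulation from disparity; the sketch below is an illustration of that principle, with function names and parameters assumed rather than taken from the patent.

```python
import numpy as np

def relative_position(uv_left, uv_right, K, baseline):
    """Recover a dock light's 3D position in the left-camera frame from a
    matched pixel pair (u, v), assuming rectified images (aligned rows).
    K is the 3x3 intrinsic matrix; baseline is the camera separation."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    disparity = uv_left[0] - uv_right[0]
    if disparity <= 0:
        raise ValueError("non-positive disparity: bad match or point at infinity")
    Z = fx * baseline / disparity      # depth along the optical axis
    X = (uv_left[0] - cx) * Z / fx     # lateral offset
    Y = (uv_left[1] - cy) * Z / fy     # vertical offset
    return np.array([X, Y, Z])

# Illustrative numbers: f = 800 px, baseline 0.1 m, 20 px disparity
# gives a depth of 800 * 0.1 / 20 = 4 m, offset 0.5 m to the right.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
print(relative_position((420.0, 240.0), (400.0, 240.0), K, baseline=0.1))
```

Repeating this for each matched light and averaging (or solving a small pose problem over all lights) yields the AUV-to-dock relative position the abstract refers to.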

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G01C3/00, G06T7/80
CPC: G01C3/00, G06T7/80
Inventor: 朱志宇, 朱志鹏, 齐坤, 曾庆军, 戴晓强, 赵强
Owner JIANGSU UNIV OF SCI & TECH