
Underwater environment three-dimensional reconstruction method based on binocular vision

A technology for three-dimensional reconstruction of underwater environments, applied in the fields of underwater 3D reconstruction and computer vision, which addresses the problem that existing approaches do not achieve ideal results underwater.

Active Publication Date: 2020-12-25
HARBIN ENG UNIV

AI Technical Summary

Problems solved by technology

At present, 3D reconstruction of the underwater environment is mainly based on theories and algorithms developed for 3D reconstruction on land. Attempts to improve underwater image quality by compensating for the lack of underwater light, introducing high distortion coefficients, adjusting focal-length compensation, and so on have achieved good experimental results, but certain errors remain in actual operation and the desired effect has not been achieved.



Examples


Specific embodiment

[0175] Step 1: Collect underwater images and calibrate the binocular camera underwater using Zhang Zhengyou's calibration method to obtain the required parameters of the binocular camera. The main theoretical basis is formulas (3) and (4).
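As a rough illustration of Step 1, the sketch below performs planar-target stereo calibration in the spirit of Zhang Zhengyou's method using OpenCV. The checkerboard geometry, square size, and the left/*.png and right/*.png image paths are illustrative assumptions, and the patent's formulas (3) and (4) are not reproduced here.

import glob

import cv2
import numpy as np

# Checkerboard geometry: assumed values for illustration only.
PATTERN = (9, 6)        # inner corners per row and column
SQUARE = 0.025          # square edge length in metres

# 3-D coordinates of the board corners in the board's own plane (Z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, left_pts, right_pts = [], [], []
pairs = zip(sorted(glob.glob("left/*.png")), sorted(glob.glob("right/*.png")))
for lf, rf in pairs:
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, PATTERN)
    okr, cr = cv2.findChessboardCorners(gr, PATTERN)
    if okl and okr:                      # keep only frames seen by both cameras
        obj_pts.append(objp)
        left_pts.append(cl)
        right_pts.append(cr)

size = gl.shape[::-1]                    # (width, height) of the images
# Calibrate each camera separately (Zhang's planar-target method) ...
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
# ... then estimate the rotation R and translation T between the two cameras.
_, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)

The resulting intrinsics (K1, d1, K2, d2) and extrinsics (R, T) are the camera parameters needed later for rectification and for reprojecting the disparity map to 3D.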

[0176] Step 2: Preprocess the collected underwater images, including image denoising, image enhancement, image sharpening, image restoration, and underwater image defogging, based on formulas (5), (7), (8), (12), and (14).
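Because the patent's preprocessing formulas (5), (7), (8), (12) and (14) are not reproduced on this page, the sketch below only stands in for each stage with common OpenCV operations (non-local-means denoising, CLAHE enhancement, unsharp-mask sharpening, and a simple contrast stretch in place of the restoration and defogging models); it is an illustrative pipeline, not the patented one.

import cv2
import numpy as np

def preprocess_underwater(img):
    # 1. Denoising with non-local means.
    den = cv2.fastNlMeansDenoisingColored(img, None, 10, 10, 7, 21)

    # 2. Enhancement: CLAHE on the L channel of the Lab colour space.
    lab = cv2.cvtColor(den, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enh = cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)

    # 3. Sharpening by unsharp masking.
    blur = cv2.GaussianBlur(enh, (0, 0), sigmaX=3)
    sharp = cv2.addWeighted(enh, 1.5, blur, -0.5, 0)

    # 4/5. Placeholder for the restoration and defogging stages:
    #      a global min-max contrast stretch.
    return cv2.normalize(sharp, None, 0, 255, cv2.NORM_MINMAX)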

[0177] Step 3: Perform feature detection on the preprocessed binocular images from Step 2, and perform stereo matching with the improved stereo matching algorithm that fuses Census and NCC to obtain a disparity map containing depth information, based on formula (15).
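A minimal, unoptimized sketch of a fused Census/NCC matcher is given below, assuming rectified grayscale inputs. The matching cost is a weighted sum of a normalized Census (Hamming) term and a (1 − NCC) term with winner-takes-all disparity selection; the weight lam, the window sizes, and the disparity range are assumed values, and the patent's specific improvement (formula (15)) is not reproduced.

import cv2
import numpy as np

def census(img, w=2):
    # Census transform: compare each pixel with its (2w+1)x(2w+1) neighbours
    # and pack the comparisons into one integer code per pixel.
    code = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-w, w + 1):
        for dx in range(-w, w + 1):
            if dy == 0 and dx == 0:
                continue
            nb = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            code = (code << np.uint64(1)) | (nb < img).astype(np.uint64)
    return code

def hamming(a, b):
    # Bitwise Hamming distance between two census code maps.
    x, cnt = a ^ b, np.zeros(a.shape, np.float32)
    while x.any():
        cnt += (x & np.uint64(1)).astype(np.float32)
        x = x >> np.uint64(1)
    return cnt

def fused_disparity(left, right, max_disp=64, win=7, lam=0.5):
    # cost(d) = lam * census_cost + (1 - lam) * (1 - NCC), winner takes all.
    l32 = left.astype(np.float32)
    cl, cr = census(left), census(right)
    k = np.ones((win, win), np.float32) / (win * win)   # box kernel for windowed statistics
    mu_l = cv2.filter2D(l32, -1, k)
    sig_l = np.sqrt(np.maximum(cv2.filter2D(l32 * l32, -1, k) - mu_l ** 2, 1e-6))
    best = np.full(left.shape, np.inf, np.float32)
    disp = np.zeros(left.shape, np.float32)
    for d in range(max_disp):
        r32 = np.roll(right, d, axis=1).astype(np.float32)   # align right image for disparity d
        crd = np.roll(cr, d, axis=1)
        mu_r = cv2.filter2D(r32, -1, k)
        sig_r = np.sqrt(np.maximum(cv2.filter2D(r32 * r32, -1, k) - mu_r ** 2, 1e-6))
        ncc = (cv2.filter2D(l32 * r32, -1, k) - mu_l * mu_r) / (sig_l * sig_r)
        # 24 comparison bits in the default 5x5 census window, so normalize by 24.
        cost = lam * hamming(cl, crd) / 24.0 + (1.0 - lam) * (1.0 - ncc)
        better = cost < best
        best[better], disp[better] = cost[better], d
    return disp

In practice a cost-aggregation step, left-right consistency checking and sub-pixel refinement would normally follow; they are omitted here for brevity.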

[0178] Step 4: Perform 3D reconstruction on the disparity map from Step 3 using the PCL 3D reconstruction method that introduces the moving least squares method, and restore the underwater 3D environment shown in the image.
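The patent performs this step with PCL; as a stand-in, the sketch below reprojects the disparity map to a 3D point cloud with OpenCV's Q matrix (obtained from stereo rectification during calibration) and applies a simplified first-order moving least squares projection in NumPy/SciPy. The neighbourhood size k and Gaussian width sigma are assumed values, and the per-point loop is a slow approximation of pcl::MovingLeastSquares rather than the patented implementation.

import cv2
import numpy as np
from scipy.spatial import cKDTree

def disparity_to_points(disp, Q, left_bgr):
    # Reproject the disparity map to 3-D with the Q matrix produced by
    # cv2.stereoRectify, keeping the colour of each valid pixel.
    pts = cv2.reprojectImageTo3D(disp.astype(np.float32), Q)
    mask = disp > disp.min()
    xyz = pts[mask]
    rgb = left_bgr[mask][:, ::-1].astype(np.float32) / 255.0   # BGR -> RGB in [0, 1]
    return xyz, rgb

def mls_smooth(xyz, k=16, sigma=0.02):
    # First-order moving least squares: project every point onto the plane
    # fitted (with Gaussian weights) to its k nearest neighbours.
    tree = cKDTree(xyz)
    dist, idx = tree.query(xyz, k=k)
    out = np.empty_like(xyz)
    for i, (d, nb) in enumerate(zip(dist, idx)):
        p = xyz[nb]                                   # neighbourhood points
        w = np.exp(-(d ** 2) / (2.0 * sigma ** 2))    # Gaussian weights by distance
        c = (w[:, None] * p).sum(axis=0) / w.sum()    # weighted centroid
        cov = ((p - c) * w[:, None]).T @ (p - c)      # weighted covariance
        n = np.linalg.eigh(cov)[1][:, 0]              # normal = smallest-eigenvalue eigenvector
        out[i] = xyz[i] - np.dot(xyz[i] - c, n) * n   # project onto the local plane
    return out

Surface reconstruction and rendering of the smoothed cloud (for example triangulation in PCL) would follow this step to display the recovered underwater scene.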



Abstract

The invention provides an underwater environment three-dimensional reconstruction method based on binocular vision, comprising the following steps: 1, collecting underwater images, carrying out underwater calibration of a binocular camera, and obtaining the required parameters of the binocular camera; 2, preprocessing the collected underwater images, including image denoising, image enhancement, image sharpening, image restoration and underwater image defogging; 3, performing feature detection on the preprocessed binocular images from step 2, and performing stereo matching with an improved stereo matching algorithm that fuses Census and NCC to obtain a disparity map containing depth information; and 4, performing three-dimensional reconstruction on the disparity map from step 3 with a PCL three-dimensional reconstruction method that introduces the moving least squares method, and restoring the underwater three-dimensional environment in the image. The invention introduces the moving least squares method to solve the problems of point cloud discretization and point cloud fragility, so that the processing result visually reflects the three-dimensional effect from multiple angles and restores the underwater three-dimensional environment.

Description

Technical field
[0001] The invention relates to an image three-dimensional reconstruction method, in particular to a binocular vision-based underwater environment three-dimensional reconstruction method, and belongs to the fields of underwater environment three-dimensional reconstruction and computer vision.
Background technology
[0002] After years of development, computer vision has matured and is now applied in underwater environments. Underwater robots equipped with computer vision can play an important role in various underwater tasks such as the detection and maintenance of submarine pipelines, the execution of marine search and rescue missions, and the inspection of the seabed environment. In some submarine topography surveys and seabed resource development work, it is necessary to obtain more intuitive 3D information. Therefore, it is particularly important to reconstruct the underwater environment using 3D reconstruction technology based on binocular vi...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/00, G06T7/80, G06T5/00
CPC: G06T17/00, G06T7/80, G06T5/73, G06T5/70
Inventors: 赵新华, 景力涛, 王雪, 王越
Owner: HARBIN ENG UNIV